
How ANZ Bank uses risk reporting as a change agent

May 8, 2023
Artur Kaluza

Head of Reporting and Modelling, Risk Strategy and Transformation, ANZ Bank

Matt Tait

Financial Services Solutions Architect, Google Cloud

Last year, ANZ Bank - Australia's second-largest bank by assets under management - set out to accelerate its digital transformation and provide a single, consistent, high-quality set of corporate data across the organization. This dataset will enable personnel to glean insights quickly, improving credit decisioning, deepening the bank's understanding of credit risk, and strengthening its ability to reduce financial crime.

Building this dataset requires a considerable migration - ANZ's risk department plans to migrate and consolidate 100 distinct on-premises systems into 55 cloud-based systems, with a single Google Cloud-deployed risk data hub at the center.

At the program's outset, ANZ identified a use case whose reimagined architecture would solve challenges not only specific to that use case, but endemic to the bank's overall data discipline. ANZ chose information delivery and reporting - to internal business stakeholders and internal risk analysts - because a design that serves this use case well would also enhance the bank's broader data capability.

Put another way, a platform capable of better supporting ANZ's internal personnel for the reporting use case should also be capable of disseminating higher-quality, better-understood and better-controlled data throughout the bank - and that dissemination can serve any number of other data use cases.

Today’s reporting challenges

In recent years, rapid digitization and growing regulatory obligations have strained legacy reporting systems across the banking sector. The shift toward a "digital first" model requires banks to disseminate trusted and widely understood data across the organization, to support the insight and analytics foundational to new product development. Regulators around the world have also intensified their efforts to avoid a repeat of the 2008 financial crisis, generating a dizzying set of regulatory reporting obligations that continues to grow in volume, variety and velocity.

To meet these requirements, banks have undertaken substantial technology work to gather the necessary data, implement the required systems and processes, and stand up the teams that operate the reporting processes. The complexity of the requirements and the stringent timelines have produced fragmented and complex systems, band-aid solutions, and manual stitching of data across systems. This in turn leads to intense manual effort and the potential for staff burnout.

Complicating matters, aging data platforms discourage granular data processing because large denormalized tables are too expensive and slow to query. The industry has therefore processed risk and regulatory data in aggregated form, creating a number of downstream inefficiencies: manual disaggregation to the end user's desired level, manual adjustments to resolve quality or reconciliation issues, and reports that can't be drilled into intuitively.

This intense manual effort leaves reporting personnel overworked, particularly at the peak of the reporting cycle. The high-granularity problem leaves developers struggling to reuse existing reports or build new ones, and leaves banks fighting to keep pace with management and regulatory demands.

Leveraging cloud to improve and streamline the process

Although multiple report types and regulatory frameworks exist across the finance, risk and treasury domains, the underlying data required for each report overlaps to a large degree.

Illustration 1. Overlapping data requirements among different functions at ANZ

In addition, each report is built through a similar process: data sourcing, calculation, and report generation steps, followed by a review. The process requires a control framework that helps ensure report quality.

Illustration 2. Both the regulator and the institutional business users consume the regulatory reporting process output.

Using Google Cloud, ANZ has created a single, unified data platform and architecture that addresses the key needs of risk and regulatory reporting. Rather than moving data to a new platform at each processing step, the architecture uses one central data platform across all steps. Data is loaded into a single, high-capacity storage layer and processed by large-scale ephemeral compute resources. This cloud-native approach contrasts with traditional reporting architectures, whose extended transformation sequences obscure data lineage.
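As a simple illustration of this pattern - a hypothetical sketch rather than ANZ's production code - the snippet below lands a source extract into a single BigQuery storage layer using the BigQuery Python client. The project, bucket, dataset, and table names are placeholders.

```python
# Hypothetical sketch: land a daily source extract into a single BigQuery
# storage layer. Project, bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-risk-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # a governed, explicit schema would be used in practice
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# The load runs as an ephemeral BigQuery job; no standing ETL servers are needed.
load_job = client.load_table_from_uri(
    "gs://example-bucket/risk/exposures_20230331.csv",
    "example-risk-project.raw_landing.exposures",
    job_config=job_config,
)
load_job.result()  # block until the load job completes

table = client.get_table("example-risk-project.raw_landing.exposures")
print(f"Loaded {table.num_rows} rows into {table.full_table_id}")
```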

Illustration 3. Google Cloud has worked with its customers to help pioneer a new Regulatory Reporting Platform to address the challenges.

BigQuery is a critical solution component because it's cost-efficient, carries minimal operational overhead, and helps solve the high-granularity problem. By storing and rapidly querying massive, granular datasets, BigQuery gives ANZ a consistent source of high-quality data that can serve multiple report types.
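To show how one granular source can serve multiple report types, the hypothetical sketch below aggregates the same exposure-level table two ways at query time - a regulatory view and a management view. The table and column names are illustrative only.

```python
# Hypothetical sketch: serve two report views from one granular BigQuery table,
# aggregating on read instead of maintaining separate pre-aggregated copies.
from google.cloud import bigquery

client = bigquery.Client()

GRANULAR_TABLE = "example-risk-project.risk_hub.credit_exposures"  # placeholder

# Regulatory view: exposures rolled up by regulatory asset class.
regulatory_sql = f"""
    SELECT regulatory_asset_class, SUM(exposure_at_default) AS total_ead
    FROM `{GRANULAR_TABLE}`
    WHERE reporting_date = @reporting_date
    GROUP BY regulatory_asset_class
"""

# Management view: the same rows, sliced by business unit and product instead.
management_sql = f"""
    SELECT business_unit, product, SUM(exposure_at_default) AS total_ead
    FROM `{GRANULAR_TABLE}`
    WHERE reporting_date = @reporting_date
    GROUP BY business_unit, product
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("reporting_date", "DATE", "2023-03-31"),
    ]
)

for view_name, sql in [("regulatory", regulatory_sql), ("management", management_sql)]:
    for row in client.query(sql, job_config=job_config).result():
        print(view_name, dict(row))
```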

In concert with BigQuery, open-source tools catalog the source data and provide a clear view of each report's lineage. They also help ensure key system components can migrate to other cloud environments (or even on-premises), satisfying workload survivability requirements.

The basic ingredients of our approach

ANZ recognized that any technical architecture must be in service of a data production process, a development process and a control process. In other words, any successful reporting system must cater to all the relevant personas - engineers, auditors, regulatory users, business users and business analysts.

Illustration 4. A representation of the risk and regulatory reporting process workflow.

From a data production process perspective, business users require timely access to management reports, regulatory reports and underlying data. They can self-serve this data via business intelligence tools, and then rely on the reporting system’s data quality and adjustment features during the review process.

From a development process perspective, data engineers can rapidly develop and ship functionality by applying modern technology stacks to the central BigQuery data source. For risk and regulatory reporting, ANZ adopted the open source dbt framework to enable the "extract-load-transform" paradigm, embedding transformation, quality and lineage functionality in a dedicated component integrated with BigQuery. dbt integrates well with Google Cloud, and its transformations run within a Cloud Composer-managed processing pipeline, alongside containerized data load and publication steps.
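To make that pipeline shape concrete, here is a minimal, hypothetical Airflow DAG of the kind Cloud Composer could orchestrate: a containerized load step, the dbt transformations, and a containerized publication step. The container images, dbt project path, and commands are placeholders, not ANZ's actual configuration.

```python
# Hypothetical Composer (Airflow 2) pipeline: containerized load -> dbt
# transformations -> containerized report publication. All names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
# Import path can vary by cncf-kubernetes provider version.
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,
)

with DAG(
    dag_id="risk_reporting_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Containerized step that lands source extracts into the BigQuery raw layer.
    load_source_data = KubernetesPodOperator(
        task_id="load_source_data",
        name="load-source-data",
        image="gcr.io/example-project/risk-data-loader:latest",  # placeholder image
    )

    # Run the dbt project that holds the transformation, test and lineage logic.
    dbt_transform = BashOperator(
        task_id="dbt_transform",
        bash_command="cd /home/airflow/gcs/dags/dbt_risk && dbt run && dbt test",
    )

    # Containerized step that generates and publishes the report outputs.
    publish_reports = KubernetesPodOperator(
        task_id="publish_reports",
        name="publish-reports",
        image="gcr.io/example-project/report-publisher:latest",  # placeholder image
    )

    load_source_data >> dbt_transform >> publish_reports
```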

Illustration 5. System architecture of a Google Cloud-based risk and regulatory solution.

Finally, from a control process perspective, auditors, business users and operational staff can easily inspect the reporting system's process and output. dbt's auto-generated documentation demystifies a report's lineage, and the rich event detail published to Cloud Logging lets auditors examine the solution with far less effort.
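For instance, each pipeline step could emit a structured, auditor-friendly event to Cloud Logging, along the lines of the hypothetical sketch below (the log name and fields are illustrative).

```python
# Hypothetical sketch: publish a structured, filterable audit event for one
# pipeline step to Cloud Logging. The log name and fields are illustrative.
from google.cloud import logging as cloud_logging

client = cloud_logging.Client()
logger = client.logger("risk_reporting_events")  # placeholder log name

logger.log_struct(
    {
        "pipeline": "risk_reporting_pipeline",
        "step": "dbt_transform",
        "reporting_date": "2023-03-31",
        "status": "SUCCEEDED",
        "rows_processed": 1284503,
        "dbt_tests_failed": 0,
    },
    severity="INFO",
)
```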

"By using Google Cloud, ANZ built a reporting system that demonstrated substantial benefits, including improved performance, greater operational efficiency, and cost reduction."

Artur Kaluza, ANZ Bank

Empowered users and a faster, more affordable solution

By using Google Cloud's technology stack and architecture pattern, ANZ built a reporting system that has demonstrated substantial benefits, including improved performance, greater operational efficiency, and reduced cost. The solution helps enable granular data processing, reducing downstream manual operations and adjustment processes. It not only saves time but also helps deliver the accuracy that the system's users - engineers, auditors, regulatory users, business users and business analysts - need. Ultimately, this has enabled a leaner technology and operating model, with employees shifting their focus to higher-value activities.
