
Supercharging Anti-Money Laundering (AML) with Generative AI at Strise

June 14, 2024
Adrian Trzeciak

Engineering Manager, Strise


For financial institutions, maintaining compliance with national and international laws is a costly burden, with the banking industry spending over $200 billion to meet the strict scrutiny of regulators. To lighten that burden, Strise has built an Anti-Money Laundering (AML) Intelligence System trusted by some of the largest financial institutions in the Nordics as well as fast-growing fintechs. Our AI-driven platform transforms AML from a resource drain into a winning strategy, equipping compliance teams with the tools they need to combat financial crime.

How do we do that? Well, out of the millions of events happening every day, only a handful are relevant to a company or person. At Strise, we combine the current state-of-the-art in Natural Language Processing (NLP) research with our AI-driven platform to filter out the noise, and identify the most important events for our users.

Opportunities with Gen AI in KYC and AML

Gen AI offers significant opportunities to enhance Know Your Customer (KYC) and Anti-Money Laundering (AML) procedures. Financial institutions battling time-consuming, error-prone, manual, and costly KYC and AML checks will benefit from the speed, efficiency, and precision AI provides, while still ensuring compliance.

Combining the power of gen AI with multi-billion-parameter large language models (LLMs) optimizes the automation of KYC and AML processes. Thanks to the additional power of these models, data collection, validation, and risk assessment become more efficient. This accelerates the onboarding of new customers, improves the customer experience, reduces error rates, and could save billions of dollars currently spent on manual checks. LLMs can also significantly improve the accuracy and reliability of data processing tasks, which is crucial for AML applications. These technologies enhance the ability to conduct in-depth sentiment analysis on textual data, enabling the processing of vast quantities of information with heightened precision.

All of this opens up new possibilities for processing the data points connected to a company or person that could signal money-laundering activity. By helping businesses connect the dots between seemingly disparate pieces of data, these capabilities not only protect them from financial crime but also ensure regulatory compliance, shielding institutions from heavy fines and reputational damage.

Why we chose Vertex AI

We've been all-in on Google Cloud since day one, relying on a wide variety of Google services within our stack. As part of a highly regulated industry, each market we operate in has unique requirements concerning data localisation, encryption, and security, on top of EU regulations. Thanks to the tight integration of Vertex AI and its LLMs with our Google Cloud-based services, together with its IAM and governance capabilities, we're able to significantly shorten the process of bringing these services into production.

Google Cloud has been immensely helpful in facilitating our journey with LLMs, providing us with educational materials for our teams and industry-specific advice so we can continue tailoring our products and services to our customers at scale.

How do we use Gen AI?

Driving frictionless user experiences
At Strise, we are committed to creating a product that is both simple and elegant. Human nature draws people towards software that is both visually appealing and easy to use. In our experience, we've yet to meet anyone who prefers a complex user experience over a simple one.

KYC can be complicated, with complex processes spread across multiple systems that make building a simple interface challenging. When we come up against these barriers, our commitment to maintaining the highest regulatory standards comes first, even if it temporarily sacrifices our devotion to user-friendliness.

Imagine a world where you could simply express what you wanted, without navigating endless menus and options. That would radically transform how we interact with technology, saving untold seconds and minutes and replacing them with a seamless user experience.

I believe we are on the cusp of this new reality. That's why Strise is developing an LLM-based AI co-pilot that helps you achieve your goals within our app’s current capabilities. In some areas of the app, the co-pilot provides an alternative to the standard interface.

For example, we are developing a feature for continuous monitoring of a bank's customer portfolio. Whenever a person or a company receives new data, such as new financial information, sanction updates, or a change in politically exposed person (PEP) information, a review process is automatically triggered for the bank’s investigator. Although it sounds simple enough, the solution can combine multiple triggers into one, so that you can essentially say, "I want to set up a trigger for all high-risk companies that have experienced a change in sanction information and a new EBITDA margin".

To set this up manually, you would have to combine information from multiple dropdowns and scroll through alternatives to get to the correct one. Instead, you can simply type "all high-risk companies that have received a change in sanction information and a new EBITDA margin", and the LLM processes the request and converts it into a set of supported triggers.
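As a rough sketch of how such a translation could work, the snippet below asks Gemini on Vertex AI to turn a free-text monitoring request into structured trigger definitions. The trigger schema, field names, model choice, and prompt wording are illustrative assumptions, not Strise's actual implementation.

```python
# Sketch: translate a natural-language monitoring request into structured
# trigger definitions with the Vertex AI SDK. Schema and examples are hypothetical.
import json

import vertexai
from vertexai.generative_models import GenerationConfig, GenerativeModel

vertexai.init(project="your-gcp-project", location="europe-west1")

SYSTEM = """You convert monitoring requests into JSON trigger definitions.
Supported fields: risk_level, event_type, metric.
Respond with a JSON array only."""

# One few-shot example to anchor the output format.
FEW_SHOT = """Request: all high-risk companies with new beneficial owners
Triggers: [{"risk_level": "high", "event_type": "ownership_change"}]"""


def to_triggers(request: str) -> list[dict]:
    model = GenerativeModel("gemini-1.5-flash", system_instruction=SYSTEM)
    prompt = f"{FEW_SHOT}\n\nRequest: {request}\nTriggers:"
    response = model.generate_content(
        prompt,
        generation_config=GenerationConfig(
            temperature=0.0,
            response_mime_type="application/json",  # ask for parseable output
        ),
    )
    return json.loads(response.text)


print(to_triggers(
    "all high-risk companies that have experienced a change in sanction "
    "information and a new EBITDA margin"
))
```

The idea is that the model only ever emits trigger types the app already supports, so the co-pilot stays within the standard interface's capabilities rather than inventing new ones.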


Why not take it even further? Could the whole application be a simple prompt? We will definitely experiment with reducing the number of clicks required and transforming click-based flows into prompt-based flows.

Generating code
Processing huge numbers of events each day demands quickly connecting various data sources. But mapping source content into a useful format is repetitive work that none of our engineers enjoy. For an engineer, the process involves:

  • Reviewing the API specification and the available endpoints

  • Generating a request payload to test and analyzing the response

  • Developing mapping logic and the integration

During one of Strise’s recent LLM hackathons, we tried generating Scala code for new integrations following our standards, without writing a line of code. The prompt accepts an example payload and response, and generates Scala code to perform the integration using existing libraries. From there, an engineer simply submits a pull request.
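A minimal sketch of this kind of prompt might look like the following. The payloads, prompt wording, and internal type name are illustrative assumptions; the generated Scala is always reviewed by an engineer before it reaches a pull request.

```python
# Sketch: ask an LLM on Vertex AI to draft Scala mapping code from an
# example API request/response pair. Payloads and type names are hypothetical.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-gcp-project", location="europe-west1")

EXAMPLE_REQUEST = '{"orgNumber": "123456789"}'
EXAMPLE_RESPONSE = '{"name": "Acme AS", "ebitdaMargin": 0.18, "sanctioned": false}'

prompt = f"""You write Scala for data-source integrations.
Given this API request payload:
{EXAMPLE_REQUEST}

and this API response:
{EXAMPLE_RESPONSE}

write a case class for the response, a JSON decoder for it, and a function
mapping the response to our internal CompanyRecord type.
Return only Scala code."""

model = GenerativeModel("gemini-1.5-flash")
print(model.generate_content(prompt).text)  # draft Scala for engineer review
```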

Reducing false positives 
The KYC process relies heavily on the availability of information for a given company or person. Incorrectly showing information may result in a customer being unable to do business with you. On the other hand, missing information could lead to doing business with customers you should be excluding.

Determining whether a company or person is sanctioned is a critical component of a compliance solution. Screening against sanction lists typically relies on matching names and identifiers, an approach that can produce a high number of false positives for compliance personnel to interpret. With LLMs, we can submit entity information and identified sanction records to Vertex AI. By providing a small dataset of example inputs and outputs in addition to the prompt itself, we receive PaLM's interpretation of whether a hit is a true or false positive, along with an explanation.
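One way to realize that few-shot pattern is sketched below, using PaLM (text-bison) on Vertex AI. The entity fields, example records, and labels are hypothetical; real prompts carry more context per record.

```python
# Sketch: few-shot prompt asking PaLM on Vertex AI whether a sanction hit
# is a true or false positive, with a short explanation. Examples are invented.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-gcp-project", location="europe-west1")

# Example inputs and outputs provided alongside the prompt itself.
FEW_SHOT = """Entity: Ivan Petrov, born 1971, director, Oslo, Norway
Sanction record: Ivan PETROV, born 1953, politician, Russia
Answer: FALSE POSITIVE - name matches but birth year, role and country differ.

Entity: Global Trade Ltd, reg. no. 998877, shipping, Cyprus
Sanction record: GLOBAL TRADE LTD, reg. no. 998877, Cyprus
Answer: TRUE POSITIVE - registration number and jurisdiction match exactly."""


def assess(entity: str, record: str) -> str:
    model = TextGenerationModel.from_pretrained("text-bison@002")
    prompt = f"{FEW_SHOT}\n\nEntity: {entity}\nSanction record: {record}\nAnswer:"
    return model.predict(prompt, temperature=0.0, max_output_tokens=128).text


print(assess(
    "Nordic Fish AS, reg. no. 112233, aquaculture, Bergen, Norway",
    "NORDIC FISH LLC, reg. no. 554411, United Arab Emirates",
))
```

The explanation that comes back with each verdict is what makes the output useful to an investigator, who still makes the final call.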


Every day, our non-technical teams are exploring new ideas and use cases with generative AI, which inspires them to confront challenges and reimagine workflows. Google Cloud’s ease of use, which makes highly complex large language models accessible to the average user, continues to transform our productivity in ways we still can’t fully envision. Whether our employees use generative AI to build chatbots for querying databases or to summarize data logs into actionable intelligence, our Google Cloud partners are there to support us.
