Monday, 23 December 2024

Next generation artificial intelligence disrupts data quality processes

5 min read

Victoria Harverson, Global Head of Business Development at SmartStream Air, a unit of SmartStream Technologies, discusses the challenges in traditional operational data management and how the latest technologies can address them

Data sits at the core of financial markets and transactions today. Financial products no longer take physical form but are digital entries on electronic ledgers. Transactions are represented by the movement of data from one ledger to another, transferring ownership electronically between individuals, companies and locations.

With the massive volume of transactions and high level of automation, it is critical for institutions to maintain a complete, accurate and timely set of operational data to underpin these activities.

To this end, huge investments are poured into enhancing data quality and operational excellence: the process of aggregating, reconciling, validating and correcting transactional data. This, however, remains a largely manual and ineffective process.

The impact of poor data quality and weak operational controls can be severe, causing process disruptions, exceptions, and clearing and settlement issues that require costly investigations and manual reprocessing. Worse, it may result in significant financial loss, reputational damage and regulatory penalties.

According to Victoria Harverson, Global Head of Business Development at SmartStream Air, a unit of financial transaction lifecycle management solutions provider SmartStream Technologies, these challenges in traditional operational data management can be addressed by the latest technologies in artificial intelligence (AI) and machine learning (ML).

 

The following is an edited transcript of the interview:

Foo Boon Ping (BP): What challenges do financial institutions face when it comes to operational data management today, and why is it so important to implement controls?

Victoria Harverson (VH): Well, whether you are a large multinational financial institution or a smaller SME organisation, you're required to constantly validate and check a huge number of data sets for accuracy and completeness as part of your day-to-day operations. The problem is often very high volumes of data coming into the organisation from a large variety of external counterparties, or data moving between disparate systems internally. The formats of that data are often non-standard, complex, messy or unstructured, and this is very challenging to manage.

And then there's a lot of underlying complexity, which is always changing; you may need to manage a certain attribute or look at a certain characteristic of the data differently from one year to the next, so you need to be able to manage that. Obviously, some solutions do exist that tackle some of these problems very well, but not all of them. And even the best tools that are preconfigured and designed to solve some of these problems in a predefined way can lack the flexibility and agility needed to really get results quickly.

What you're left with in that type of operating model is a huge amount of manual spreadsheets and user-developed applications: "Excel" work, basically. This represents an unsustainable risk for financial services organisations. Human error is one of the big risks with manual work. Key-man dependencies are another: if you have individuals working on various spreadsheets and putting these models together, and they leave the organisation, you're stuck.

And there's no audit trail with an "Excel" operating model. You're also left with processes which aren't being done at all. Often, you're just missing opportunities to improve productivity and profitability.

Financial institutions cannot afford to have any number of reconciliation or data quality processes remain on spreadsheets. They can't wait weeks or months to find errors or to spot suspicious activity, and they certainly can't submit incorrect reports to regulators.

For all of these reasons, you have to wrap a control framework around all of your data and all of your reconciliations, so that nothing critical is missed. Adopting AI to control and future-proof your operation from every angle allows you to be ready for anything.

We find that for capital markets firms in Asia, the key focus is improving processing times in general. The evolution of the trade life cycle and its complexities has only intensified the need for quicker post-trade processing, whilst keeping errors and expenses to a minimum.

Two of the biggest challenges I think organisations in this region face when adopting AI technology across the back office are a limited understanding of how applicable the technology is and what its use cases are, and a lack of internal skills to deliver it. Banks in Asia Pacific that are only just going digital are now learning to partner with organisations like SmartStream, because that gives them the upper hand in their digital transformation journey. Financial institutions in the region really do want to ramp up their use of AI, and having had a presence here for the last 20 years, managing over 200 customers in the Asia Pacific region, leaves us very well placed to serve this market.

BP: It’s well known that poor-quality data causes trading breaks, process exceptions, and clearing and settlement issues, and that getting to the root cause of those issues can take time and several IT projects. What can be done to improve this? How have you intelligently disrupted this legacy process?

VH: We've definitely disrupted the process, firstly by way of data normalisation. In most cases, data will come into an organisation from many external counterparties; for a bank, that could sometimes be thousands of external counterparties. Each of these will have their own formats, structures and standards.

And the challenge has always been taking that non-standard data, pre-formatting it, normalising it, cleaning it up, and often shaping it to the schema of the specific tool you will be using. What we've achieved with the latest AI solution is the ability to ingest any data, in any format, straight from the source, which removes a considerable pre-processing and data analytics layer.

In SmartStream Air, we leverage pure AI to analyse data in any format. It creates digital AI fingerprints of the data and generates automatic match rules based on the structures of the data sets and how the data is likely to correlate. You can even load multiple formats and multiple files all at the same time; there are absolutely no constraints with this type of AI technology. One thing I always point out is that you don't even need to understand the data beforehand: the AI will do that analysis, recognise the data and do the mapping for you. It gives you the ability to match any data, for any reason, in an instant, without IT projects.
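To make the general idea concrete, here is a minimal sketch in Python of rule-free matching between two files: infer which columns link the data sets from value overlap, then merge and surface the unmatched records. It is purely illustrative; the file names, columns and overlap heuristic are assumptions, not SmartStream Air's actual implementation.

    # Minimal sketch of rule-free matching between two data sets.
    # Illustrative only: file names, columns and the overlap heuristic
    # are assumptions, not product details.
    import pandas as pd

    def infer_match_columns(left: pd.DataFrame, right: pd.DataFrame):
        """Guess which pair of columns links the two files, by value overlap."""
        best, best_score = None, 0.0
        for lcol in left.columns:
            lvals = set(left[lcol].astype(str))
            for rcol in right.columns:
                rvals = set(right[rcol].astype(str))
                denom = min(len(lvals), len(rvals)) or 1
                score = len(lvals & rvals) / denom
                if score > best_score:
                    best, best_score = (lcol, rcol), score
        return best, best_score

    bank = pd.read_csv("internal_ledger.csv")               # assumed sample file
    statement = pd.read_csv("counterparty_statement.csv")   # assumed sample file

    (lkey, rkey), confidence = infer_match_columns(bank, statement)
    merged = bank.merge(statement, left_on=lkey, right_on=rkey,
                        how="outer", indicator=True)
    breaks = merged[merged["_merge"] != "both"]              # unmatched records
    print(f"Matched on {lkey} <-> {rkey} (overlap {confidence:.0%}); "
          f"{len(breaks)} breaks found")

A real matching engine would weigh far more signals (data types, formats, fuzzy similarity between values), but the overlap heuristic captures the basic intuition of letting the data itself suggest the match rules.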

The delivery model would be considered disruptive in that it's a fully software-as-a-service (SaaS), cloud-native application. You can just switch it on and get to work on the same day if need be; there's no implementation or technology project needed. It's robust and secure, and it can be used anywhere in the world. And because the SaaS pricing model is subscription based, the total cost of ownership is really controllable and very transparent, even for a very large enterprise that's used to quite complex agreements with various fintech providers.

In terms of use cases, where would this product be most useful? Automating any transactional reconciliation, or comparing table-based spreadsheets, is absolutely perfect for SmartStream Air. It can also quickly validate regulatory reports, particularly if a third party is producing them for you: it's still your responsibility to report to the regulator, and Air can be used to rapidly check the quality of those reports. Another use case is system migration and IT transformation projects, particularly in large organisations. These can be the biggest barrier to change, being very costly and time-consuming.

With SmartStream Air, you're able to take a very large data set, for example the entire data set of an old system, compare it with that of the new system, and check that everything is the same and nothing is missing. You can do that type of work in minutes, which is very compelling, and it allows organisations to accelerate their IT projects, which is so important in financial services today.
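A hedged sketch of what such a migration check boils down to, again in Python: compare row fingerprints from the old and new exports so that lost or altered records stand out. The file names and the fingerprinting approach are assumptions made for illustration only.

    # Minimal sketch of a migration check: verify nothing was lost or
    # altered between an old system's export and the new system's export.
    import pandas as pd

    old = pd.read_csv("legacy_system_export.csv")   # assumed file names
    new = pd.read_csv("new_system_export.csv")

    def fingerprints(df: pd.DataFrame) -> pd.Series:
        # Order-independent fingerprint of each row: sort columns, join values.
        cols = sorted(df.columns)
        return df[cols].astype(str).agg("|".join, axis=1)

    old_fp, new_fp = set(fingerprints(old)), set(fingerprints(new))
    missing_in_new = old_fp - new_fp      # records lost in the migration
    unexpected_in_new = new_fp - old_fp   # records added or altered

    print(f"{len(missing_in_new)} records missing from the new system, "
          f"{len(unexpected_in_new)} unexpected or altered records")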

BP: How does SmartStream Air v2.0 transform traditional data verification and reconciliation processes? What gaps does it fill?

VH: Well, Air v2.0 fills the need to control all kinds of data, perform all kinds of data quality checks in any scenario and be ready for anything. At its core, the AI is very much about data, scale and agility: flexibility with different formats and getting your end result very quickly without an IT project. Typically, if a business operations team needs to put a data integrity control in place, the business team will start working with the IT team.

And there will be an iterative process where they're working on business requirements documents, putting in logic and rules, and working on that process together. They will be transforming the data using ETL (extract, transform, load) and normalisation processes. They will go through this whole cycle in UAT (user acceptance testing) a couple of times, then promote it into production. The business may then come back and say, “I want to make a change”, and that might be another IT project. So we designed Air so that any user is able to log into the tool, compare data sets very quickly and get their results instantly. The AI fully learns how data sets relate to each other and exposes any errors instantly.
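As a rough illustration of what exposing errors means once records are paired, here is a small Python sketch that flags any attribute disagreeing between the two sides of a matched record; the sample columns and values are invented for the example and are not product configuration.

    # Field-level break detection on paired records (illustrative data only).
    import pandas as pd

    ours = pd.DataFrame({"trade_id": [1, 2], "amount": [100.0, 250.0], "ccy": ["USD", "EUR"]})
    theirs = pd.DataFrame({"trade_id": [1, 2], "amount": [100.0, 255.0], "ccy": ["USD", "EUR"]})

    paired = ours.merge(theirs, on="trade_id", suffixes=("_ours", "_theirs"))
    for col in ("amount", "ccy"):
        diff = paired[paired[f"{col}_ours"] != paired[f"{col}_theirs"]]
        for _, row in diff.iterrows():
            print(f"trade {row['trade_id']}: {col} differs "
                  f"({row[f'{col}_ours']} vs {row[f'{col}_theirs']})")
    # Output for this sample: trade 2: amount differs (250.0 vs 255.0)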

Now, Air v2.0 is exciting because we've enhanced it in a couple of ways. Firstly, there's the PCI DSS certification that we've just secured, which allows us to host and process credit card and cardholder data. That enables us to get a lot deeper into the payments segment, which is quite important to us: we have a number of payments customers and we're working with digital banks, and PCI DSS really increases our ability to work in that market. The user interface (UI) has also been enhanced. We've gone with the concept that we want this to behave like a consumer application, because if you download an app on your smartphone and you can't use it within the first couple of minutes, you're probably going to delete it. So we wanted the user experience to be ultra-intuitive, and with SmartStream Air absolutely no training is needed whatsoever.

We're really excited to have recently announced the introduction of Affinity, a feature that allows us to do observational learning, an AI technique. We've always used unsupervised learning techniques, but v2.0 introduces Affinity, which gives the AI the ability to mimic your actions. This is really ground-breaking: if the end user tweaks anything, changes anything or manually matches fields, in ways that may not make sense to anybody but that user, the AI will learn from that behaviour, capture it and mimic it, and that understanding will be there the next time around. This is really compelling. It's going to open up a lot of different use cases, and it's also going to enhance the performance of the AI, which is exciting.
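A very loose sketch of that observational-learning idea, in Python: record the field pairings a user makes by hand and replay the most frequent one the next time around. The class and field names are invented for illustration and do not reflect Affinity's actual design.

    # Remember field pairings a user confirmed by hand and replay them later.
    # Illustrative sketch only; not the Affinity feature's implementation.
    from collections import defaultdict

    class ManualMatchMemory:
        def __init__(self):
            self.learned_pairs = defaultdict(int)

        def record_manual_match(self, left_field: str, right_field: str):
            # Called whenever the user manually links two fields.
            self.learned_pairs[(left_field, right_field)] += 1

        def suggest(self, left_field: str):
            # Propose the pairing seen most often for this field, if any.
            candidates = {r: n for (l, r), n in self.learned_pairs.items()
                          if l == left_field}
            return max(candidates, key=candidates.get) if candidates else None

    memory = ManualMatchMemory()
    memory.record_manual_match("TradeRef", "ExternalReference")  # a user action
    memory.record_manual_match("TradeRef", "ExternalReference")
    print(memory.suggest("TradeRef"))  # -> ExternalReference, reused next run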

Air v2.0 is actually very relevant in COVID-19 pandemic times: market volatility, a strained financial system, staff working remotely. It's been really challenging for all of us, and we now expect much more AI adoption in Asia. Digitisation is at the forefront for banks in Asia, and they are developing a greater reliance on data and analytics. They are fast becoming data-driven through the adoption of intelligent machines that generate in-depth insights and analytics to drive much better decision-making.

BP: Tell us what other developments to expect from SmartStream's Innovation Lab.

VH: Well, I'm so pleased to be working with the SmartStream Innovation Lab. They are a very talented team of mathematicians and data scientists, all based out of the office in Vienna, and essentially they work with our business and product teams at SmartStream, who have many decades of subject-matter expertise in the industry behind them.

And the product and business teams are engaging with clients all day, every day; they understand the challenges our clients face and what their needs are, and that customer feedback influences the Innovation Lab's roadmap. Allowing customers to influence that roadmap is really important, and we're investing millions of dollars in R&D (research and development) in our Innovation Lab. The focus is on the performance of the AI and making sure it keeps improving, particularly with problems it has never seen before.

And with all of our products, we are testing the AI at ultra-high volumes, looking to achieve greater rates of straight-through processing, greater efficiency and greater data quality. Whilst reconciliation is the most obvious application of AI and machine learning for SmartStream, it's certainly not the only solution where we are deploying this type of technology. This year and next, we're rolling out AI and ML components across many of the other operational control solutions that we have.


Keywords: 2020, SmartStream
Institutions: SmartStream, SmartStream Technologies