
Artificial Intelligence Storms the Back Office


By Richard Chapman, Head of Strategy, Reconciliation, FIS  

 

 

 

Financial services firms face a range of headwinds. The last thing they need is a regulatory tornado blowing the house down because of data integrity and reporting errors. The risk is real: failures to provide consistency and accuracy in financial and trade reporting are attracting increasingly substantial fines (the UK's FCA, for example, has levied £33 million in penalties for such failures since 2009).

In addition, reporting is often highly dispersed – third parties process some disclosures, whilst others are filed by various in-house teams at different intervals – meaning that data integrity checks at every point where data changes hands can be operationally fragile and financially intensive.

Regulations such as MiFID II, BCBS 239 and EMIR each impose different data-capture and reporting frequencies, as well as different audit and control stipulations. These challenges are amplified by pressure to reduce costs and simplify procedures.

Perhaps, however, there is a ray of sunshine breaking through the dark clouds.

Data Integrity as a Service

There is significant potential for automated Data Integrity Services to provide the required regulatory resilience and insulation. A foundation of data accuracy and confidence will support regulatory reporting requirements and deliver stronger audit and control, while adding significant operational efficiency through the elimination of manual processes.

There are, however, four obstacles that cloud the widescale adoption of data integrity services:

  • The time and cost of creating new integrity checks can be prohibitive, or can create a backlog of time-consuming manual processes

  • Monitoring the delivery of large numbers of data integrity processes to a broad set of stakeholders is complex and expensive

  • Natural degradation of automated processes occurs over time due to environmental changes that are not catastrophic (so processing still occurs) but increase manual intervention

  • Exceptions raised by the integrity checks are monitored, managed and resolved manually

It’s the removal of these challenges that represents the pot of gold at the end of the operational efficiency rainbow.

Eliminating the backlog

In the world of data integrity, applying a combination of AI heuristic techniques and statistical algorithms to data sets could enable data mappings, relationships, matching-rule logic and documentation to be auto-generated. This would not only reduce the time taken to create automated integrity checks (thereby reducing the backlog) but would also enable business users, rather than technical experts, to manage the process. It is exactly this combination of capabilities that FIS used to create the IntelliMatch Accelerator solution. The AI engine builds, tests, documents and deploys new reconciliation and data integrity automation at the touch of a few buttons – using only the data sources as a guide.
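To make the idea concrete, here is a minimal Python sketch of the general approach – not FIS's actual IntelliMatch engine. Two feeds are profiled column by column, and column pairs whose values overlap heavily are proposed as candidate matching keys; the feed names, fields and threshold are all hypothetical.

```python
def infer_matching_rule(feed_a, feed_b, overlap_threshold=0.9):
    """Profile two feeds (lists of dicts) and propose candidate match keys.

    A column pair is proposed as a match key when a high share of values in
    feed_a also appears in feed_b -- a crude stand-in for the statistical
    profiling a rule auto-generation engine would perform.
    """
    proposed_keys = []
    for col_a in feed_a[0].keys():
        values_a = {row[col_a] for row in feed_a}
        for col_b in feed_b[0].keys():
            values_b = {row[col_b] for row in feed_b}
            overlap = len(values_a & values_b) / max(len(values_a), 1)
            if overlap >= overlap_threshold:
                proposed_keys.append((col_a, col_b))
    return proposed_keys

# Hypothetical usage: internal ledger vs. custodian feed
ledger = [{"trade_id": "T1", "amount": 100.0}, {"trade_id": "T2", "amount": 250.5}]
custodian = [{"ref": "T1", "value": 100.0}, {"ref": "T2", "value": 250.5}]
print(infer_matching_rule(ledger, custodian))
# -> [('trade_id', 'ref'), ('amount', 'value')]
```

A production engine would, of course, profile far richer statistics (data types, formats, value distributions, fuzzy overlaps) and generate documentation alongside the rules, but the principle of letting the data sources drive the rule definition is the same.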

Monitoring the data integrity landscape

It is tough to manage the delivery of data integrity checks across hundreds of systems and thousands of data feeds spanning numerous external parties and service partners, particularly while ensuring adherence to service-level agreements and meeting stakeholder demands. Many organizations resort to custom tooling and a series of fragile bespoke scripts to track and monitor delivery. By adopting a service-based consumption model, however, this can be avoided. FIS, for example, provides end-to-end service delivery that leverages proprietary monitoring and SLA-management capabilities to ensure that the requirements of every process and every recipient are met. Combined with broader economies of scale that are not possible within a bank's four walls, this service-based consumption model is, understandably, on the rise.
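As a simple illustration of the kind of monitoring such a service performs, the sketch below checks feed arrival times against per-feed SLA cut-offs and flags late or missing deliveries. The feed names, deadlines and timestamps are invented for the example and do not reflect any particular FIS tooling.

```python
from datetime import datetime, time

# Hypothetical SLA table: feed name -> latest acceptable arrival time (UTC)
FEED_SLAS = {
    "custodian_positions": time(6, 30),
    "swift_confirmations": time(7, 0),
    "gl_balances": time(8, 0),
}

def check_sla_breaches(arrivals, slas=FEED_SLAS):
    """Compare actual feed arrivals with their SLA cut-offs.

    `arrivals` maps feed name -> datetime of arrival.
    Returns (feed, status) tuples suitable for a monitoring dashboard.
    """
    report = []
    for feed, deadline in slas.items():
        arrived = arrivals.get(feed)
        if arrived is None:
            report.append((feed, "MISSING"))
        elif arrived.time() > deadline:
            report.append((feed, f"LATE ({arrived:%H:%M} vs {deadline:%H:%M} SLA)"))
        else:
            report.append((feed, "OK"))
    return report

# Example run with hypothetical arrival times
today = datetime(2024, 1, 15)
arrivals = {
    "custodian_positions": today.replace(hour=6, minute=10),
    "swift_confirmations": today.replace(hour=7, minute=25),
}
for feed, status in check_sla_breaches(arrivals):
    print(feed, status)
```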

Reversing automation degradation

As minor environmental changes to source systems, counterparties and reference data occur, tiny adjustments to downstream systems must be made. Without such refinement, automation rates drop. Data integrity checks are no exception to this counter-intuitive observation (one would expect systems to become more efficient over time). However, machine-learning techniques provide the key to reversing the trend: by capturing and analysing manual activity within systems, they can be used to repair existing rules and build new ones. At FIS, our AI engine analyses manual matching patterns to determine potential candidates for matching based on past user-driven activity. Using this machine-learning technique, FIS has found increases in automated match rates of up to 20%.
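The sketch below illustrates the principle in a deliberately simplified form – it is not the FIS engine. A tolerance is learned from pairs that users matched by hand, then applied to propose new candidate matches for review; the field names and figures are hypothetical.

```python
def learn_amount_tolerance(manual_matches):
    """Derive an amount tolerance from past manual matches.

    `manual_matches` is a list of (ledger_row, statement_row) pairs that
    users matched by hand; each row is a dict with 'ref' and 'amount'.
    """
    tolerance = 0.0
    for ledger_row, stmt_row in manual_matches:
        tolerance = max(tolerance, abs(ledger_row["amount"] - stmt_row["amount"]))
    return tolerance

def suggest_candidates(unmatched_ledger, unmatched_stmt, amount_tolerance):
    """Propose new match candidates using the learned tolerance."""
    suggestions = []
    for l in unmatched_ledger:
        for s in unmatched_stmt:
            if l["ref"] == s["ref"] and abs(l["amount"] - s["amount"]) <= amount_tolerance:
                suggestions.append((l, s))
    return suggestions

# Hypothetical history: users tolerated small fee differences when matching
history = [({"ref": "T9", "amount": 100.00}, {"ref": "T9", "amount": 99.85})]
tolerance = learn_amount_tolerance(history)            # ~0.15
ledger = [{"ref": "T7", "amount": 500.00}]
statement = [{"ref": "T7", "amount": 499.90}]
print(suggest_candidates(ledger, statement, tolerance))  # proposes the T7 pair for review
```

A real engine would learn from many more signals (date offsets, reference-string variations, user corrections over time), but the pattern is the same: manual behaviour becomes training data for new automated rules.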

Managing the exception process

Our objective in the data integrity treasure hunt is to dig out inconsistencies or errors between data sources and repair them to prevent operational failures or reporting inaccuracies. Achieving this, however, often involves access to, and interaction with, multiple systems and parties that until recently have been cost-prohibitive to automate. Now, with repetitive yet complicated system interaction at the heart of exception resolution, Robotic Process Automation (RPA) could hold the key to taking straight-through processing to the next level.
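By way of illustration, the following sketch scripts one typical resolution path: look up the correct value in a reference system, post a correction if one is found, otherwise escalate to a human. Every function here is a stand-in stub for a system interaction an RPA robot would perform; none of it reflects a specific product or API.

```python
def lookup_reference(trade_id):
    """Stub for querying a reference-data system for the correct values."""
    return {"T42": {"settlement_date": "2024-01-17"}}.get(trade_id)

def post_correction(trade_id, field, value):
    """Stub for writing the fix back to the downstream system."""
    print(f"Corrected {field} on {trade_id} to {value}")

def raise_ticket(exception):
    """Stub for escalating to a human via a workflow tool."""
    print(f"Escalated exception {exception['id']} for manual review")

def resolve_exception(exception):
    """Repetitive resolution steps a robot could perform end to end."""
    reference = lookup_reference(exception["trade_id"])
    if reference and exception["field"] in reference:
        post_correction(exception["trade_id"], exception["field"],
                        reference[exception["field"]])
    else:
        raise_ticket(exception)

resolve_exception({"id": "EX1", "trade_id": "T42", "field": "settlement_date"})
resolve_exception({"id": "EX2", "trade_id": "T99", "field": "settlement_date"})
```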

Could it be that machine learning and RPA will guide banks through the storm of regulation, control and cost reduction that surrounds them today? Perhaps they will also provide the resilience to weather whatever the future brings. We certainly think so.

 
