Breaking Down The Barriers

by internationalbanker

By Martijn Groot, VP Strategy, Alveo

Why Data Quality and Data Integration Matter as Much to Financial Firms as Bringing Analytics and Data Together

Financial firms collect and store enormous amounts of data. But collecting data for its own sake is of little use in finance. What matters is adding intelligence to that data and making it accessible so it can actually be used. Firms require increasing amounts of data from a broadening set of sources, which often puts pressure on existing data-management infrastructure.

Given this cost of change, the emphasis needs to be as much on the adaptability and extensibility of data models and on the onboarding of new data sources as on data-aggregation capabilities. Today, many firms are still limited in what they can do in this respect and face delays in onboarding, or in properly operationalising, the data sets they require.

New research from Alveo finds that nearly two-thirds (63%) of data scientists in financial services firms say their organisation is not currently able to combine data and analytics in a single environment. That’s a serious concern because, in an environment where analytics is increasingly pervasive in all business processes, only by ensuring that data and analytics are closely linked can financial organisations get the most out of the data they acquire.

When you factor in ever-present issues around data quality and growing data volumes that are often siloed in data stores and hard-to-access legacy systems, the scale of the problem becomes increasingly clear. Many organisations find it difficult to manage large data sets and to scale their infrastructure to the volumes they face. Financial firms have to overcome the challenges of getting data ready for use by analysts and quants so they can deliver enhanced insight and ultimately help shape the future direction of the business.

Ensuring data quality

Data can be gold, but only if it is of the highest quality and if processes and business users can access it. Poor or incorrect data can be damaging, leading to inaccurate analytics, mispriced products, wrong strategies, client loss and regulatory backlash. The ramifications of failing to meet strict regulatory demands on data quality are serious and include reputational damage, financial penalties and suspension.

Analysts and quants often struggle to conclude whether data is fit for purpose because they lack the context, such as the original source, licence permissions, timestamps, quality markers, or who has approved its use. Judging fitness for purpose without this information is difficult, and models frequently produce suboptimal results. For example, when metadata is not kept up to date, establishing lineage and understanding the relevant permissions become hard.
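The fitness-for-purpose check described above can be sketched as a simple metadata gate: a data set is only released to a model when each of the context fields mentioned in the text is present. This is a minimal illustration in Python; the field names are assumptions for the example, not any vendor's schema.

```python
# A minimal fitness-for-purpose gate: a data set passes only when every
# context field listed in the article (source, licence permissions,
# timestamp, quality marker, approver) is present. Field names are
# illustrative assumptions.
REQUIRED_METADATA = ("source", "licence", "timestamp", "quality_marker", "approved_by")

def fit_for_purpose(metadata):
    """Return (ok, missing): ok is True only when all context fields exist."""
    missing = [f for f in REQUIRED_METADATA if not metadata.get(f)]
    return (len(missing) == 0, missing)
```

In practice such a gate would sit in front of model inputs, so that incomplete context blocks use of the data rather than being discovered after the fact.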

Ensuring high data quality is not a one-time project but a continuous undertaking. It should start with understanding what data financial firms hold and setting data-validity rules covering dimensions such as accuracy, completeness, timeliness and uniqueness. Only when data is validated can a data-quality framework be developed, the purpose of which is to help identify the points at which data-quality problems can occur. Identifying and adjusting business processes on an ongoing basis will help improve overall data quality.
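The validity dimensions named above (completeness, timeliness, uniqueness) can each be expressed as a small rule over incoming records. The sketch below, with assumed field names and an illustrative one-day freshness window, shows what such rules might look like in Python:

```python
# Illustrative data-validity rules over records represented as plain dicts.
# Field names ("instrument_id", "timestamp") and the one-day freshness
# window are assumptions for the example.
from datetime import datetime, timedelta, timezone

def check_completeness(record, required_fields):
    """Return the required fields that are missing or empty."""
    return [f for f in required_fields if record.get(f) in (None, "")]

def check_timeliness(record, max_age=timedelta(days=1)):
    """True when the record's timestamp falls within the allowed window."""
    ts = record.get("timestamp")
    return ts is not None and datetime.now(timezone.utc) - ts <= max_age

def check_uniqueness(records, key="instrument_id"):
    """Return identifiers that appear more than once in a batch."""
    seen, dupes = set(), set()
    for r in records:
        k = r.get(key)
        if k in seen:
            dupes.add(k)
        seen.add(k)
    return dupes
```

A data-quality framework would then run rules like these at each point where data enters or changes, and route the flagged records into an exception workflow.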

Streamlining data sourcing

The challenges financial firms face with data extend well beyond ensuring quality, though. Analysts often need to combine different types of data, with 38% citing integrating structured and unstructured data as one of the main challenges in bringing analytics to data. The number of data sources feeding decision-making processes continues to grow, and the lack of a data catalogue, which leads to time-consuming data searches or double sourcing, is the top issue for 28% of data scientists. More than three-quarters (77%) also say their organisation requests the same data multiple times from a single data vendor, leading to unnecessary duplicate data costs.
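The double-sourcing problem a data catalogue prevents can be illustrated with a toy lookup: before a team buys a data set, the catalogue is checked for an existing sourcing of the same vendor and data set. This is a hypothetical sketch, not any product's API; the names are invented for the example.

```python
# A toy data catalogue that records which (vendor, dataset) pairs have
# already been sourced, so a second request for the same data is flagged
# rather than purchased again. All names are illustrative.
class DataCatalogue:
    def __init__(self):
        self._entries = {}  # (vendor, dataset) -> team that first sourced it

    def request(self, vendor, dataset, team):
        key = (vendor, dataset)
        if key in self._entries:
            # Duplicate request: point the caller at the existing sourcing.
            return f"already sourced by {self._entries[key]}"
        self._entries[key] = team
        return "new sourcing registered"
```

Even this trivial check captures the economics: the second team reuses the first team's feed instead of generating a duplicate vendor invoice.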

A lack of communication, and therefore of streamlined data sourcing, is also highlighted by the 82% who say their organisation's front-office teams use different vendor sources than their compliance, risk and operations teams. This often leads to costly inconsistencies that cause operational overhead and, in the worst case, errors in external (regulatory or client) reporting or suboptimal asset allocation.

The move to data-as-a-service

The blending of data management and analytics helps users access multiple data sources and data types. Data management and analytics are increasingly linked, and firms now often move analytics to where the data resides rather than moving large stores of often-siloed data over to the analytics function. As a result, users within financial services organisations increasingly want to use these integrated capabilities to drive better-informed decision-making. This move to data-as-a-service ("DaaS"), combined with the latest analytics capabilities, is making that happen for financial organisations today.

Combining data management and analytics has been of great benefit to quants and data scientists: 27% highlight "improved productivity of data scientists and quants" as one of the main benefits of more closely integrating market data and reference data into advanced data analytics. By adopting this approach, firms and their data scientists gain access to multiple data sources and multiple data types, from pricing and reference data to curves, benchmark data, ESG and alternative data. With the help of popular languages such as Python and R, firms can create a robust and scalable data meeting place, enabling users to share analytics across the data supply chain and develop a common approach to risk management, performance management and compliance.
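As one concrete illustration of blending data types in Python, the sketch below joins a pricing feed to reference data on a shared instrument identifier, producing enriched records ready for analytics. The field names and join logic are assumptions for the example, not a description of any particular platform.

```python
# Join a pricing feed with reference data on a shared instrument identifier
# (here an ISIN), producing enriched records for downstream analytics.
# Field names are illustrative assumptions.
def enrich_prices(prices, reference):
    """prices: list of dicts with 'isin' and 'price';
    reference: dict mapping ISIN -> static data (name, sector, ...)."""
    enriched = []
    for p in prices:
        ref = reference.get(p["isin"])
        if ref is None:
            continue  # no reference data: exclude rather than guess
        enriched.append({**p, **ref})
    return enriched
```

In a real pipeline the same join would typically be done with a library such as pandas, but the principle is the one above: pricing and reference data meet on a common identifier before analytics runs.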

Quants and data scientists benefit from this through increased productivity and faster time to market. Many data analysts today are digging into the data for indicators that help them discover investment signals and returns in the market. Data scientists are examining historical data across asset classes to distil information into factors, including ESG criteria, that they can operationalise in their investment decision-making processes; increasingly, they are also starting to incorporate innovative data-science techniques, including AI and machine learning, into market analysis and investment processes.

Future focus

Today, technology, process, macroeconomic factors and business awareness are joining forces to bring analytics and data together. Cloud-based integration and processing services, combined with financial-data subject-matter expertise, bridge analytics and data management. This allows firms to effectively prime and explore financial information and so achieve "data alpha". The result for financial institutions is a new world of opportunity, in which they optimise costs, drive user enablement through better access to and awareness of available information, and maximise the overall value they get from their data.
