Banking and Big Data: the Perfect Match?

By Samantha Barnes, International Banker

Although not a new concept, big data is now gaining the world’s attention like never before. Some call it the “new oil”, given its growing reputation as a valuable, largely untapped resource. Indeed, today we are seeing data being unleashed across many different walks of life, as a growing global consensus believes it could dramatically transform the way the world works.

But as most enterprises will admit, much work still needs to be done to ensure that big data can be successfully utilised. Although this digital commodity is now in rapidly increasing abundance, extracting meaningful and actionable information from it remains a distinct challenge for its end users. This is especially true given that the overwhelming majority of "naturally occurring" big data is unstructured, which means that rigorous analytical techniques must first be applied if we are to benefit from its insights at all. Nevertheless, many industries now realise the potential of big data to radically advance business models; indeed, according to a recent study by the International Data Corporation (IDC), global revenues for big-data and business-analytics solutions will reach $260 billion by 2022, with a compound annual growth rate (CAGR) of 11.9 percent over the 2017-2022 forecast period. And the banking sector is set to be a principal driver of this growth.

The financial-services industry is already one of the world’s most data-driven. Banks currently have enormous quantities of customer data at hand, including through KYC (know your customer) compliance checks, customer activity at ATMs (automated teller machines), point-of-sales purchases and online banking profiles. And today, data and data analytics increasingly serve as the basis to not only understand more about customers but also to improve internal processes, such as operations and compliance, as well as greatly boost scalability.

One area in which banks are leveraging the advancements in big data is marketing. Big data can be used by marketers to gain a more granular understanding of customer preferences and, in turn, of which products and services to offer them. Indeed, banks can even use customer data to monitor their behaviour in real-time. For instance, a bank could send an offer to a customer based on her use of a particular smartphone app or on credit-card data from an item she purchased. Or if a potential customer visits the bank's website to browse a particular service—say, a loan service—a big-data marketing technique known as retargeting will allow the bank's loan offers to be displayed on other websites visited by the customer.

Big data is also helping banks to more accurately assess the creditworthiness of potential borrowers, thereby providing more insight into their customers. A 2018 study in the Journal of Business Research entitled “Big Data Techniques to Measure Credit Banking Risk in Home Equity Loans” found that big-data techniques could be successfully applied to massive financial datasets “for segmenting risk groups”. The paper concludes that by adopting such techniques, there would be “less risk for financial companies when predicting which clients will be successful in their payments”, and as such, “more people could have access to credit loans”.
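To make the idea of risk segmentation concrete, the sketch below groups borrowers into risk bands from a naive score built on repayment history and loan-to-value ratio. This is purely illustrative: the scoring weights, thresholds, and data are hypothetical inventions, not the techniques used in the study cited above.

```python
# Illustrative only: a naive borrower risk score and segmentation.
# Weights (0.7 / 0.3) and band cut-offs are hypothetical.

def risk_score(on_time_payments, missed_payments, loan_to_value):
    """Return a score in [0, 1]; higher means riskier."""
    total = on_time_payments + missed_payments
    miss_rate = missed_payments / total if total else 0.0
    return min(1.0, 0.7 * miss_rate + 0.3 * loan_to_value)

def segment(score):
    """Map a score to a coarse risk group."""
    if score < 0.2:
        return "low risk"
    if score < 0.5:
        return "medium risk"
    return "high risk"

borrowers = [
    {"id": "A", "on_time": 36, "missed": 0, "ltv": 0.40},
    {"id": "B", "on_time": 20, "missed": 4, "ltv": 0.85},
    {"id": "C", "on_time": 10, "missed": 8, "ltv": 0.95},
]

for b in borrowers:
    s = risk_score(b["on_time"], b["missed"], b["ltv"])
    print(b["id"], segment(s))
```

At scale, a bank would replace the hand-set weights with a model fitted to historical repayment outcomes, but the output is the same in spirit: each customer lands in a risk group that informs the lending decision.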

But are banks currently managing to exploit big data to its maximum potential? Most would suggest not. Having conducted a comprehensive analysis of the data-analytics maturity of more than 20 banks across European, Middle Eastern and African regions, McKinsey found that while they have generally constructed strong initial analytics foundations, “there is still room for them to improve performance”. The consulting firm also identified five areas in which banks can better utilise data analytics to improve performance:

  1. Align analytics priorities to strategic vision. Most banks still struggle to turn their analytics strategies into effective use cases. Although they are starting to use advanced techniques in some areas, top-down views limit the potential of analytics in their core strategic activities.
  2. Embed analytics into decision-making and workflows. Senior managers must do more to expand the use of analytics to have a full-scale impact. Some banks don’t have sufficient technical expertise to achieve this, while for others, the problem lies in the culture, which fails to support data-driven decision-making.
  3. Develop advanced-analytics assets and teams to scale. There need to be more large-scale analytics operations. McKinsey observed that more successful organizations can undertake such projects through advanced-analytics centres of excellence (COEs), which can help to expand their analytics scale, as well as the use of third-party vendors to develop “external capabilities, know-how, and assets”.
  4. Invest in critical analytics roles. Expand analytics teams to include sufficient technical personnel, including “data engineers, data scientists, visualization specialists, and machine-learning engineers”. Moreover, there needs to be more effective collaboration among these different roles. Translators who help data scientists understand business can provide a crucial link between business and analytics.
  5. Enable the user revolution. Control issues pertaining to data security, privacy and compliance leave banks’ data operations too constrained for analytics use cases to be widely developed across the organisation. Banks need to broaden their data practices to ensure their high-quality data can be used in more applications.

But attempting to induce any kind of meaningful expansion means that banks are likely to run up against one of their toughest challenges: ensuring their big data is supported by sufficiently robust infrastructure. Big data is often characterised by “the 3 Vs”:

Volume, which can extend into terabyte or even petabyte territory and thus render conventional data-processing and storage infrastructure too slow or altogether inadequate.

Velocity, referring to the speed of adding and processing new data (transaction data, for example). In some cases, near real-time insights will need to be generated if the data is to have any utility for the bank.
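A minimal sketch of what "velocity" demands in practice: scoring each transaction as it arrives, keeping only running statistics so nothing needs to be reprocessed in batch. The data and the z-score threshold here are hypothetical; production fraud systems use far richer features.

```python
# Illustrative only: flag outlier transaction amounts in a stream using
# Welford's online algorithm, so memory stays constant as data arrives.
import math

class RunningStats:
    """Online mean and variance (Welford's algorithm)."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def std(self):
        return math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0

def flag_anomalies(amounts, z_threshold=3.0, warmup=10):
    """Yield amounts that are extreme relative to the history so far."""
    stats = RunningStats()
    for amount in amounts:
        if stats.n > warmup and stats.std() > 0:
            z = abs(amount - stats.mean) / stats.std()
            if z > z_threshold:
                yield amount
        stats.update(amount)

stream = [52, 48, 50, 51, 49, 47, 53, 50, 48, 52, 51, 49, 5000, 50]
print(list(flag_anomalies(stream)))
```

Because the check runs per event with O(1) state, the same logic can sit inside a stream processor and produce near real-time alerts rather than next-day batch reports.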

Variety, in the form of vastly different types and structures of datasets. For banks, data can be in the form of everything from customer transactions and credit profiles to unstructured social-media posts.

And with these “3 Vs” growing in magnitude all the time, it’s no surprise that the outdated IT (information technology) infrastructure in place at most banks is unable to sufficiently cope with the demands of big data. Most legacy systems simply can’t collect, store and analyse the data efficiently, which could threaten the stability of the bank’s entire IT system. As such, banks will have to keep boosting their storage and processing capacities or, indeed, overhaul their existing systems altogether.

That said, the pros of effectively utilising big data easily outweigh the cons for banks at this stage. The potential to unleash a treasure trove of actionable insights, the opportunity to convert more new customers, the ability to more accurately manage risk, and the cost-saving and revenue-generating potential of this new resource mean that if properly handled, big data can propel banks into an exciting new age of efficiency.
