The term “Big Data” is familiar to anyone in the Financial Services (FS) industry. Ask around, however, and it quickly becomes clear that relatively few understand the extent of its penetration into core business operations, let alone its importance to competing in the marketplace over the next five to ten years.
A large proportion of FS companies’ capital expenditure over that period is likely to go towards developing and deploying Artificial Intelligence (AI) and Robotics technology, transforming IT infrastructure, and developing new products and services. The return on this outlay is not solely dependent on data – selecting the right strategy and executing projects efficiently are equally important – but none of these initiatives can succeed without a high volume of accurate data about both internal infrastructure and target customers. With that in mind, it’s worth reminding ourselves of the key constituents of Big Data, what it underpins and why it’s so important.
Big Data refers to the large volumes of structured and unstructured data that businesses manage. It can originate externally (e.g. from customers and clients) or internally (e.g. data on IT infrastructure or trading operations), and can be characterized by differing levels of the Five V’s.
The Five V’s of Big Data
The Digital Economy Forum – “How Your Digital Footprint Generates Big Data”
The ability to handle high Velocity, Variety and Volume of data (on the left of the V in our image) is of critical importance given the exponentially increasing number of sources, both machines in IT infrastructure and devices associated with the Internet of Things (IoT).
Technologies such as Hadoop (which provides massive storage for any kind of data, enormous processing power and the ability to handle virtually limitless concurrent tasks or jobs) and Splunk (which can collect unstructured, machine-generated log data and provide real-time analytics) can be used individually or in tandem for data collection and visualization. Tools like these (see “Hadoop and Splunk Use Cases”) give organizations an opportunity to bring data collection, storage and analysis up to speed with the competition quickly.
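As a rough illustration of the kind of processing these platforms perform at scale, the sketch below counts error events per host from unstructured log text, in the spirit of a MapReduce job or a Splunk search. It is plain Python, and the log lines and field names are invented for illustration:

```python
import re
from collections import Counter

# Invented, unstructured machine-generated log lines for illustration only.
RAW_LOGS = """\
2024-01-15T09:12:01 host-a app=trading level=ERROR msg="order rejected"
2024-01-15T09:12:03 host-b app=risk level=INFO msg="limits refreshed"
2024-01-15T09:12:07 host-a app=trading level=ERROR msg="timeout"
2024-01-15T09:12:09 host-c app=payments level=WARN msg="retrying"
"""

def map_phase(line):
    """Emit (host, 1) for each ERROR line -- the 'map' step."""
    match = re.search(r"(\S+) app=\S+ level=ERROR", line)
    if match:
        yield (match.group(1), 1)

def reduce_phase(pairs):
    """Sum counts per key -- the 'reduce' step."""
    totals = Counter()
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

pairs = (pair for line in RAW_LOGS.splitlines() for pair in map_phase(line))
print(reduce_phase(pairs))  # → {'host-a': 2}
```

In a real deployment the map and reduce steps would run in parallel across a cluster over terabytes of logs; the structure of the computation, however, is exactly this simple.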
Data Value and Veracity (on the right of the previous image) are in some ways more interesting topics to consider, as they are typically more difficult to achieve with a one-stop software solution.
Many large banks have developed their infrastructure over decades, with significant ongoing manual effort required to keep their infrastructure and data up to date.
Modernizing away from these legacy processes to deliver accurate and valuable data is neither cheap nor simple, but it is critical to the successful deployment of advanced technology such as Robotic Process Automation. It also enables strategic decision making, using advanced analytics to draw conclusions from the data.
Analysis carried out on high-quality data can be applied in many areas, including:
- Customer Targeted Offers; collecting data on customers’ previous spending habits and behaviors to feed targeted product offers. Collecting customer data for any purpose, of course, carries risk: regulations (particularly in Europe) require that customer data is collected, held and destroyed within a strict framework.
- Internal Infrastructure; some organizations rely on unreliable, manually collated data, making it difficult to make significant changes or improvements to the infrastructure estate, such as rolling out new technologies or moving services to new platforms. Banks with excellent data on their infrastructure can analyze usage, model changes and implement AI / machine-monitoring technology more efficiently.
- Risk Management; on the trading side, making a profit in an ultra-competitive market requires advanced tools and machine learning to carry out high-frequency trading. On the credit side, banks can now manage risk at a macro level, incorporating data from all their credit lending globally to ensure they aren’t taking on too much risk in any one area or in aggregate.
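The macro-level credit view in the last point can be sketched as a simple aggregation. The regions, exposure figures and 50% concentration threshold below are invented for illustration; a real bank would aggregate millions of positions against limits set by its risk appetite framework:

```python
# Hypothetical credit exposures by region, in millions (figures invented).
exposures = {
    "EMEA": 420.0,
    "APAC": 310.0,
    "Americas": 650.0,
}

CONCENTRATION_LIMIT = 0.50  # assumed threshold: flag any region above 50% of total

def concentration_report(book, limit):
    """Return each region's share of total exposure and whether it breaches the limit."""
    total = sum(book.values())
    return {
        region: {"share": amount / total, "breach": amount / total > limit}
        for region, amount in book.items()
    }

report = concentration_report(exposures, CONCENTRATION_LIMIT)
for region, row in report.items():
    print(f"{region}: {row['share']:.1%} of total, breach={row['breach']}")
```

The point is that this calculation is only meaningful if the underlying exposure data is complete and accurate across every lending business globally – precisely the Veracity and Value dimensions discussed above.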
Big Data underpins many of the technologies driving digital transformation in the financial sector, and can lead to substantial gains in efficiency, deep and genuine insights into trends, and the ability to make proactive decisions. While attempting to leverage Big Data does bring new risks (e.g. data theft, and regulations such as GDPR governing the storage and use of data), these are vastly outweighed by the potential benefits of its strategic use and application.
Though the ever-changing digital landscape can be challenging to grasp in its entirety, Big Data can create the clarity needed to make strategic decisions around current infrastructure use and highlight what is possible through migration to new services and platforms.
Perhaps most importantly, an organization’s mastery of Big Data underpins its ability to roll out new AI and Robotics technologies – which depend absolutely on large volumes of high-quality data.
Without a solid grasp of Big Data, it is near impossible for a player in the financial sector to be competitive in today’s market. To be a true leader, a company needs to maximize the benefits of Big Data and the technologies of the future that it enables.