This is the first in a series of blogs that explore how data can be an asset or a risk to organizations in an uncertain economic climate.
Humanity has always valued novelty. Since the advent of the Digital Age, this preference has driven change at an astonishing pace. For example, it is often estimated that more data was generated in the last two years than in all of prior human history, a figure made more staggering by Machine Learning and Artificial Intelligence tools that allow users to access and analyze data as never before. The question now is: how can business leaders and investors best make sense of this information and use it to their competitive advantage?
Traditionally, access to good data has been a limiting factor, and revolutionary business strategies were reserved for those who knew how to obtain, prepare, and analyze it. While top-tier decision making is still data- and insight-driven, today’s data challenges are characterized more by glut than scarcity, both in the sheer volume of information and in the tools available to make sense of it. Even today, only an estimated 0.5% of the data produced is ever analyzed.
This overabundance of information and tooling has, ironically, led to greater uncertainty for business leaders. Massive data sets and powerful, user-friendly tools often mask underlying issues: many firms maintain and process duplicate copies of their data, creating silos of critical but unconnected information that must be sorted and reconciled. Analysts still spend roughly 80% of their time collecting and preparing data and only 20% actually analyzing it.
Global interconnectivity is making the world smaller and more competitive, and regulators, who understand the power of data, are tightening controls over it. Now, more than ever, it is critical for firms to act. To remain competitive, organizations must understand the critical data that drives their business so they can put it, along with alternative data sets, to work in future decision making; otherwise they face obsolescence. These are not just internal concerns. Clients are requesting more customized services and demanding to know how firms use their information. Firms must identify their critical data and recognize that not all data is, or should be, treated the same, so they can extract the full power of the information while meeting client and regulatory requirements.
Let’s picture data as an onion. Just as the core of the onion supports its outer layers, the ‘core’, or critical enterprise data, supports all the functions of a business. When the core is strong, so is the rest of the structure. When the core is contaminated or rotten, that is a problem, for the onion and for your company.
Data that is core to a business – information like client IDs, addresses, products and positions, to name a few examples – must be solid and healthy enough to support the outer layers of data use and reporting in the organization. This enterprise data must be defined, clean and unique, or the firm will waste time cleaning and reconciling it, and the client, business and regulatory reports that it supports will be inaccurate.
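To make this concrete, here is a minimal sketch in Python (pandas) of the kind of check that flags when core reference data is not clean or unique. The records, field names and rules are illustrative assumptions, not a prescription for any particular firm’s data model.

```python
import pandas as pd

# Illustrative client reference records (hypothetical data).
clients = pd.DataFrame([
    {"client_id": "C001", "name": "Acme Corp",  "country": "US"},
    {"client_id": "C001", "name": "ACME Corp.", "country": "US"},   # duplicate ID, inconsistent name
    {"client_id": "C002", "name": "Globex Ltd", "country": None},   # missing attribute
])

# Core data should be unique per client and fully populated.
duplicate_ids = clients[clients.duplicated(subset="client_id", keep=False)]
missing_fields = clients[clients.isna().any(axis=1)]

print(f"{len(duplicate_ids)} records share a client_id; "
      f"{len(missing_fields)} records have missing attributes")
```

Checks like these are the unglamorous part of the 80% spent on preparation, but they are what keep the outer layers of reporting trustworthy.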
How do you source, define and store your data so you can cleanly extract the pieces you need? Look at the onion again. You could take the chainsaw approach to slicing it, which would give you instant access to everything inside, good and contaminated, and would probably spoil your dish. Likewise, if the data at the core is bad, any calculations you perform on it, or reports that aggregate it, will be wrong. If you need a clean slice of onion for a specific recipe (or calculated data for a particular report), the precision and cleanliness of the slice (good core data and a unique contextual definition) is key.
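To illustrate the “chainsaw” point, a small hypothetical sketch (again in pandas) shows how aggregating over contaminated core data silently inflates a report, while the same calculation over de-duplicated records yields the figure the recipe actually calls for.

```python
import pandas as pd

# Positions keyed to hypothetical client IDs (illustrative values only).
positions = pd.DataFrame([
    {"client_id": "C001", "market_value": 1_000_000},
    {"client_id": "C001", "market_value": 1_000_000},  # duplicate row from an unreconciled silo
    {"client_id": "C002", "market_value":   250_000},
])

# "Chainsaw" aggregation over everything, duplicates included: overstates exposure.
raw_total = positions["market_value"].sum()                       # 2,250,000

# Clean slice: remove duplicates first, then aggregate.
clean_total = positions.drop_duplicates()["market_value"].sum()   # 1,250,000

print(f"raw total: {raw_total:,} vs. clean total: {clean_total:,}")
```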
Once your core data is unique, supported and available, clients, business and corporate users can combine it with alternative and non-traditional data sets to extract information, enhance it and add value. As demand for new “recipes” of data (for management, client or regulatory reporting) keeps increasing, firms that do not clean up and leverage their core data effectively will become obsolete. These demands range from data needed for instant access and client reporting across different form factors (e.g. Web, iOS and Android apps), to data visualization and manipulation tools for employees analyzing new and enhanced information to spot trends. Demand also stems from the numerous requirements needed to comply with the complex patchwork of regional financial regulations across the globe. Many different users, many different recipes, all reliant on the health of the same core data (the heart of the onion).
What is the actionable advice when you read a headline like: “A recent study in the Harvard Business Review found that over 92% of surveyed firms agreed that data analytics for decision making will be more important two years from now”? We have some ideas. In this blog series, FRG Data Advisory & Analytics will take you through several use cases to outline what data is foundational, or core, to business operations and how to achieve the contextual precision demanded by the market and regulators in our current environment of uncertainty, highlighting how data can be an asset, or a potential risk if not treated appropriately.
Dessa Glasser, Ph.D., is an FRG Principal Consultant with 30+ years of experience designing and implementing innovative solutions and organizations in data, risk, and analytics. She leads the Data Advisory & Analytics Team for FRG and focuses on data, analytics and regulatory solutions for clients.
Edward Hanlon is a Senior Consultant and Engagement Manager on FRG’s Data Advisory & Analytics Team. He focuses on development and implementation of data strategy solutions for FRG, leveraging previous experience launching new Digital products and reengineering operational models as a Digital Technology platform owner and program lead in financial services.