Anticipating that it would become the currency fuelling economic growth, the World Economic Forum identified data as an asset class in 2011.
A decade later, data has reinforced its value, with the majority of data-driven businesses reaping critical advantages during the current crisis.
It is therefore no surprise that the push to “become even more data-driven” is a prominent fixture in economic recovery plans. For companies looking to leverage data to build back better, much has been said about the need for strong analytics capabilities, and rightfully so. However, the equally important discipline of data management remains underemphasised.
Good data management fundamentally means a user can access the right data at the right time in the right place. Achieving this requires organisations to treat data the way prudent individuals handle their finances.
Bring together assets to increase ‘net worth’
Just as COVID-19 has forced us to review how we deploy our personal finances, Australians can now make better decisions through an on-demand, consolidated view of their deposits, withdrawals, investments, pensions, home loans, and more. And just as a consumer’s information can end up confined in separate places, many businesses have a data silo problem. In fact, two in five data-driven organisations still suffer from a fragmented view of their business’s growth trajectory, owing to the presence of at least 50 internal data silos.
These silos include unstructured files and datasets stuck on employees’ PCs and smartphones, on servers and all-flash arrays on office premises, in data centres, and in public clouds like AWS, Microsoft Azure, or Google Cloud.
The baseline process for weaving these together should include securing organisational buy-in, discovering where data is stored, connecting distributed data ‘storage units’ so they work as a single system, governing how data is used, and ensuring adherence to cybersecurity protocols.
While these steps can seem like a heavy undertaking, the results are worthwhile. On average, data management leaders have reported 69 per cent more revenue, 57 per cent more profits, and 72 per cent greater customer satisfaction than laggards.
To overcome limited time and in-house skills, leaders like AstraZeneca have turned to automated software capabilities to ensure their data stores are interwoven and delivering value. The company’s COVID-19 vaccine production efforts were accelerated by its ability to consolidate data from two billion doses being simultaneously deployed across the globe. Crucial to this was its creation of a software-driven data fabric across its data centres and multiple clouds.
‘Stash or withdraw’ based on your current need
Once an enterprise has developed the tapestry that connects disparate sources of information, it must start treating the data within these sources like bank notes.
We treat and use $100 and $5 bills differently based on their value. When you are about to purchase a bigger-ticket item, you ensure the right amount is in your wallet. Any cash not planned for immediate use is stored elsewhere for safekeeping.
In a similar way, not all data is equally valuable. Data that is accessed in the immediate term – also known as ‘hot data’ – is worth keeping ‘nearby’. ‘Cold data’ that is not being used should be stashed away until it is needed again.
There are two reasons for this. Firstly, businesses will inevitably generate more and more data. A healthcare provider’s data volumes, for example, already grow by as much as one terabyte per day.
Secondly, just as it is impossible to stuff your life savings into your wallet, all-flash arrays on office premises let users access data quickly but have limited capacity. Like a savings account, the cloud is a more sensible place to store most of your data, with software fulfilling the ‘ATM function’ of dispensing information wherever and whenever it is needed. This capability, known as tiering technology, automatically identifies inactive data and moves it into cloud storage units for the price of a cup of coffee.
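At its core, a tiering policy boils down to checking when each piece of data was last accessed and relocating anything that has gone cold. The following is a minimal sketch of that idea, assuming hypothetical local directories standing in for a flash array and a cloud bucket; commercial tiering software does this continuously and transparently rather than as a one-off script.

```python
import os
import shutil
import time

# Illustrative threshold: real tiering products tune this per workload.
COLD_AFTER_DAYS = 90

def tier_inactive_files(hot_dir: str, cold_dir: str, now: float = None) -> list:
    """Move files not accessed within COLD_AFTER_DAYS from hot to cold storage.

    'hot_dir' and 'cold_dir' are hypothetical paths standing in for an
    on-premises flash array and a cloud storage tier.
    """
    if now is None:
        now = time.time()
    cutoff = now - COLD_AFTER_DAYS * 24 * 3600
    os.makedirs(cold_dir, exist_ok=True)
    moved = []
    for name in os.listdir(hot_dir):
        path = os.path.join(hot_dir, name)
        # st_atime records the last access time; anything older than the
        # cutoff is considered 'cold' and relocated.
        if os.path.isfile(path) and os.stat(path).st_atime < cutoff:
            shutil.move(path, os.path.join(cold_dir, name))
            moved.append(name)
    return moved
```

The hot tier keeps only recently touched files, so its limited, expensive capacity is reserved for the data users actually need at speed, while everything else sits in cheaper cold storage until it is accessed again.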
For data to fulfil its promise as the currency of our new economy, it must ultimately be stored cost-efficiently and allowed to flow seamlessly across a fabric-like architecture. These technologies are making it more feasible than ever for organisations to harness the bits and bytes they need for unearthing new business opportunities, accelerating innovation, and optimising operations.