How Much Data Is Enough?
Data warehouse management and data analytics have always faced the challenge of deciding what data to store and how long to keep it. This is even more relevant with today’s Data Lakes and the ability to store ever-growing volumes of information cheaply in the cloud. In addition, new data types, such as IoT and social media feeds, have emerged that generate millions of records that may or may not be important to analytics at a later stage.
Regulatory compliance is often used as the benchmark for what data should be stored and for how long. Retention periods can range from a few years to decades, or even indefinitely. In the airline industry, aircraft maintenance records must be held for the life of the asset plus 7 years; for most aircraft, that means records need to be retained for 20-30 years. The 7-year mark is another common benchmark: it is mainly relevant to financial and tax data but has been adopted as the regulatory retention policy in many other cases.
Whilst regulatory retention compliance is fairly clear-cut and well defined, what about the other data types that make up the majority of corporate data?
We can probably all agree that the value of raw data diminishes over time to the point that it is no longer relevant. Aggregated data, on the other hand, might never reach the point of having no value. A good example is trend information, such as stock prices or stock index histories, which remains somewhat relevant even after 100 years.
The problem is that we don’t know whether data might be useful in the future. New analytics models might want to test new patterns and formulas over longer periods of time and in different market situations. With that in mind, many data managers are reluctant to remove data, with the result that organisations often end up with large pools of stale data that is increasingly hard to manage.
It all comes down to a cost/benefit assessment, which is itself problematic: how do we put an easily measurable value on data? One way is to assess usage patterns and translate data access into a data governance metric that informs the retention policy.
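One way the access-based metric above could be sketched is with an exponentially decayed access count, so that recent reads weigh more than old ones. This is an illustrative assumption only; the function name, the half-life, and any thresholds a retention policy would apply to the score are hypothetical, not a standard metric.

```python
from datetime import datetime, timedelta

def retention_score(access_dates, now, half_life_days=180):
    """Sum of exponentially decayed accesses: a read today counts 1.0,
    a read one half-life ago counts 0.5, and so on."""
    score = 0.0
    for d in access_dates:
        age_days = (now - d).days
        score += 0.5 ** (age_days / half_life_days)
    return score

now = datetime(2020, 1, 1)
# A dataset read recently scores higher than one last touched years ago.
hot = [now - timedelta(days=n) for n in (1, 5, 30)]
stale = [now - timedelta(days=n) for n in (700, 900)]
print(retention_score(hot, now) > retention_score(stale, now))  # True
```

A governance process could then rank datasets by this score and flag the lowest-scoring ones for archive or deletion review, rather than relying on gut feel.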
A clearly defined data governance framework must be put in place. Its cornerstone is a clear understanding and profiling of the data that is available. Secondly, a retention policy must be defined. This should not be a sweeping statement across the organisation but should be set within the individual data domains.
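A per-domain retention policy could be expressed as simple structured configuration rather than a single blanket rule. The domains, periods, and field names below are hypothetical examples for illustration, not recommendations.

```python
# Illustrative per-domain retention rules (None = retain indefinitely).
# Domain names and periods are hypothetical, not recommended values.
RETENTION_POLICY = {
    "aircraft_maintenance": {"raw_years": None},  # life of asset + 7 years
    "financial":            {"raw_years": 7},
    "iot_telemetry":        {"raw_years": 1},
    "social_media":         {"raw_years": 2},
}

def raw_retention_years(domain):
    """Look up the raw-data retention period for a data domain."""
    return RETENTION_POLICY[domain]["raw_years"]

print(raw_retention_years("financial"))  # 7
```

Keeping the policy in one machine-readable place makes it auditable and lets archiving jobs enforce it per domain instead of applying one organisation-wide rule.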
Some hard decisions need to be made on how long raw data should be held. As hard as it sometimes is to let go of data, the cost/benefit ratio often does not warrant further retention, even when records are held in low-cost archive storage.
Let Fusion Professionals work with you to develop the most appropriate Data Governance framework and strategy for your organisation.
Achim Drescher is the Managing Consultant of the Big Data and Analytics Practice at Fusion Professionals.
With 30 years in the IT industry he is an Expert in Enterprise Software and Data Architecture, Data Governance frameworks and modern analytics platforms for Big Data and Data Lakes.