Why Companies Need Data Validation

Some 40 percent of business initiatives fail due to poor-quality data. Michael Ludwig, chief product architect for Blazent, makes the case for data validation.

To make informed, effective business decisions, CIOs depend heavily on the company’s ability to gather, align and analyze comprehensive data. Too often the emphasis is placed on data analysis without factoring in the complexity of collecting and aligning that data in the first place. It’s all too easy to put the cart before the horse when industry buzz and analyst firms are touting the power of Big Data analytics and business intelligence to impact the bottom line. However, CIO decisions are only as good as the data the enterprise collects, and that data is only as good as its accuracy and its ability to produce meaningful analytics.

Big Data, as its name implies, represents an avalanche of information pouring in from across the enterprise. As the IT data landscape continues to be fragmented by the consumerization of compute resources, advanced performance analytics and a proliferation of third-party service offerings, it is increasingly difficult for enterprise IT organizations to holistically collect, integrate and process the data being generated so that it produces meaningful, accurate analytics for decision making. Not surprisingly, as the complexity of IT environments continues to rise, so does the level of inaccuracy in enterprise data. According to Experian, upwards of 92 percent of organizations suspect their customer and prospect data is inaccurate. Additionally, Gartner credits poor data quality as a primary reason why 40 percent of all business initiatives fail to achieve their targeted benefits.

The 40% Syndrome

These sobering statistics lead to an alarming conclusion: at any given moment, an estimated 40 percent of the IT data fueling an enterprise’s work-stream efficiencies or driving its decision making is missing or wrong. The business ramifications of this staggering figure are evident to all who work within enterprise IT:

• IT compliance issues that can drive up costs
• Operational shortfalls that can increase risk and security exposure
• Lack of governance and visibility into the IT landscape
• Failing IT or software audits
• Overspending on maintenance and support for legacy or underperforming assets
• Long triage and resolution times for incidents and problems affecting SLAs

No one wants a failed IT initiative, but avoiding one is easier said than done. Few organizations are able to keep up with the three V’s of Big Data: volume, velocity and variety. And if they are, they encounter the most important, often overlooked, “V” of all: validation. That is why data quality has emerged as such a critical issue.

Addressing Data Validation

While the 40 percent syndrome is a frightening reality, acknowledging its existence gives IT executives the opportunity to address the root cause of the issue: poor data quality. It doesn’t matter how fast, how much, or how diverse the data an enterprise collects if that data is misaligned, missing key attributes, or unreliable. Bad data dampens workflow effectiveness and is dangerous as input to decision-making models.

The benefit of data validation is the ability to mathematically interrogate the veracity of the sources of data flowing into the enterprise and to apply scoring models during data consolidation so that the final derived dataset is of the highest quality obtainable. Using this methodology, IT can obtain an accurate contextual history of its data across the environment, gauge the time needed to address and resolve current issues, and gain insight into future problems.
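To make the scoring idea concrete, here is a minimal sketch of how a source-veracity score might be computed during consolidation. The article does not describe an actual scoring model, so the signals (attribute completeness, freshness of the last observation, agreement with peer sources) and the weights below are illustrative assumptions, written in Python.

from datetime import datetime, timezone

# Illustrative weights for judging a source's trustworthiness. These signals
# and weights are assumptions made for this sketch, not a product's model.
WEIGHTS = {"completeness": 0.4, "freshness": 0.35, "agreement": 0.25}

def completeness(record: dict, required_fields: list[str]) -> float:
    """Fraction of required attributes the source actually supplied."""
    present = sum(1 for f in required_fields if record.get(f) not in (None, ""))
    return present / len(required_fields)

def freshness(last_seen: datetime, horizon_days: float = 30.0) -> float:
    """1.0 for data observed just now, decaying linearly to 0 over the horizon.
    last_seen is assumed to be a timezone-aware datetime."""
    age_days = (datetime.now(timezone.utc) - last_seen).total_seconds() / 86400
    return max(0.0, 1.0 - age_days / horizon_days)

def agreement(value, peer_values: list) -> float:
    """Share of peer sources reporting the same value for an attribute."""
    if not peer_values:
        return 1.0
    return sum(1 for v in peer_values if v == value) / len(peer_values)

def source_score(record: dict, required: list[str],
                 last_seen: datetime, peers: dict) -> float:
    """Combine the three signals into a single veracity score in [0, 1]."""
    avg_agreement = sum(
        agreement(record.get(f), peers.get(f, [])) for f in required
    ) / len(required)
    return (WEIGHTS["completeness"] * completeness(record, required)
            + WEIGHTS["freshness"] * freshness(last_seen)
            + WEIGHTS["agreement"] * avg_agreement)

A score like this can then weight each source's contribution when conflicting records are merged, which is the role the next example sketches.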

In the case of a configuration management database (CMDB), companies often find themselves with a CMDB populated with old, incomplete, unverified data because information is collected from manual sources and a small number of electronic tools. By leveraging the right data management and analysis solution – one that not only adapts to data volume, velocity and variety, but also provides data validation – enterprises can gain direct, real-time insight into a comprehensive inventory of all the configuration items (CIs) in their environment. By doing so, organizations can attain a level of visibility that includes CIs that were not previously known to exist, as well as a complete view of all the relationships among these various entities. With this knowledge, IT can quickly perform incident triage, problem resolution, and change management; provide accurate service delivery; and implement on-budget infrastructure investment and planning.
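As a hypothetical illustration of what such consolidation can look like, the sketch below merges reports about a single CI from several discovery sources, lets the highest-scored source win each attribute, and records provenance. The function name, data shapes and example sources are assumptions for illustration, not a description of any particular product's behavior.

# Hypothetical consolidation of one configuration item (CI) reported by
# several discovery tools: for each attribute, keep the value from the
# highest-scoring source and record which source supplied it.

def reconcile_ci(reports: list[dict], scores: dict) -> dict:
    """
    reports: [{"source": "tool_a", "attrs": {"hostname": ..., ...}}, ...]
    scores:  per-source veracity scores, e.g. produced by source_score() above.
    Returns a consolidated CI record plus attribute-level provenance.
    """
    merged, provenance = {}, {}
    # Walk sources from most to least trusted so better data wins conflicts.
    for report in sorted(reports, key=lambda r: scores.get(r["source"], 0.0),
                         reverse=True):
        for attr, value in report["attrs"].items():
            if value in (None, "") or attr in merged:
                continue  # keep the value already taken from a better source
            merged[attr] = value
            provenance[attr] = report["source"]
    return {"attrs": merged, "provenance": provenance}

# Example: two sources disagree on the OS version; the higher-scored one wins.
reports = [
    {"source": "agent_scan", "attrs": {"hostname": "db01", "os": "RHEL 8.9"}},
    {"source": "manual_sheet", "attrs": {"hostname": "db01", "os": "RHEL 7"}},
]
print(reconcile_ci(reports, {"agent_scan": 0.92, "manual_sheet": 0.41}))

Keeping attribute-level provenance is what makes the consolidated CMDB auditable: when a value is questioned, IT can trace it back to the source that supplied it.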

Infrastructure Management

Another example of the importance of data validation comes into play at the heart of infrastructure management. Asset lifecycle management is a challenging task that can only be accomplished when the enterprise has a clear understanding of the active or inactive state of the components delivering services. The issue arises from the traditionally manual process of lifecycle status management, which relies on people across different disciplines to work together to ensure that statuses are correctly identified and maintained. Not surprisingly, this method consistently fails due to process breakdown. Without accurate asset lifecycle status information, enterprises suffer from poor change management planning, resulting in increased incidents, poor financial management and more complex resource planning. With the help of data validation, IT can identify the location of each asset, recognize which assets are active, inactive, or underutilized, and make informed decisions about the future need for and relevancy of all its assets. Ultimately, data validation arms IT with the actionable information it needs to accurately assess asset investments and reduce long-term wasteful spending.
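One simple, hypothetical way to apply validation here is to cross-check an asset's recorded lifecycle status against observed activity, such as its last monitoring heartbeat, and flag mismatches for follow-up. The field names and time thresholds below are assumptions chosen for the sketch.

from datetime import datetime, timedelta, timezone

# Hypothetical cross-check of an asset's recorded lifecycle status against
# observed activity (here, a last-seen timestamp from monitoring).
# The windows are illustrative assumptions.
ACTIVE_WINDOW = timedelta(days=14)   # seen recently: effectively active
IDLE_WINDOW = timedelta(days=90)     # silent this long: likely inactive

def validate_asset(asset: dict, now: datetime | None = None) -> dict:
    """asset: {"id": ..., "recorded_status": str, "last_seen": aware datetime}"""
    now = now or datetime.now(timezone.utc)
    age = now - asset["last_seen"]
    if age <= ACTIVE_WINDOW:
        observed = "active"
    elif age <= IDLE_WINDOW:
        observed = "underutilized"
    else:
        observed = "inactive"
    return {
        "id": asset["id"],
        "recorded_status": asset["recorded_status"],
        "observed_status": observed,
        # A mismatch is the actionable signal, e.g. maintenance still being
        # paid on an asset that has not been seen on the network for months.
        "mismatch": observed != asset["recorded_status"],
    }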

When it comes to Big Data, leveraging the right data management and analysis solution can empower companies to maximize real-time IT data intelligence while simultaneously minimizing the cost and effort of managing their data. The proper foundation enables complete, accurate and auditable IT data collection and validation. It also ensures that enterprises have the highest data quality for both up-to-date decision making and historical analysis. With these pieces in place, companies can have confidence in the reliability of their data to conduct intelligent analytics.


About Michael Ludwig

Michael Ludwig, chief product architect at Blazent, has 30 years of IT leadership experience. He has held executive-level operational, service management, and software development roles in several industry verticals, including telecommunications, aerospace, defense, and education. As chief product architect, he is driven to provide forward-looking products that solve real-world customer problems.
