Improve data management and quality with the Data Fabric strategy

To improve performance, companies need to manage their data properly and raise its quality, and that means moving away from traditional methods.

It is well known that extracting value from external and internal data requires a focus on both data governance and data quality. Gartner defines data governance as “the definition of decision rights and accountability frameworks for ensuring appropriate behavior in relation to the assessment, creation, consumption, and control of data and analytics.” Data quality is largely determined by how accurate and current the information is; without accurate data, and without knowing who in the organization is using it, it is very difficult to extract that value.

Although data management and data quality are known to be of paramount importance, and despite tremendous advances in data technology and capabilities, organizations still struggle to ensure both.

A recent study by EY found that 41% of organizations cite the quality of their data as their biggest challenge. Gartner estimates that poor data quality costs organizations an average of $12.9 million per year.

In addition, the EY report found that 14% of organizations have problems accessing technology infrastructure and the associated data. Without adequate access to technology and data infrastructure, it is extremely difficult for companies to implement an effective data management framework.

Challenges related to data centralization

Many of the barriers preventing companies from achieving their data quality and data management goals stem from the traditional, centralized approach to data. As an organization grows, the proliferation of operational systems creates data silos. Companies have tried to overcome this problem by gathering data from these sources in one place. While this reasoning went largely unchallenged for years, the growing volume and complexity of data has exposed significant challenges with it.

For example, integrating new data sources into a centralized environment requires considerable time and effort. The cost of centralization is significant, given the investment in storage, compute, and interfaces, and the task of standardizing data formats across all sources. Meanwhile, data silos keep growing because there is a natural separation between those who create and consume data and the data engineers experienced with big data tools: engineers lack business and domain expertise, while data product owners lack technical expertise. As a result, organizations cannot see how data is consumed across the business.

The technical aspects of data centralization can also compound the negative effects of internal data politics: competing departments may refuse to share their data assets with one another. The lack of visibility and accessibility in a centralized environment can encourage hoarding of data assets, costing the enterprise many data monetization opportunities.

Data consolidation issues in a centralized environment also lead to the use of outdated data. For example, as an organization grows over time, a third party may interact with many different business units, each running a different operational system. This leads to a lack of data synchronization: some records are up to date while others are no longer accurate. This hinders insight and knowledge discovery, and thus affects business outcomes.
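
To illustrate the synchronization problem, here is a minimal Python sketch: two hypothetical systems hold conflicting versions of the same supplier record, and only a shared reconciliation rule (here, a last-updated timestamp, an assumption not drawn from the article) reveals which one is current.

```python
from datetime import datetime, timezone

# Hypothetical versions of the same supplier record, held by two
# business units in separate operational systems. Field names are
# illustrative, not taken from any real schema.
crm_record = {"supplier_id": "S-1001", "address": "12 Old Street",
              "updated_at": datetime(2021, 3, 4, tzinfo=timezone.utc)}
billing_record = {"supplier_id": "S-1001", "address": "48 New Road",
                  "updated_at": datetime(2023, 9, 1, tzinfo=timezone.utc)}

def freshest(*records):
    """Pick the most recently updated version of a record.

    Without a shared reconciliation rule like this timestamp, the two
    systems silently disagree and analytics may consume stale data.
    """
    return max(records, key=lambda r: r["updated_at"])

print(freshest(crm_record, billing_record)["address"])  # "48 New Road"
```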

Finally, companies cannot control how their data is used. When data is centralized, it is difficult to put fine-grained, least-privilege access controls in place, so achieving corporate governance and compliance is a challenge.

A new decentralized approach to data

It is therefore clear that the traditional, centralized approach presents organizations with many challenges to overcome. An alternative strategy is to adopt a decentralized approach. The Data Fabric concept – one of Gartner’s top strategic technology trends for 2022 – can help: it relies on multiple data management technologies working in tandem to improve data retrieval and integration across the company’s ecosystem.

One such technology is data virtualization, which allows access to data from any operational source without having to replicate it. In other words, instead of copying data from an operational source into a central data warehouse, data sets can be viewed and analyzed (even with complex AI techniques) where they reside. A true Data Fabric approach also enables virtual data lakes to be created in real time, as needed; data lakes can be spun up and removed at any time without affecting existing applications and infrastructure.
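
As a rough illustration of the idea (not any vendor’s actual implementation), the Python sketch below simulates two operational sources with in-memory SQLite databases and runs a query against them in place, combining results on demand rather than copying rows into a central store.

```python
import sqlite3

# Two independent operational sources, simulated here with in-memory
# SQLite databases. In a real Data Fabric these would be live systems
# (CRM, ERP, etc.) reached through connectors; nothing is copied into
# a central warehouse.
def make_source(rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return conn

sources = [
    make_source([("EMEA", 120.0), ("APAC", 75.5)]),
    make_source([("EMEA", 60.0), ("AMER", 200.0)]),
]

def virtual_query(sql):
    """Fan the same query out to every source at request time and
    combine the results, instead of materializing a central copy."""
    rows = []
    for conn in sources:
        rows.extend(conn.execute(sql).fetchall())
    return rows

# A 'virtual data lake' in miniature: assembled on demand, discarded
# after use, with no change to the underlying systems.
print(virtual_query("SELECT region, amount FROM orders WHERE amount > 70"))
```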

This provides a simpler and more cost-effective alternative to integrating data sources and providers, and enables a single view of data flows across the enterprise. With this level of visibility, organizations can act on data in several ways. First, through advanced attribute- and role-based access controls, visibility and access can be restricted at the most granular level, allowing governance decisions to be better enforced.
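
A minimal sketch of what attribute- and role-based restriction can look like in practice; the roles, attributes, and policy rules below are hypothetical and do not come from the article.

```python
# Hypothetical data rows tagged with attributes; roles map to policy
# predicates that are checked row by row.
ROWS = [
    {"customer": "Acme", "region": "EMEA", "sensitivity": "public"},
    {"customer": "Globex", "region": "AMER", "sensitivity": "restricted"},
]

POLICIES = {
    # role -> rule deciding, per row, what the user may see
    "analyst": lambda user, row: (row["sensitivity"] == "public"
                                  and row["region"] == user["region"]),
    "compliance": lambda user, row: True,  # unrestricted visibility
}

def visible_rows(user):
    """Enforce the policy at query time, at the granularity of a row."""
    allowed = POLICIES.get(user["role"], lambda u, r: False)
    return [row for row in ROWS if allowed(user, row)]

print(visible_rows({"role": "analyst", "region": "EMEA"}))    # Acme only
print(visible_rows({"role": "compliance", "region": "AMER"}))  # everything
```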

Second, because data resources are more accessible, organizations can coordinate data sharing between teams and reduce isolated data resources. This ability to dynamically improve data usage is, according to Gartner, part of the true value of Data Fabric: the research firm says the analytics built into a Data Fabric can reduce data management effort by up to 70% and speed up time to value.

Importantly, the Data Fabric approach does not mean abandoning existing central data lakes or warehouses; rather, it integrates the data within them as part of a dynamic and resilient infrastructure. A Data Fabric can be used by any application or platform, and it allows data to be enriched, processed, and visualized at any time, so companies no longer need to lock their data in silos or replicate it across multiple applications.

Organizations seeking to improve business outcomes by modernizing data quality, management, and discoverability need to consider their end-to-end data approach and ask themselves whether a traditional, centralized approach can help them achieve their goals. A Data Fabric strategy can certainly do just that.

Author:

Dr. David Amzalaj – Head of Product and Digital Transformation, BlackSwan Technologies.

Source: IDG Connect
