Data Quality Challenging but Essential for Impactful Analytics

Enterprises in Vietnam are keen to adopt advanced analytics to drive their business decisions, but they often struggle with data quality issues as they work to move legacy data infrastructures to the cloud.

Most Vietnamese businesses have yet to build infrastructures that can deliver data analytics across their organization. They recognize the need for a platform that runs on cloud architectures and are eager to migrate their data to tap the benefits these modern technologies offer, such as data visualization and real-time analytics.

Ensuring data quality, though, is proving to be a major challenge, especially as organizations migrate and modernize their applications for the cloud. Without a proper framework to safeguard data quality, they can end up with missing data and incompatible data formats. This, in turn, can lead to poor data insights and ineffective business decisions.
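To make this concrete, a migration pipeline can validate each batch of records before loading it, flagging the missing values and incompatible formats described above. The following Python sketch is a minimal, hypothetical illustration; the column names and rules are assumptions, not part of any specific vendor's framework.

```python
import pandas as pd

# Hypothetical quality rules for a customer table being migrated.
# Column names and checks are illustrative assumptions.
REQUIRED_COLUMNS = ["customer_id", "signup_date", "country_code"]

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of data quality issues found in one batch."""
    issues = []

    # Missing columns indicate an incompatible source schema.
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        issues.append(f"missing columns: {missing}")

    # Null checks catch records dropped or truncated during migration.
    for col in set(REQUIRED_COLUMNS) & set(df.columns):
        null_rate = df[col].isna().mean()
        if null_rate > 0:
            issues.append(f"{col}: {null_rate:.1%} null values")

    # Format checks catch incompatible encodings, e.g. dates stored as text.
    if "signup_date" in df.columns:
        parsed = pd.to_datetime(df["signup_date"], errors="coerce")
        bad = parsed.isna() & df["signup_date"].notna()
        if bad.any():
            issues.append(f"signup_date: {bad.sum()} unparseable values")

    return issues
```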

Vietnamese companies are not alone in this predicament: According to Gartner research, poor data quality costs organizations an average of $12.9 million a year, with a direct impact on revenue and the ability to make good business decisions.1 The research firm suggests that this problem is likely to intensify as data environments grow increasingly complex. Companies with multiple business units across different regions, in particular, and those with a wide ecosystem of suppliers, customers and partners will experience the greatest data quality challenges.

As it is, 52% of companies point to data quality and data access as top challenges in scaling and operationalizing their artificial intelligence deployments, according to research firm IDC.2 Data preparation alone consumes half of the time spent on these deployments.

It is not surprising, then, that companies are dedicating resources to address data quality issues. In fact, the focus on data quality will heighten as more organizations tap data analytics to drive business decisions. Gartner forecasts that by 2022, 70% of companies will track data quality levels via metrics, improving data quality by 60%.3 It estimates that a 10% improvement in customer data quality, for instance, can lead to a 5% improvement in customer responsiveness, as quality, trusted data enables companies to provide better service and do so more quickly.

Proper Framework Key to Ensuring Data Quality
Cloud vendors such as Amazon Web Services (AWS) understand this and offer a host of solutions designed to help enterprise customers govern data quality and prepare their data for analytics and machine learning models. A robust framework is also critical to ensuring the right processes are in place to guide an organization's cloud migration and application modernization efforts. Each organization needs to find the most suitable framework, which can be identified during the discovery phase or adapted and customized from existing frameworks developed by cloud vendors.

AWS partner TechX, for example, helps customers design a three-stage roadmap for deploying its data analytics solutions. This encompasses an initial assessment of the customer's data maturity and challenges, ensuring the company's data strategy and architecture are aligned with its business strategy.

In the second stage of the roadmap, TechX walks the customer through its migration and modernization plan, including moving data from on-premises systems and developing reusable frameworks for data ingestion, ETL (extract, transform, load) and data governance. Automated, reusable ETL frameworks deliver standardized processes by which data is ingested, transformed and enriched, as sketched below. These processes accelerate the data platform implementation and help ensure data quality.
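As a simple illustration of what such a reusable framework standardizes, the sketch below composes ingestion-time cleanup steps into a single pipeline that can be applied to every batch. This is a hypothetical Python example, not TechX's actual framework; the stage functions and the txn_date column are assumptions.

```python
from typing import Callable, List
import pandas as pd

# A stage is any function that takes a DataFrame and returns a DataFrame,
# so ingestion, transformation and enrichment steps share one interface.
Stage = Callable[[pd.DataFrame], pd.DataFrame]

def run_pipeline(df: pd.DataFrame, stages: List[Stage]) -> pd.DataFrame:
    """Apply each standardized stage in order to one batch of data."""
    for stage in stages:
        df = stage(df)
    return df

def drop_duplicates(df: pd.DataFrame) -> pd.DataFrame:
    # Duplicate records are a common artifact of re-run ingestion jobs.
    return df.drop_duplicates()

def normalize_dates(df: pd.DataFrame) -> pd.DataFrame:
    # Coerce text dates into one datetime format; invalid values become
    # NaT and can be caught by downstream quality checks.
    df = df.copy()
    df["txn_date"] = pd.to_datetime(df["txn_date"], errors="coerce")
    return df

# The same pipeline definition is reused for every incoming batch:
# clean = run_pipeline(raw_batch, [drop_duplicates, normalize_dates])
```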

The final stage of the roadmap covers the implementation of a data governance framework, optimization of data operations, and enabling AI and machine learning analytics.

Roadmap for Lower Operational Costs, Faster Insights
TechX's three-stage data roadmap has served as a beacon for many of its customers, including HDBank, a commercial bank in Vietnam serving retail and corporate customers.

The bank lacked a self-service reporting system that could enable its employees to tap data analytics and drive their business objectives. The cost of running online analytical processing (OLAP) workloads on its existing Oracle databases was also escalating and increasingly tough to control. HDBank worked with TechX to put together a roadmap to migrate its OLAP workloads and modernize its data analytics architecture.

Using AWS solutions, HDBank built a lakehouse to serve as a central data repository. It also modernized 10 reports using AWS-native data processing services.
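For illustration, reports against a lakehouse on AWS are typically run as SQL over data in Amazon S3, for example through Amazon Athena. The snippet below is a generic boto3 sketch under that assumption; the region, database, query and bucket names are hypothetical, not details of HDBank's implementation.

```python
import boto3

# Hypothetical report query against a lakehouse table via Amazon Athena.
# Region, database, table and output bucket are illustrative assumptions.
athena = boto3.client("athena", region_name="ap-southeast-1")

response = athena.start_query_execution(
    QueryString=(
        "SELECT branch, SUM(amount) AS total "
        "FROM transactions GROUP BY branch"
    ),
    QueryExecutionContext={"Database": "lakehouse_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)

# Athena runs asynchronously; the ID is used to poll for and fetch results.
print(response["QueryExecutionId"])
```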

The implementation and move away from traditional Oracle databases reduced HDBank's operational costs by 42%, shifting part of the bank's Capex to an Opex model, with AWS data analytics services as the foundation. It also slashed the time and effort required to build analysts' reports by 40%, resulting in faster, automated reporting pipelines to inform the bank's business decisions.

With the proper data approach, organizations such as HDBank can not only establish a 360-degree view of their operations but also embrace a business culture where critical decisions are driven by real-time analytical reports and forecasts.

1 "How to Improve Your Data Quality," Gartner, July 14, 2021
2 "The Data Dilemma and Its Impact on AI in Healthcare and Life Sciences," IDC, June 23, 2021
3 Ibid., footnote 1
