It’s no surprise that becoming a data-driven company is at the top of the corporate agenda. A recent IDC whitepaper found that data-savvy companies were three times more likely to report improved revenue, almost three times as likely to report reduced time to market for new products and services, and more than twice as likely to report enhanced customer satisfaction, profits, and operational efficiency.
But according to a January survey of data and information executives from NewVantage Partners, merely a quarter of companies describe themselves as data-driven, and only 21% say they have a data culture in their organizations.
Several key factors help explain this disconnect. Cultural issues were cited by 80% of respondents as the biggest factor keeping them from getting value from their data investments, while only 20% pointed to technology limitations. And based on the experience of experts who have overcome these roadblocks firsthand, other obstacles remain as well.
Recognizing bad data
Even the best of analytics strategies can be derailed if the underlying data is bad. But solving data quality problems requires a deep understanding of what the data means and how it’s collected. Resolving duplicate data is one issue, but when the data is just wrong, that’s much harder to fix, says Uyi Stewart, chief data and technology officer at Data.org, a nonprofit backed by the Mastercard Center for Inclusive Growth and the Rockefeller Foundation.
“The challenge of veracity is much more difficult and takes more time,” he says. “This is where you require domain expertise to allow you to separate fact from fiction.”
Simple technical skills are not enough. That’s what Lenno Maris found out when he joined FrieslandCampina, a multinational dairy cooperative, in 2017, when the company was embarking on a strategic plan to become a data-driven company.
It was a big challenge. The company has over 21,000 employees in 31 countries, and has customers in over 100 countries. It quickly became clear that data quality was going to be a big hurdle.
For example, inventory was reported based on the number of pallets, but orders were based on unit numbers, says Maris, the company’s senior global director for enterprise data and authorizations. This meant that people had to do manual conversions to ensure the right quantities were delivered at the right price.
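The manual conversion Maris describes can be sketched in a few lines. This is a hypothetical illustration only: the SKU names and units-per-pallet figures below are invented assumptions, not FrieslandCampina data.

```python
# Hypothetical sketch: reconciling pallet-based inventory with unit-based orders.
# SKUs and units-per-pallet values are illustrative assumptions.
UNITS_PER_PALLET = {
    "whole-milk-1l": 720,   # assumed: 720 cartons per pallet
    "butter-250g": 2400,    # assumed: 2,400 packs per pallet
}

def pallets_to_units(sku: str, pallets: int) -> int:
    """Convert a pallet count from inventory into order units."""
    return pallets * UNITS_PER_PALLET[sku]

def units_to_pallets(sku: str, units: int) -> tuple[int, int]:
    """Convert ordered units into full pallets plus leftover loose units."""
    return divmod(units, UNITS_PER_PALLET[sku])

# An order for 1,500 cartons of milk: 2 full pallets plus 60 loose cartons.
full, loose = units_to_pallets("whole-milk-1l", 1500)
```

With a shared conversion table like this maintained as master data, the error-prone per-order arithmetic disappears.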
Or take commodity codes. Each plant put in the commodity code that best fit the product, with different plants using different codes that were then used to reclaim import and export taxes. “But tax reporting is performed at the corporate level, so consistency is needed,” says Maris.
To fix the data issues, FrieslandCampina had to evolve its data organization. At the start of the project, the team focused mostly on the technical details of data entry. But that changed quickly.
“We’ve been able to retrain our team to become process experts, data quality experts, and domain experts,” Maris says. “That allows us to transition to proactive data support and become advisors to our business peers.”
Similarly, the technology platform chosen to help the company improve its data quality, Syniti, had to adapt.