Restated in IT terms, this means the quality of data held in IT systems will deteriorate unless steps are taken to maintain its accuracy and consistency.
Given that a majority of organisations rank the data they hold as a primary asset, what should you be doing to look after that data quality more effectively?
Indeed, it is fair to say that most organisations tend to act on data quality only when they implement new systems, undergo a major upgrade, need to integrate with another platform, or when something goes disastrously wrong.
Fortunately, tools to improve data cleansing have advanced dramatically in recent years.
In addition, there are service providers, including cloud suppliers, which specialise in this area.
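To make the idea concrete, here is a minimal sketch of two common cleansing steps, normalisation and deduplication, applied to a hypothetical list of customer records (the field names and data are illustrative, not drawn from any particular tool):

```python
# Minimal data-cleansing sketch: normalise and deduplicate
# hypothetical customer records (fields and values are illustrative).

def normalise(record):
    """Trim whitespace, lower-case emails, title-case names."""
    return {
        "name": record["name"].strip().title(),
        "email": record["email"].strip().lower(),
    }

def deduplicate(records):
    """Keep the first record seen for each normalised email address."""
    seen = set()
    cleaned = []
    for record in map(normalise, records):
        if record["email"] not in seen:
            seen.add(record["email"])
            cleaned.append(record)
    return cleaned

raw = [
    {"name": "  ada lovelace ", "email": "Ada@Example.com "},
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "alan turing", "email": "alan@example.com"},
]

print(deduplicate(raw))  # two records survive; the duplicate is dropped
```

Real cleansing tools go much further, with fuzzy matching, address verification and enrichment from external sources, but the principle is the same: define what "clean" means for each field, then enforce it systematically rather than by hand.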
Given that data cleansing and integrity checking involve effort and, therefore, cost, managers should bear in mind that not all data is of equal business value.
Which data belongs in which category (and why) should be a business decision, not an IT one. Impure data is like impure blood – not good for the system.