How to fill the gaps in your key data assets

Written by Columbus | May 21, 2019

Without a full data set to support the calculation of any operational KPI you want to use to tune your manufacturing processes, the KPI's integrity will be called into question. Before embarking on a project to build the killer dashboard that will focus your team on improving the metrics that really matter, you need to make sure that the data informing it is complete and accurate.

Vital statistics

Performing a gap analysis on the significant sources of raw data is the first step on the road to a successfully utilised measure. Industry bodies such as APQC publish definitions of the metrics you are likely to want to calculate; the difficult part is mapping the required inputs to data tables within the source system. Increasingly, cloud-based applications shield their underlying databases from power users, and even where you can get access, reviewing the array of columns can be a daunting task.
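
To see what that mapping exercise looks like in practice, consider a standard metric like OEE (Overall Equipment Effectiveness), which breaks down into a handful of inputs that must each be traced to a table and column in the source system. The minimal Python sketch below illustrates the idea; every table and column name is a hypothetical placeholder, not a reference to any real schema:

```python
# Gap analysis sketch: map each metric input to a source table and column.
# The input breakdown follows the standard OEE definition; all table and
# column names below are hypothetical placeholders.
OEE_INPUT_MAP = {
    "planned_production_time": ("ProdCalendar", "planned_minutes"),
    "run_time":                ("MachineLog", "run_minutes"),
    "total_count":             ("ProdOrders", "qty_produced"),
    "good_count":              ("ProdOrders", "qty_good"),
    "ideal_cycle_time":        None,  # gap: no source column identified yet
}

# The unmapped entries are the gaps the analysis must close.
gaps = [name for name, source in OEE_INPUT_MAP.items() if source is None]
print("Inputs with no mapped source:", gaps)
```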

Leveraging the skills and knowledge of your analytics partner to identify the key data in question will accelerate your project, as the most important tables and columns will be prioritised based on previous experience.

Ruling out errors

Once you know the critical inputs to your metrics, the next step is to define business rules for your source production data. Examples of these rules, sketched in code after this list, might be:

  • Identifying empty or null values
  • Values that fall outside expected ranges
  • Values that have become orphaned from related records over time
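
As a minimal sketch, all three kinds of rule can be expressed in a few lines of Python with pandas. The sample data, column names and the quantity range used here are assumptions for illustration; your own rules would come from the source system and the metric definitions:

```python
import pandas as pd

# Hypothetical production extract; real table and column names will differ.
orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003, 1004],
    "item_id": ["A-100", None, "B-200", "C-300"],
    "quantity": [50, 25, -5, 40],
})
items = pd.DataFrame({"item_id": ["A-100", "B-200"]})  # item master

# Rule 1: identify empty or null values
missing_item = orders[orders["item_id"].isna()]

# Rule 2: values outside the expected range (assume quantity must be positive)
out_of_range = orders[orders["quantity"] <= 0]

# Rule 3: values orphaned from related records (item absent from the item master)
orphaned = orders[orders["item_id"].notna() & ~orders["item_id"].isin(items["item_id"])]

print(missing_item, out_of_range, orphaned, sep="\n\n")
```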

Bulk data management tools such as Azure Data Factory in the cloud or SQL Server Data Quality Services on-premises let you build automation into the data pipeline that will feed your business metrics. Substitution, completion and enrichment of data can be carried out en masse with the help of your analytics partner, and the same rules can then be applied to future regular data extracts.
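
To make those three terms concrete, here is a hedged pandas sketch of what substitution, completion and enrichment rules do once automated; the default site code and the reference table are invented for illustration:

```python
import pandas as pd

orders = pd.DataFrame({
    "order_id": [1001, 1002, 1003],
    "site": ["UK01", None, "uk01"],   # inconsistent and missing site codes
    "quantity": [50.0, None, 40.0],
})

# Substitution: normalise inconsistent values to a canonical form
orders["site"] = orders["site"].str.upper()

# Completion: fill gaps with agreed defaults (hypothetical rules)
orders["site"] = orders["site"].fillna("UK01")
orders["quantity"] = orders["quantity"].fillna(0)

# Enrichment: join reference data to add attributes the source lacks
sites = pd.DataFrame({"site": ["UK01"], "region": ["EMEA"]})
orders = orders.merge(sites, on="site", how="left")

print(orders)
```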

Take control

In our experience, most manufacturing organisations don't apply any form of change management to key data assets such as:

  • Item masters
  • Bills of materials
  • Production routes

Introducing a business data controller, responsible for sense-checking the completeness and integrity of every request for change to the information that drives shop-floor processing, can be a very quick win. The benefits of putting data quality rules in place and applying them upfront, rather than reactively correcting issues downstream, soon become obvious.
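
As an illustration of an upfront rule, the sketch below shows the kind of automated completeness check a business data controller might run before approving a change to an item master record; the required fields here are assumptions, not a prescribed standard:

```python
# Hypothetical completeness check applied before a change request is approved.
REQUIRED_FIELDS = {"item_id", "description", "unit_of_measure", "std_cost"}

def validate_change_request(request: dict) -> list:
    """Return a list of problems; an empty list means the change can proceed."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - request.keys())]
    cost = request.get("std_cost")
    if cost is not None and cost <= 0:
        problems.append("std_cost must be positive")
    return problems

print(validate_change_request({"item_id": "A-100", "std_cost": -2.5}))
# ['missing field: description', 'missing field: unit_of_measure', 'std_cost must be positive']
```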

Adding a data quality dashboard to your toolset will help you monitor the overall status of data feeds coming from line-of-business applications and other sources. Modern visualisation tools like Power BI can also offer automated quick insights such as:

  • Outliers
  • Correlations
  • Changes in a data series over time
  • Trends and seasonality

But to take advantage of these, you need your dataset in the right place, and to make the process optimal you should remove duplicated or irrelevant data from the set, as sketched below.
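
A minimal pandas sketch of that clean-up step, with hypothetical column names, might look like this:

```python
import pandas as pd

production = pd.DataFrame({
    "order_id": [1001, 1001, 1002],
    "quantity": [50, 50, 25],
    "legacy_code": ["X", "X", "Y"],   # a column no metric depends on
})

# Remove exact duplicate rows, then drop columns irrelevant to the metrics
clean = production.drop_duplicates().drop(columns=["legacy_code"])
print(clean)
```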

Ready to begin your data audit?

The data that flows through your production processes is as much a part of our business DNA as it is yours. If your business applications have been in place for several years, it's likely that the gap between the reality of your organisation and the data that models it in an application has begun to widen. It can also be the case that the people in your team who originally owned the data have moved on and key knowledge has been lost.

At Columbus, our industry skills, expertise and experience mean that we can help you perform an audit that gets to the root of your data quality problems quickly, without the need for large investments in new technology.

If you’re ready to take the first step towards a production dashboard you truly trust, contact us today.