The quality of data is a key factor in the success of your PIM implementation. In this blog post, we present a 5-step framework for successfully conducting a data quality audit.
Clean product data is the lifeblood of your PIM, and how well you sanitise your data will determine the long-term success of your PIM deployment.
Over the last few weeks, we’ve been discussing PIM. If you’ve been following this series, you’ll have realised that data cleaning is a crucial part of any PIM implementation. In previous posts, we discussed how to kickstart your data cleaning efforts: how to consolidate your data, and how to assess it using a data quality framework.
Now comes the biggie – conducting a detailed quality audit of your product data before its eventual migration to a PIM solution.
This process requires you to critically examine all your data and fix it in accordance with your quality metrics. Here’s a data quality audit checklist for you to refer to.
1. Understand the importance of different data attributes
What is important data and what isn’t? The answer lies in the eyes of the beholder. Customers consider some product attributes crucial to their purchase decision. Your legal team might mandate other attributes to comply with government requirements. Meanwhile, the product marketing team might want to spotlight a product’s USPs and the need-gap it addresses. Understand what data is important by talking to different teams such as customer service, product marketing, and legal.
When you perform your data audit, pay particular attention to the quality of these vital attributes.
2. Arrive at data quality rules
Determining your data quality benchmarks requires two inputs — the data quality framework discussed in the earlier post, and the insights you gained from various stakeholders about data attributes. Consider both to decide what is acceptable data quality for different product families and data sets. For example, you could decide that a data set gets a clean chit only if it is 85% complete and 100% accurate.
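To make this concrete, here is a minimal Python sketch of such a rule check. The field names, thresholds, and the idea of passing in a pre-verified accurate-record count are all hypothetical placeholders; your own rules would come from your quality framework and stakeholder input.

```python
# Hypothetical required fields for a product record.
REQUIRED_FIELDS = ["sku", "name", "unit", "weight"]

def completeness(records):
    """Share of required fields that are actually filled in, across all records."""
    total = len(records) * len(REQUIRED_FIELDS)
    filled = sum(
        1 for r in records for f in REQUIRED_FIELDS if r.get(f) not in (None, "")
    )
    return filled / total if total else 1.0

def passes_quality_rules(records, accurate_count):
    """Clean chit only if the data set is at least 85% complete and 100% accurate.

    `accurate_count` is assumed to come from a separate accuracy check
    (e.g. values verified against supplier specs).
    """
    accuracy = accurate_count / len(records) if records else 1.0
    return completeness(records) >= 0.85 and accuracy == 1.0
```

The thresholds (85%, 100%) mirror the example above; in practice you would tune them per product family, since a spec-heavy category may warrant stricter rules than a low-risk one.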
3. Identify incorrect/missing parameters across categories and products
It’s time to put your data under the microscope! Measure your data sets against your quality rules to surface any problem areas. For instance, the unit of measurement for different products within a family might be inconsistent. Various abbreviations of the same word (e.g., ct vs. count vs. CT) might be in use. Some data could be duplicated, while other data may be missing.
Based on your predetermined quality rules, separate your data into two buckets — ones that are good to go, and ones that need fixing.
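A sketch of this triage step might look like the following. The checks shown (non-standard units, missing names, duplicate SKUs) and the small alias map are illustrative assumptions, not an exhaustive rule set.

```python
# Hypothetical map of accepted unit spellings to a canonical form.
UNIT_ALIASES = {"ct": "count", "CT": "count", "count": "count"}

def find_issues(record):
    """Return a list of rule violations for a single record."""
    issues = []
    if record.get("unit") not in UNIT_ALIASES:
        issues.append("non-standard unit")
    if not record.get("name"):
        issues.append("missing name")
    return issues

def triage(records):
    """Split records into (good, needs_fixing) buckets, flagging duplicate SKUs."""
    good, needs_fixing = [], []
    seen_skus = set()
    for r in records:
        issues = find_issues(r)
        if r.get("sku") in seen_skus:
            issues.append("duplicate sku")
        seen_skus.add(r.get("sku"))
        (needs_fixing if issues else good).append((r, issues))
    return good, needs_fixing
```

Keeping the detected issues alongside each flagged record pays off in the next step, when you trace each error back to its source.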
4. Identify the source of missing information
As you go through your inventory of ‘bad’ data, probe into the source of error for each instance. Is the supplier not providing information? Is the product team not uploading it correctly? Or is the supply chain team not updating information based on sales?
As you identify the source of errors, also document the weak links in your data chain, so that sustained efforts can be made to strengthen them.
5. Plug the gaps
Plug the gaps in your data, with the help of various data owners. Use data scrubbing techniques to eliminate errors — while a well-designed PIM can reject certain kinds of ‘bad’ data, it’s a better practice to input good quality data to begin with.
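As one illustration of what scrubbing can mean in code, the sketch below normalises unit abbreviations, trims stray whitespace, and drops exact duplicates. The field names and alias map are hypothetical; a real pass would be driven by the rules you set in step 2.

```python
def scrub(records):
    """Normalise units, trim whitespace, and drop exact duplicate records."""
    # Hypothetical canonicalisation map for unit abbreviations.
    unit_aliases = {"ct": "count", "CT": "count"}
    cleaned, seen = [], set()
    for r in records:
        # Trim whitespace on every string value.
        r = {k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
        # Map known abbreviations to the canonical spelling.
        r["unit"] = unit_aliases.get(r.get("unit"), r.get("unit"))
        # Drop records that are exact duplicates after normalisation.
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            cleaned.append(r)
    return cleaned
```

Note that deduplication happens after normalisation, so ‘ct’ and ‘CT ’ records for the same SKU collapse into one, rather than slipping past as superficially different rows.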
At the end of this comprehensive exercise, you will be left with squeaky clean data that is PIM-ready!
And with that, you’ve completed what is arguably the most arduous process in your PIM journey — getting your data in order!
While this audit helps you take stock of your current data and correct it, it cannot guarantee the good health of all future data. For the long-term, you have to fix the processes that caused your data to go awry, and oversee all new data entering your PIM. Nothing helps this cause as much as a data governance council that plays watchdog to your organization’s data and PIM setup. More on that in the next post!
Meanwhile, if you need any help strategizing your PIM initiative, do get in touch and we’d be happy to help.