Technology is evolving at a rapid pace, and some powerful business intelligence tools have emerged in response. But with great power comes great responsibility. While BI tools can give you a beautiful picture of what is going on in your business, that picture may be useless, or worse, detrimental, if the underlying data is inaccurate because it was improperly prepared for analysis. One of the most critical responsibilities in using business intelligence tools is knowing what data will be used to generate insights.
Data must be prepared properly to be of value. If you jump right into investing in a new reporting tool, the rollout will most likely be a phased approach that includes digging deep into the trenches of your database(s) and the data the tool will use. Unfortunately, this can be a myopic approach that accommodates the needs of the reporting tool instead of giving you a bigger picture of what your data can offer, as well as a reality check on its limitations.
How do you know your data is ready for a new reporting tool? Keep in mind that the goal is not about having “perfect” data since that is unrealistic; rather, the goal is to understand the strengths and weaknesses of the data being stored, determine how it can be improved, and optimize how the data can best be used to support your business needs.
Ask yourself some key questions: “How clean is your data?”
What is the likelihood that the data entering your database is corrupt?
- Assess your data by examining the values currently populated in all of your key fields.
- If your database is growing and more data is entering it, what is the likelihood that new incoming data is corrupt? Do you manage incoming data effectively by rejecting or flagging potentially erroneous records before allowing them into the database?
- Ideally, you should have a data dictionary that includes a range of acceptable values per data field. The data dictionary should be used to enforce how data is captured across touchpoints (such as front-end tools used by customer representatives to add data, website forms, BRCs, or other sources that capture incoming data).
- You should also have a file transfer agreement or interface agreement with the vendor(s) or channel partner(s) that send incoming data. The interface agreement should include a data dictionary with expected values per field.
- Prior to allowing data to enter the main database, incoming data should be examined, and automated flags should identify erroneous data before it enters the database. Such errors should be prevented in the future by understanding their cause. Was it a manual error, such as a typo in a field value? If so, minimize the chance of manual error by using data capture options that constrain input to a list of values, such as a drop-down list, instead of free-form text.
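The steps above can be sketched in code. This is a minimal illustration, not a production pipeline: the field names and acceptable values are hypothetical, standing in for whatever your own data dictionary defines.

```python
# Hypothetical data dictionary: acceptable values per field.
# In practice this would come from your documented data dictionary
# or interface agreement with vendors and channel partners.
DATA_DICTIONARY = {
    "state": {"NY", "NJ", "CT"},
    "channel": {"web", "phone", "mail"},
}

def validate_record(record):
    """Return a list of (field, value) pairs that violate the dictionary."""
    errors = []
    for field, allowed in DATA_DICTIONARY.items():
        value = record.get(field)
        if value not in allowed:
            errors.append((field, value))
    return errors

# Incoming data is checked BEFORE it reaches the main database:
# clean records pass through, erroneous ones are flagged for review.
incoming = [
    {"state": "NY", "channel": "web"},
    {"state": "XX", "channel": "fax"},  # erroneous record
]

clean = [r for r in incoming if not validate_record(r)]
flagged = [r for r in incoming if validate_record(r)]
```

The same `DATA_DICTIONARY` structure can also drive drop-down lists in front-end capture tools, so the values allowed at entry and the values checked at load time never drift apart.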
“How complete is your data?” Does the data that you have available meet your business needs and can it provide actionable insights? If not, what data do you need to help complete the picture?
- Sometimes the data that is already available can be leveraged to provide more insight than originally thought. Combining certain fields into meaningful new operational variables may provide broader insights.
- In order to understand the completeness of your data, identify the key variables being captured and whether these variables can be used to answer your top business questions.
Are you capturing the type of data that you say you’re capturing, and is this data reliable over time?
- If your Date of Birth field shows 1/5/1865 for Person X while the Age field shows 49 years old for Person X, then your data is misinforming you. Something is wrong: either the Date of Birth should be corrected to 1/5/1965 or the Age should read 149 years old. This example is an obvious one where the correction is easy to see. However, when dealing with large datasets, automated methods should be in place to check accuracy and consistency across your key data variables.
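An automated version of that cross-field check might look like the sketch below. The field names and the reference date are assumptions for illustration; the logic simply recomputes age from date of birth and flags records where the stored age disagrees beyond a tolerance.

```python
from datetime import date

def age_mismatches(records, as_of, tolerance=1):
    """Flag records whose reported age disagrees with their date of birth."""
    flagged = []
    for r in records:
        dob = r["date_of_birth"]
        # Subtract one year if the birthday hasn't occurred yet as of `as_of`.
        computed = as_of.year - dob.year - (
            (as_of.month, as_of.day) < (dob.month, dob.day)
        )
        if abs(computed - r["age"]) > tolerance:
            flagged.append(r)
    return flagged

people = [
    {"date_of_birth": date(1865, 1, 5), "age": 49},  # Person X: inconsistent
    {"date_of_birth": date(1965, 1, 5), "age": 49},  # consistent
]
mismatched = age_mismatches(people, as_of=date(2014, 6, 1))
```

Run regularly against the full dataset, a check like this surfaces inconsistencies that would never be caught by eyeballing individual records.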
The power you get from a business intelligence tool comes from the quality of your data. Understanding the strengths and weaknesses of your data, and how it should be used, will yield greater insights and, ultimately, superior decision making.