10 Data Sourcing Best Practices for Reporting

Reporting often relies on “data pipelines” to collect, combine and transform source data. With 62% of people relying on others to supply their data, here are 10 data sourcing best practices.
 
The 10 practices, explained in more detail below, include:
  1. Letting the desired business outcome dictate what data you need.
  2. Profiling your data.
  3. Getting as close to the source as possible.
  4. Consolidating sources and keeping it simple.
  5. Setting and managing data quality expectations.
  6. Catching issues early in the data journey.
  7. Measuring and acting on data quality issues.
  8. Embracing change.
  9. Implementing change management controls.
  10. Allowing for data collaboration – in a controlled way.

1 – Let the desired business outcome dictate what data is required

If one term sums up this topic, it is ‘analysis paralysis’. Companies often over-analyse their data sourcing issues, so much so that they forget to act.
 
Another common issue is when the data determines what reporting you produce. This dynamic should be the other way round. The desired business outcome for reporting must be the starting point in determining:
  • What data you source
  • How and where you focus your data sourcing and quality efforts
  • Rules and controls around how you manage and maintain the supply of data.
Companies must focus on data sourcing activities that have the most impact. To do this, you need to have a clear and concise understanding of the desired business outcomes.
 
Our data sourcing recommendations:
  • Always start data sourcing activities by asking what you want to achieve.
  • The first question should NEVER be “What data can we get?”
  • Start with “What do we need to achieve through our reporting?”
  • Follow up with “What data do we need to support this outcome?”
  • The final question should be “How can we source it and ensure it is of high quality?”
  • Prioritise your data sourcing activities based on the business outcomes that the data supports.

2 – Profile Your Data

Do the providers of your source data know who you are? Do they understand (or even care) what you intend to do with their data? It’s OK (and normal) if the answer to these questions is NO.
 
Finding the provider of the data is step one. Once found, you will need to ensure that the data you are sourcing has the profile that meets your needs. That is, the structure, granularity, age, frequency and availability of the data.
 
One company we worked with had requested a daily extract from their general ledger. After working their way to the front of their IT queue, the day finally arrived – a GL extract was now available each morning. Unfortunately, the extract was not granular enough for the transactional level insights required.
 
When it comes to data sourcing, you must communicate with the providers of the data. Take the time to ensure the data provider can answer critical questions like:
  • How often do you need the data?
  • What format does the data need to be in?
  • How granular must the data be?
Doing this early on can also help mitigate risks around data quality.
Our data sourcing recommendations:
  • Take the time to profile your data. This includes the granularity, frequency, structure, format and method of delivery.
  • Meet with the providers of the data. Ensure they can provide data AND that they understand what you intend to do with their data.
  • Identify potential issues and inconsistencies as early as possible. Put plans in place to either enhance the source of data or find an alternative source. Otherwise, recalibrate your expectations for reporting.
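The profiling step described above can be kept lightweight. Below is a minimal sketch in Python that checks an extract's structure and row count against reporting needs; the CSV layout, field names (including `transaction_id`) and the GL scenario are illustrative assumptions echoing the example above, not a prescribed format.

```python
import csv
import io

def profile_extract(raw_csv, expected_columns):
    """Profile a delimited extract: field structure and row count."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    columns = reader.fieldnames or []
    rows = list(reader)
    return {
        # Structure: do the supplied fields match what reporting needs?
        "missing_columns": sorted(set(expected_columns) - set(columns)),
        "unexpected_columns": sorted(set(columns) - set(expected_columns)),
        # Granularity clue: two rows per month suggests summarised, not transactional, data.
        "row_count": len(rows),
    }

# Hypothetical GL extract: summarised by account, so no transaction-level field.
sample = "account,period,balance\n4000,2024-01,1250.00\n4100,2024-01,310.50\n"
report = profile_extract(sample, ["account", "period", "transaction_id", "balance"])
print(report["missing_columns"])  # the transactional field reporting needed is absent
```

A check like this, run on the first delivery of an extract, would have flagged the granularity gap in the GL example before the reporting was built.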

3 – Get as close to the original source as possible

It is common to source data from another report or spreadsheet – which is being prepared manually. For example, a sales performance report may rely on sales data prepared manually by other members of the finance team.
 
Each instance of manual intervention introduces risks to data quality. Inconsistent or erroneous upstream processing can have unexpected consequences for your reporting.
 
While you may sometimes not have a choice, it is always worth taking the time to survey what data is actually required for your reporting. This may highlight opportunities to remove manually prepared data sources or switch to system extracts.
 
Our data sourcing recommendations:
  • Be cautious of using a manually prepared report or spreadsheet as a source of data.
  • Look for opportunities to use the inputs into the manually prepared report or spreadsheet rather than its outputs.
  • Collect the data you need from the earliest possible point of the data supply chain. This will limit exposure to uncontrolled manual intervention.
  • Establish data quality expectations with suppliers of manually prepared data. Communicate with data providers to ensure early warning of upstream changes that may impact your reporting.

4 – Keep it simple and consolidate sources. Avoid the temptation to hoard data.

The principle of hoarding extends to data. Companies often duplicate data sources. They also have a tendency to collect and store all data “just in case” they ever need it.
 
For example, a finance staff member enquires about data relating to sales for individual channels. They are told that data is available across several extracts. In addition to the data they require, these extracts also contain information relating to inventory and taxes. Instead of focusing on what is really needed, the staff member attempts to accommodate all of the data that is made available.
 
As the volume and variation of data increases, so does the complexity, effort required and the general level of headaches.
 
Our data sourcing recommendations:
  • Focus on what you need. Don’t fall victim to the allure of hoarding data.

5 – Set and manage expectations around data quality.

Some data, as you’d expect, is more important than other data. The tolerance for error will differ from report to report. Your data sourcing must be led by where the focus of your audience lies.
 
For example, management may focus on a particular ratio, movement or comparison in your report. Ensuring the supply of high-quality data to these specific areas of interest is paramount.
 
It is critical that you understand the decisions that are being made from your reporting. This will allow you to determine the potential business impact of data quality issues. Ultimately, this will allow you to prioritise your data sourcing efforts.
 
Our Recommendations:
  • Determine and rank the business impact of data quality issues for each source of data.
  • Distribute your time, energy and focus on defining and measuring data quality in proportion to the business impact.

6 – Catch data quality issues as early as possible.

The cost of data issues increases the further the data has moved along the supply chain. A common phenomenon observed is the 1-10-100 rule. That is, a data issue will cost:
  • $1 to fix at the beginning of its journey.
  • $10 to fix when identified in the middle of its journey.
  • $100 to fix when identified only after the data has been output for reporting.
The 1-10-100 rule is highly relevant for data sourcing. This is because data is often transformed several times during its journey. The effort to unravel these transformations late in the journey can be substantial.
 
Start by defining the business impact of potential data issues. Next, identify metrics to measure the quality of data along its journey. Finally, implement controls as early as possible into the data journey.
 
Our data sourcing recommendations:
  • Implement quality controls as early in the data supply chain as possible.
  • Always check/vet the source data. Basic checks for format and structure can dramatically reduce the cost of errors.
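The basic checks in the last recommendation can run at the very start of the data journey, before any transformation. A minimal sketch, assuming a hypothetical three-field sales row; the field list and validation rules are illustrative, not a standard:

```python
def vet_source_row(row, line_no):
    """Basic format/structure checks run at the start of the data journey."""
    errors = []
    if len(row) != 3:
        errors.append(f"line {line_no}: expected 3 fields, got {len(row)}")
        return errors
    date, channel, amount = row
    # Dates must be ISO formatted so downstream steps never have to guess.
    if len(date) != 10 or date[4] != "-" or date[7] != "-":
        errors.append(f"line {line_no}: bad date format {date!r}")
    # Amounts must be numeric before they reach any aggregation step.
    try:
        float(amount)
    except ValueError:
        errors.append(f"line {line_no}: non-numeric amount {amount!r}")
    return errors

errors = vet_source_row(["01/03/2024", "online", "n/a"], line_no=2)
print(errors)  # two problems caught before the data travels any further
```

In the 1-10-100 terms above, this is the $1 fix: the error is caught on arrival, not unravelled from a finished report.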

7 – Measure and act on data quality issues.

After you identify potential data quality issues, you must start measuring for quality. This will help ensure you are alerted to errors and can act on them going forward.
 
Create metrics to increase the visibility and oversight of data quality issues. You will also need a way to answer simple questions when collecting data, such as:
  1. Is the data in the correct format?
  2. Is the ordering and structure of fields consistent with expectations?
  3. Is the number of records within acceptable tolerances?
  4. Are there any duplicate records?
The purpose of these measures is to catch errors and allow for corrections to your data. Recurring data quality issues will need an automated solution for cleansing the data.
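The four questions above translate directly into repeatable pass/fail metrics. A sketch, assuming the extract arrives as a list of records with a business key; the field names and tolerances are illustrative assumptions:

```python
def measure_quality(rows, expected_fields, min_rows, max_rows, key_field):
    """Answer the four data quality questions as pass/fail metrics."""
    metrics = {}
    # 1 & 2: format and field structure consistent with expectations?
    metrics["structure_ok"] = all(list(r.keys()) == expected_fields for r in rows)
    # 3: record count within agreed tolerances?
    metrics["count_ok"] = min_rows <= len(rows) <= max_rows
    # 4: any duplicate records on the business key?
    keys = [r[key_field] for r in rows]
    metrics["no_duplicates"] = len(keys) == len(set(keys))
    return metrics

sales = [
    {"order_id": "A1", "amount": 120.0},
    {"order_id": "A2", "amount": 75.5},
    {"order_id": "A2", "amount": 75.5},  # duplicate key slipped in upstream
]
print(measure_quality(sales, ["order_id", "amount"], 1, 1000, "order_id"))
# → {'structure_ok': True, 'count_ok': True, 'no_duplicates': False}
```

A failing metric is the trigger for the alerts and, where the issue recurs, the automated cleansing mentioned above.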
 
Our data sourcing recommendations:
  • Always define measures or metrics for the data quality.
  • Implement processes to check source data against the metrics you have defined.
  • Define tolerances and expectations and trigger alerts to the appropriate people when necessary.
  • Address consistent data issues through automated data cleansing.

8 – Expect and embrace change.

The landscape you are operating in will most likely change. Examples of changes that impact data sourcing include:
  • Mergers and acquisitions
  • New systems
  • Change in management
The direct impact of such changes is often changes to source data and evolving needs for reporting (outputs). For example, a management restructure may result in a new business hierarchy. As a result, KPI reporting will need to adapt to the new hierarchy.
 
While nobody possesses a crystal ball, we should all expect and prepare for change. Reporting is undermined when it cannot react to changes. Likewise, rigid and brittle data collection and processing can be the Achilles heel of a finance department.
 
Our Recommendations:
  • Write a shortlist of expected or possible changes to the reporting requirements and supply of data. Score each item for the likelihood of it occurring and the impact that it would have (Low – High). Ensure that you have a plan in place to mitigate the impact of changes that rank Med – High.
  • Facilitate communication to ensure that you are aware of changes ahead of time.
  • Manage expectations of sponsors around the ongoing maintenance and cost of adapting to changes.
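The shortlist-and-score step in the first recommendation can be as lightweight as a ranked table. A sketch with hypothetical change items and scores; the 1–3 scale and the mitigation threshold are illustrative assumptions, not a prescribed methodology:

```python
# Score each anticipated change 1 (Low) to 3 (High) for likelihood and impact.
changes = [
    {"change": "CRM replacement",        "likelihood": 2, "impact": 3},
    {"change": "New business hierarchy", "likelihood": 3, "impact": 2},
    {"change": "Extract schedule moves", "likelihood": 1, "impact": 1},
]

def needs_mitigation_plan(item, threshold=4):
    """Flag items whose combined likelihood-times-impact score ranks Med-High."""
    return item["likelihood"] * item["impact"] >= threshold

flagged = [c["change"] for c in changes if needs_mitigation_plan(c)]
print(flagged)  # → ['CRM replacement', 'New business hierarchy']
```

The flagged items are the ones that warrant a written mitigation plan; the rest can simply be reviewed periodically.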

9 – Establish change management controls and encourage knowledge transfer to simplify changes.

Having accepted and embraced the likelihood of change, what happens when it is time to act? In particular, how will you ensure changes don’t result in data quality issues? The answer lies in good planning and change management practices.
 
For example, imagine your company changes its CRM, a key data source for your reporting:
  • Who will be responsible for transitioning the supply of source data for your report?
  • How will you ensure the same data profile is available from the new CRM system?
It is imperative that you embed controls for managing change. You must also ensure stakeholders have easy access to the knowledge they will need to implement changes. Staff must first be able to understand the data supply chains used for reporting. Next, those with permission should be able to make changes and have them reviewed and approved by stakeholders. These changes must also be documented for future reference.
 
Our data sourcing recommendations:
  • Ensure that knowledge of the data supply chains is readily available through documentation.
  • Establish procedures and rules for change. Specify who is responsible to request, make and approve changes.
  • Ensure approved changes are recorded and logged. Document changes for accurate knowledge transfer in the future.

10 – Create a controlled and auditable environment for business users to adjust or manipulate data for reporting.

Collaborative data analysis is a double-edged sword. Human expertise and experience, when introduced into reporting, ensure the relevance of the reports. That said, the more people involved, the greater the risk of data quality issues.
 
Recurring data quality issues resulting from human error will reduce the impact of your reporting. The alternative – a “black box” reporting environment devoid of collaboration – is simply too rigid. You must take a balanced approach to data sourcing.
 
The focus must be on facilitating human interaction in a controlled and auditable manner. Ensure that staff can access data at controlled points in the supply chain. When they do, provide discrete mechanisms for adjustments and manipulation that are:
  • Logged with audit trails; and
  • Placed under version control.
Doing so will allow you to achieve the best of both worlds in your data analysis and reporting.
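The controlled adjustment points described above can be implemented as a thin wrapper that refuses silent edits: every change is recorded with who, when, old value, new value and why. A minimal sketch; the record layout and field names are illustrative assumptions:

```python
from datetime import datetime, timezone

audit_log = []

def adjust(record, field, new_value, user, reason):
    """Apply a manual adjustment, recording an audit trail entry."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "field": field,
        "old_value": record[field],
        "new_value": new_value,
        "reason": reason,
    }
    audit_log.append(entry)  # every manipulation stays traceable
    record[field] = new_value
    return record

row = {"region": "EMEA", "revenue": 10500.0}
adjust(row, "revenue", 10250.0, user="j.smith", reason="Late credit note")
print(row["revenue"], audit_log[-1]["old_value"])  # → 10250.0 10500.0
```

Because every entry names a user and a reason, a figure in the final report can be traced back to the individual adjustment that produced it, which is exactly the credibility the recommendations below call for.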
 
Our Recommendations:
  • Avoid uncontrolled manipulation and adjustment of data.
  • Define points along the data supply chains where human interaction is expected and allowed. Define rules around what can change and who can change it.
  • Encourage workflow within the organisation to review and approve or decline changes to critical data, e.g. a change that will impact the report should be reviewed and signed off by a responsible person.
  • Ensure changes are logged and auditable so that the reporting can be traced back to the individual changes that have been made. This will facilitate the changes/adjustments while maintaining credibility.

Webinar

You can watch a recording of a webinar that we presented on data sourcing best practices below:

 
