Balancing The Dichotomy Of Trust And Speed

Are you constantly torn between delivering data that can be trusted and the conflicting requirement to deliver insights at high speed? Read on as we explore how to achieve equilibrium between these two objectives and optimise the value you get from your data.

The only thing that really matters is business value. Your organisation’s leadership doesn’t care how much efficiency your fancy new tool adds, or about the framework you’ve implemented to produce consistent, good-quality analytics outputs that can be trusted.

But value looks very different in different scenarios. In some cases, value means data that is fully compliant, governed to a “T” and of very high quality. In other cases, the requirement is to produce something quickly, even if it’s a little rough around the edges. The key to producing business value is to understand the requirement from a business point of view and to be able to articulate clearly how that requirement will produce value. Then establish the balance between trust and speed to manage expectations for delivery pace versus data quality.

Let’s explore the concepts of “trust” and “speed” before we dive into the helpful tips on how to optimise data value in your environment. These two concepts may be broader than you think.

Trust

For many data professionals, data trust relates only to data quality. However, many other concepts need to be brought together before we can say we are providing trusted data wherever it is consumed.

Some of these important data trust components are:

  • Data Quality – Is the data I am using complete? Is it accurate? Is it the latest available version of this data? (A minimal automated check of this kind is sketched below.)
  • Data Lineage – Can I tell where this data came from? Is it internally or externally sourced?
  • Business Glossary – Can I tell which chart shows our customers and which shows our distributors (if we sell both to end consumers and through channels)? And are we using those concepts (terms) consistently in all our analytics?
  • Business Rules – Do the analytics results I am looking at correctly apply the rules defined for these calculations, and are we applying those rules consistently in other, similar reporting?
  • Data Ownership – Is ownership of the data I am looking at clearly defined? If the customer categories I am analysing are wrong, can I log a support query knowing that someone at the other end will take care of the required update within a reasonable timeframe? Will I be kept informed of progress on my support call?

The above list is not exhaustive, and establishing a comprehensive data trust environment is a multi-faceted, multi-year journey.
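To make the Data Quality component concrete, here is a minimal sketch of an automated completeness and freshness check. It assumes a pandas DataFrame of customer records with hypothetical columns customer_id, email and updated_at; the thresholds are illustrative rather than prescriptive.

```python
# A minimal sketch of an automated data quality check. Column names
# (customer_id, email, updated_at) and thresholds are illustrative.
from datetime import datetime, timedelta, timezone

import pandas as pd


def check_quality(df: pd.DataFrame, max_age_days: int = 7) -> dict:
    """Return simple completeness, uniqueness and freshness indicators."""
    completeness = 1.0 - df["email"].isna().mean()           # share of rows with an email
    duplicates = int(df["customer_id"].duplicated().sum())   # repeated business keys
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    stale = int((pd.to_datetime(df["updated_at"], utc=True) < cutoff).sum())
    return {
        "completeness": round(float(completeness), 3),
        "duplicate_keys": duplicates,
        "stale_rows": stale,
        "passed": completeness >= 0.95 and duplicates == 0,
    }


if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "customer_id": [1, 2, 2, 3],
            "email": ["a@x.com", None, "b@x.com", "c@x.com"],
            "updated_at": ["2024-01-01", "2024-06-01", "2024-06-01", "2024-06-02"],
        }
    )
    print(check_quality(sample))  # flags the missing email and the duplicate key
```

Checks like this can run on every load, turning the Data Quality questions above from ad-hoc judgement calls into routine, reportable signals.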

Speed

The three core concepts of speed are:

  • Time to insight and action (e.g. real-time delivery, batch delivery, action-driving workflows).
  • The speed at which we can deliver new solutions (i.e. our development pace).
  • The speed at which we can adapt to change (i.e. when something unforeseen arises, how quickly we can make a change or respond to and resolve a query).

Optimising Data Value

The following helpful tips provide some pointers for how you can bring equilibrium between trust and speed to your data environment to optimise the value that you deliver.

1. Begin with the end in mind

Architecture plays an important role in knowing where you are headed. Just as you would never start building a house without a plan, you shouldn’t start building in your data environment without a clearly defined architecture. But you also need to understand the business requirement thoroughly, and be able to articulate its business value, in order to have a clear view of where you are headed with a project or requirement.

2. Let the business define value

It is important to connect everything that you deliver directly to key business objectives. If the business defines value as something quick-fire and rudimentary, you need to dance to that tune. If the business operates in a compliance-heavy industry where consumer data abounds and privacy is a major concern, the right processes for building trust in data privacy will be important.

3. Build “just good enough”, then move on

The pull data engineers feel towards technical excellence is inescapable; it’s just how they are wired. We need to define “just good enough” for our environment and then move on quickly from there. Incorporate this definition into your planning processes so that team members know when to keep polishing and when to stop and move on.

4. Learn to embrace “technical debt” and “process debt”

Following on from the previous point, building “just good enough” and then moving on often leaves loose ends untied. This is referred to as “technical debt”, and we often incur “process debt” when we don’t stop to define processes to manage what we have built. This is acceptable and necessary, so embrace it. Just make sure you track your technical and process debt and plan to come back to it when the time is right.
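One lightweight way to honour that commitment is a simple debt register. The sketch below models one in code purely for illustration; a backlog tool or spreadsheet works just as well, and all entries, field names and dates are made up.

```python
# A minimal sketch of a technical/process debt register. All entries,
# field names and dates are illustrative assumptions.
from dataclasses import dataclass
from datetime import date


@dataclass
class DebtItem:
    description: str
    kind: str          # "technical" or "process"
    raised_on: date
    revisit_by: date   # when we plan to come back to it
    owner: str


register: list[DebtItem] = [
    DebtItem("No retry logic on the nightly customer load",
             "technical", date(2024, 6, 1), date(2024, 9, 1), "data-engineering"),
    DebtItem("No documented support process for category updates",
             "process", date(2024, 6, 1), date(2024, 12, 1), "data-governance"),
]

# Surface items whose planned revisit date has passed.
for item in (i for i in register if i.revisit_by < date.today()):
    print(f"OVERDUE ({item.kind}): {item.description} -> {item.owner}")
```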

5. Define processes and templates to predict, measure and optimise value

When attempting to optimise value, follow this three-step process of predicting, measuring and then optimising. This lays the groundwork and makes the optimisation step easier and more effective.

Step 1: Predicting is the business case that you draw up to get approval for the data project you are embarking on. Document intangible and hard-to-quantify benefits in your business case as well, because financial benefits are often difficult to determine and measure directly.

Step 2: Measuring is what you do once a solution is deployed and in use. Build the measurement capability into the solution itself so that it is easy to determine and report on value. If you can’t easily measure value, you should question whether this is a solution you should be involved in building.
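As one illustration of building measurement into the solution itself, the sketch below records a usage event every time a (hypothetical) report is served, using SQLite purely to keep the example self-contained.

```python
# A minimal sketch of measurement built into a solution: record a usage
# event each time a report is served. The event fields, the SQLite store
# and the "churn-dashboard" solution name are illustrative assumptions.
import sqlite3
from datetime import datetime, timezone


def init_store(path: str = "usage_metrics.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS usage_events "
        "(ts TEXT, solution TEXT, user TEXT, action TEXT)"
    )
    return conn


def record_usage(conn: sqlite3.Connection, solution: str, user: str, action: str) -> None:
    """Append one usage event; cheap enough to call on every request."""
    conn.execute(
        "INSERT INTO usage_events VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), solution, user, action),
    )
    conn.commit()


if __name__ == "__main__":
    conn = init_store(":memory:")
    record_usage(conn, "churn-dashboard", "analyst_01", "viewed")
    count = conn.execute("SELECT COUNT(*) FROM usage_events").fetchone()[0]
    print(f"{count} usage event(s) recorded")
```

With events like these accumulating from day one, reporting on adoption and value becomes a query rather than a retrospective guessing exercise.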

Step 3: Optimising is now much easier, because you have documented benefits in the business case and a clear measurement of the value created by the solution. If you routinely review these for each data solution and roll them up to an overall environment level, you will clearly see which solutions offer opportunities to optimise value, because their actual benefits are falling short of your expectations.
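The roll-up itself can be as simple as comparing each solution’s predicted benefit against its measured value and flagging the gaps. All figures and solution names below are made up for illustration.

```python
# A minimal sketch of rolling up predicted vs. measured benefits across
# solutions. All figures, names and the 80% threshold are illustrative.
predicted = {"churn-dashboard": 100_000, "pricing-model": 250_000, "ops-report": 40_000}
measured = {"churn-dashboard": 120_000, "pricing-model": 90_000, "ops-report": 35_000}

for solution, target in sorted(predicted.items()):
    actual = measured.get(solution, 0)
    ratio = actual / target
    flag = "optimise" if ratio < 0.8 else "on track"
    print(f"{solution:16} predicted={target:>8,} actual={actual:>8,} ({ratio:.0%}) -> {flag}")
```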

As you start testing these tips and seeing the benefits, add these activities to your standard processes so that they become routine and you can reap ongoing benefits from them. Remember that anything you don’t document and enforce will not be done routinely: people don’t do what you manage; they do what you measure. So document and enforce, and then measure and monitor.

Optimising data value is a fine balance that requires skill, experience and often some trial and error. Try these tips in your environment and you will start seeing your data value delivery accelerate with great benefit to your organisation.

Written by Karl Dinkelmann, CEO, Nexus Data