Big data, small data, sour data

11.5.2016

Digitalisation: a slightly annoying word, but an unavoidable phenomenon. One thing is certain: in this digital world, data is both the fuel and the product of every interaction.

And here we should pause for a moment.

A private sector company or a non-profit organisation must be confident that the data it uses to run processes, conduct business, make decisions and serve customers is valid and accurate. It simply must be able to trust that this is so.

But valid data doesn’t just magically happen.

If data is not reliable, there are two alternatives: either you believe your data but it leads you astray, or (as too often happens, unfortunately) you have data at your disposal, but need an army of Excel artisans to arrive at a satisfactory collective understanding.

“We won’t get the figures before Tuesday.”
“That is accurate enough.”
“Wait, I’ll make one more small calculation.”
“Do you think you could double-check this one more time before I send it forward? Thanks.”

For a while now, a fundamental principle of the data-centric world has been GIGO: Garbage In, Garbage Out. In short, it means that if the incoming data is of poor quality, the data derived from it will be, at best, of equally poor quality.

The first task in correcting this: ensure you receive good quality incoming data.
It’s time to set a Cerberus at the gates to admit only data that arrives in confidence-inspiring condition. That’s a good place to start, as long as we remember that several streams flow into our data lake, and their number shows no sign of drying up any time soon. But don’t let that dampen your spirits. It will work out: practices and tools are available.
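The gatekeeping idea can be sketched in a few lines of code. The following is a minimal illustration, not a production validator: the field names (`customer_id`, `amount`, `timestamp`) and the rules are hypothetical, standing in for whatever an organisation’s own data contracts require.

```python
from datetime import datetime

# Hypothetical contract: which fields every incoming record must carry.
REQUIRED_FIELDS = {"customer_id", "amount", "timestamp"}

def validate_record(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record may pass."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        problems.append("amount is not numeric")
    if "timestamp" in record:
        try:
            datetime.fromisoformat(record["timestamp"])
        except (TypeError, ValueError):
            problems.append("timestamp is not ISO 8601")
    return problems

def gate(records):
    """The Cerberus: split incoming records into accepted and rejected."""
    accepted, rejected = [], []
    for record in records:
        problems = validate_record(record)
        if problems:
            rejected.append((record, problems))
        else:
            accepted.append(record)
    return accepted, rejected
```

The important design choice is that rejected records are kept together with the reasons for rejection, so the sour data can be sent back to its source and fixed, rather than silently dropped.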

The second task – and a bigger concern: the cursed data might turn to mush.
The context in which organisations operate is changing. New players are constantly joining the game, one system, one application, one microservice at a time, producing, using or changing existing data.

The worry is not so much the entropy that applications cause as the fact that data may well pass its best-before date.

All of these factors impact data flow and must be closely monitored. The longer the time frame, the greater the number of layers to drill through; the more bits and bytes of data combined in the analysis, the more closely the data needs to be audited. We are able to dig up yesterday’s sales – fortunately we have the transaction data. But the deeper we go into all the other operative data, its integrity and the links between datasets, the greater the degree of difficulty. It still needs to be done. A great deal of effort has gone into integrating different systems, but if the data is out of date, no one benefits.
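One concrete way to watch over best-before dates is to give each dataset a freshness budget and check it against the last refresh time. This is a sketch under assumed names and thresholds – the dataset names and the budgets here are invented for illustration, and the fallback budget of seven days is an arbitrary default.

```python
from datetime import datetime, timedelta

# Hypothetical freshness budgets: how old each dataset may be before
# it counts as past its best-before date.
MAX_AGE = {
    "sales_transactions": timedelta(days=1),
    "customer_master": timedelta(days=30),
}

DEFAULT_MAX_AGE = timedelta(days=7)  # arbitrary fallback for unlisted datasets

def stale_datasets(last_refreshed: dict, now: datetime) -> list[str]:
    """Return the names of datasets that have passed their best-before date."""
    return [
        name
        for name, refreshed_at in last_refreshed.items()
        if now - refreshed_at > MAX_AGE.get(name, DEFAULT_MAX_AGE)
    ]
```

Run on a schedule, a check like this turns “the data might have gone to mush” from a vague fear into a daily report with names on it.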

Get the basics right

Forecasts and analyses, process automation, accuracy and ease of reporting, turning data into information, Watson’s diagnoses, and everything good and beautiful are within reach. They are. But they will not come on their own. The application of data requires new roles and processes. If we want to rise to the next level of business development in our digital world, we must first get the basics right. And it’s not just about the question of big data. We also need to pay heed to small and medium-sized data.

The issue is neither new nor insurmountable. It’s actually a very positive thing. A happy thing! When you pay attention to your data, it shows you its secrets: you might obtain insights. Take care of your data. Watch over it. You can’t thrive without it.

Janne Huovilainen

I’m a digital commerce consultant, and my current tenure has lasted over twenty years. I’m a process-oriented humanist, and for a living I ponder the phenomena of data with our customers.
