
 Tags: Risk Management

Luis Passos Carvalho

For decades, IT departments have urged users to look at aggregates of data for business analytics instead of the more granular data available in the source systems. There were many technological reasons for this, of course, but the business merit of it should not be forgotten. Highly detailed data, in all of its complexity, can hide the very insight you're looking for: you fail to see the forest because you can only see the trees. If I have hundreds of signals fighting for my attention, do I focus on a few and ignore the large majority, or, at the other extreme, spread my attention thin looking into every single one, accumulating an ever-increasing backlog until, for all intents and purposes, I'm dead in the water, unable to decide or react in a timely fashion?

The big data technologies that are now readily available have removed most of the technological arguments that previously made looking at detailed data an unattractive prospect. And so we must ensure that we can extract the necessary value from the highly detailed data we can now deliver.

Business Intelligence (BI) tools have already begun providing a much-needed window into these huge repositories of unstructured data, but this doesn't address a basic problem: our brains haven't evolved at the same rate as our technology. We simply can't cope with the speed at which our systems generate data. Even though these tools can show us the data that is present, we still need to limit the number of data points shown to something our brains can comprehend. Each report we look at shows us a portion of reality, much like looking into a room through a keyhole. And so we spend our days peeking through the keyhole, time and again, at slightly different angles, ensuring that everything is OK, while knowing that, the vast majority of the time, there is nothing to be seen.

So what is best: detail or aggregate? To effectively monitor and control our business, new techniques need to be put in place to act as a buffer so that we are not washed away in a flood of data. Now more than ever, our systems need to be not just reporting tools but true assistants that look at the detailed data, discard the noise and the irrelevant, and bring to our attention only the events that really need our decisions. Ideally, our systems should prepare a dossier of relevant data to help us in our decision making. They should take into consideration what we decided in similar situations, what data we looked at then, and what is different now, while still allowing us the flexibility to look deeper into the data for more complete information.

An Enterprise Business Assurance system is perfectly placed to fill this role. It shows us the high level indicators of the health of our business while, at the same time, sifting through all the detailed data that we now have available to us. It searches for the early signs of things that are not going according to our expectations and brings them to our attention so that we can focus on the details that matter.
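To make this sifting idea a little more concrete, here is a minimal, hypothetical sketch in Python. It is not taken from any actual Enterprise Business Assurance product; the metric names, the per-metric history, and the three-standard-deviation tolerance are all illustrative assumptions. The idea is simply to compare each detailed metric against the range its own history leads us to expect, and to surface only the exceptions.

```python
from statistics import mean, stdev

def surface_exceptions(history, latest, tolerance=3.0):
    """Return only the metrics whose latest value falls outside the expected range.

    history: dict mapping metric name -> list of past values
    latest:  dict mapping metric name -> most recent value
    """
    alerts = []
    for name, value in latest.items():
        past = history.get(name, [])
        if len(past) < 2:
            continue  # not enough history yet to form an expectation
        baseline, spread = mean(past), stdev(past)
        lower, upper = baseline - tolerance * spread, baseline + tolerance * spread
        # Discard the noise: values inside the expected range need no human attention.
        if not (lower <= value <= upper):
            alerts.append({"metric": name, "value": value,
                           "expected_range": (round(lower, 2), round(upper, 2))})
    return alerts

# Thousands of detailed counters can flow through this; only genuine surprises surface.
history = {"dropped_calls": [12, 15, 11, 14, 13],
           "billed_minutes": [980, 1010, 995, 1002, 990]}
latest = {"dropped_calls": 13, "billed_minutes": 1450}
print(surface_exceptions(history, latest))  # only billed_minutes is flagged
```

In a real deployment the expectation would of course be far richer, taking in seasonality, what we decided in similar situations before, and which data we looked at then, but the principle is the same: all of the detail is examined, and only what deviates from expectation asks for our attention.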

