
Do we see what we want to see? Unexpected results in data analytics

Businesses depend on the objectivity of data to inform better decision making, so what happens when you get unexpected results? Aquila Insight resists the temptation to see only what we want to see.

Data analytics has transformed the way businesses measure and understand human behaviour, helping them answer complex analytical questions and drive informed decisions. A data scientist’s role involves examining and manipulating large datasets to uncover unknown patterns, hidden correlations and information useful to the business.

Unexpected results

What happens, though, when significant money, time and effort have been spent on an analysis that ultimately fails to reveal useful results? What if the outcome cannot be translated into actionable insight? Or the data only tells us what we already know, without uncovering any interesting, eye-catching results?

Scientists have been known to adjust parts of the analytical process in order to arrive at a desirable outcome.

Oversimplifying the analysis, cherry-picking which outcome to measure, presenting results in isolation, and tweaking the methodology, sample size or time period are all tricks analysts can resort to in order to ‘do the job’.
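To see why cherry-picking which outcome to measure is so misleading, consider this small simulation (an illustrative sketch, not anything from a real client analysis): twenty metrics are compared between two groups drawn from the same distribution, so every observed “lift” is pure noise. Reporting only the largest one makes nothing look like something.

```python
import random
import statistics

random.seed(42)

def simulate_metric(n=30):
    # Control and treatment drawn from the SAME distribution: no real effect.
    control = [random.gauss(0, 1) for _ in range(n)]
    treatment = [random.gauss(0, 1) for _ in range(n)]
    return statistics.mean(treatment) - statistics.mean(control)

# Measure 20 unrelated outcomes, all pure noise.
diffs = [simulate_metric() for _ in range(20)]

# Cherry-picking: report only the largest observed "lift" in either direction.
best = max(diffs, key=abs)
print(f"Largest lift among 20 null metrics: {best:+.2f}")
print(f"Average lift (the honest summary):  {statistics.mean(diffs):+.2f}")
```

The more outcomes you test, the larger the most extreme one will be by chance alone; the honest summary is to report all of the comparisons made, not just the flattering one.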

Don’t always start with the end in mind

Sometimes it can be useful, at an early stage of the analysis, to have in mind what the end result will look like. However, when the data doesn’t show what we expect, or the outcome is not positive enough, do we try to make it look like what we originally thought? Do we ignore the observed outcome? Do we simply see what we want to see? And is what is presented back a true representation of what we are trying to measure?

Confirmation bias and data analysis

Lately, I’ve been reading lots of scientific studies supporting all the amazing health benefits of coffee. Of course, many other studies show the exact opposite. The truth probably lies somewhere in between, but as someone who can’t live without this magic poison, I choose to believe everything that is good. I fail to see the flaws in the analytical process: the small sample sizes, the biased populations and so on.
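Small sample sizes are worth dwelling on, because the problem is easy to demonstrate. In this hypothetical sketch there really is a modest positive effect of +0.2, yet repeated small studies estimate it so noisily that individual studies can point in opposite directions, which is exactly how contradictory coffee headlines arise.

```python
import random
import statistics

random.seed(1)

def observed_effect(n):
    # The true effect is a modest +0.2; each study estimates it from n subjects.
    sample = [random.gauss(0.2, 1) for _ in range(n)]
    return statistics.mean(sample)

spreads = {}
for n in (10, 100, 10000):
    # Run five independent "studies" at each sample size.
    estimates = [observed_effect(n) for _ in range(5)]
    spreads[n] = max(estimates) - min(estimates)
    print(f"n={n:>5}: estimates {[f'{e:+.2f}' for e in estimates]}"
          f" (spread {spreads[n]:.2f})")
```

With ten subjects per study the estimates scatter widely around the truth; with ten thousand they cluster tightly. Any single small study, read in isolation, tells us very little.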

Objective data

As data analysts, we ought to be as honest as possible in our analysis and interpretation of results, setting aside personal preferences and external pressures. What we should communicate to clients is what the data tells us and what we believe to be closest to the general truth. At Aquila Insight, we know that businesses depend on objective data for better decision making.

At every stage of an analysis, from scoping and design through data collection and preparation to statistical analysis and dissemination, we strive to keep our findings and what we communicate protected from such pressures. Even if, sometimes, the results are not entirely what we expect or would wish for.