The uncertainty principle in analytics

Everyone loves data. But there is a hidden problem lurking underneath the increasing reliance on analytics in marketing and design: the more we measure, the less we seem to actually know, and sometimes more measurement just ends up making things worse. This got me wondering about the tension between art and science.

Werner Heisenberg was one of the creators of quantum physics. In 1927 he published the “uncertainty principle” for which he is now best known. The principle states: “It is impossible to determine accurately both the position and the velocity of a particle at the same instant.” Electrons are particularly pesky because to know where one is at any given time you have to stop it and measure its location. However, if you want to know how fast an electron is moving, then you have to let it run free and measure its speed instead. The uncertainty principle says that you can’t know both the speed and the location of an electron at the same time, and that you have to trade off between these two types of knowledge.

The uncertainty principle is a useful way of understanding the limits of human knowledge in any area where you need to balance the accuracy of a measurement against the effort put into taking it. The principle applies to marketing and design because the more precisely you want to measure user behaviour, the more impact the measurement has on the user (which can then distort their behaviour). There are several types of hidden tradeoffs that people don’t realise they are making when they add more analytics to their business.

1. Tradeoffs that impact behaviour

In email marketing, we use tracking links to tell whether someone has clicked a particular link in an email. The extreme versions of these links can identify a click down to an individual user, their device and even their geographic location. But to do this type of analysis you need to redirect the user through a link cloaking tool or a link shortener. The tradeoff is that spam filters hate these link cloaking tools, because scammers can use them to send people to random destinations instead of wherever the user thought they were going. When you add link tracking to your emails you gain better analytics, but lose deliverability, because some email servers may decide that you could be a scammer.
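To make the mechanics concrete, here is a minimal sketch of how a tracking redirect link might be built. The domain, parameter names and helper functions are all hypothetical, not taken from any particular email tool:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical tracking domain: the reader sees this URL, not the destination.
TRACKER = "https://click.example.com/r"

def make_tracking_link(destination: str, user_id: str, link_id: str) -> str:
    """Wrap a destination URL in a per-user tracking redirect."""
    params = {"url": destination, "u": user_id, "l": link_id}
    return f"{TRACKER}?{urlencode(params)}"

def record_click(tracked: str) -> dict:
    """What the tracker's server can log when the link is clicked."""
    query = parse_qs(urlparse(tracked).query)
    return {key: values[0] for key, values in query.items()}

link = make_tracking_link("https://example.com/offer", "user-42", "cta-1")
print(record_click(link))  # identifies the user, link and true destination
```

The true destination is only recoverable by the tracker, which is exactly why spam filters treat links like this with suspicion.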

The balanced compromise is to use UTM tracking codes added to the end of a normal web address. The added parameters can feed data to a tool like Google Analytics without hiding the true destination from the user. But even these links carry a tradeoff because they look ugly and can be confusing to some users, who are less likely to click on them. Again you have to trade measurement for performance.
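As a sketch, appending UTM codes is just ordinary query-string manipulation, which is why the destination stays visible to the user. The URL and campaign names below are invented for illustration:

```python
from urllib.parse import urlencode

def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters to a normal web address."""
    utm = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Use "&" if the URL already has a query string, otherwise "?".
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{utm}"

print(add_utm("https://example.com/offer", "newsletter", "email", "spring_sale"))
# https://example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring_sale
```

The resulting link is honest but long, which is the "ugly and confusing" tradeoff described above.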

Tradeoffs where increased measurement decreases performance are the most common and most harmful instances of the uncertainty principle in marketing. Modern marketing asks: if a tree falls in the woods and there were no analytics to measure it, did it really fall? But too much measurement can accidentally hurt your campaign.

2. Tradeoffs that impact performance

One of the most heated (and obscure) arguments that I have ever had in my professional life was about the placement of the Google Analytics tracking pixel on a website.

If you place the tracking code at the top of a page, then Google Analytics will load before the content and will do cool stuff like measuring page load times and tracking people even if they leave before the page has loaded. The tradeoff is that this can cause tiny delays that slow down the loading time of the web page. If you put the tracking code at the bottom of the page, then you gain speed, but you lose precision in your tracking.

The compromise is supposed to be a technique called “lazy loading”, which lets the content and the tracking load at the same time without competing with each other. But I’ve found time and again that this is a false promise: lazy loading makes the page even slower and the tracking worse, so no one wins.

Over the years, the school of hard knocks has taught me to look for the “minimum viable tracking”: the least disruptive tracking you can install that still lets your team make meaningful decisions, improving the website design by more than the cost imposed by the tracking tools themselves.

3. Tradeoffs that impact design

The last type of uncertainty principle is much more subtle. Sometimes adding more tracking can lead you down a path to making bad design decisions. The popular case studies for proving the value of analytics always seem to involve A/B testing the colour of a button and discovering that one colour outperformed the other.

A/B testing is the process of showing two versions of a website to users and testing to see which version performs better. This is seductive because it reduces soft skills like design, copywriting and brand strategy down to the level of provable mathematics. But all this testing seems to lead to incremental improvements rather than true breakthroughs.
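For a sense of the mathematics being invoked, here is a sketch of the two-proportion z-test that typically sits behind a button-colour A/B test. The visitor counts and conversions below are made up:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Compare two observed conversion rates; returns (z, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented figures: button A converts 120/2400 visitors, button B 150/2400.
z, p = two_proportion_z_test(conv_a=120, n_a=2400, conv_b=150, n_b=2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Note what the test does and does not say: it can tell you one button outperformed the other on this metric, but nothing about whether either design was a breakthrough.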

Optimising only a small part of an overall process can have surprising knock-on impacts. I’ve seen this in a user sign-up process where making the first step easier ended up hurting the last step so much that the overall process was less effective. Mathematicians call this seeking a “local optimum” instead of a “global optimum” (which would include the entire system). Local optimisation may feel good at the time, but it usually has a long-term tradeoff.
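The local-versus-global point can be shown with toy funnel arithmetic; all of the conversion rates below are invented:

```python
# A funnel's overall conversion is the product of its step conversion rates.
def overall_conversion(step_rates):
    total = 1.0
    for rate in step_rates:
        total *= rate
    return total

before = [0.40, 0.50, 0.30]  # sign-up funnel: step 1 x step 2 x step 3
after  = [0.60, 0.50, 0.15]  # easier first step, but the last step suffers

print(f"before: {overall_conversion(before):.1%}")  # before: 6.0%
print(f"after:  {overall_conversion(after):.1%}")   # after:  4.5%
```

Step 1 improved by half, yet the whole funnel got worse: a local optimum that moved the system away from the global one.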

Unfortunately, measuring and optimising an entire system can require deep integration between marketing analytics and CRM systems, which is too much for a small team. So most startups are left optimising each step in isolation and hoping that the entire system falls into place eventually.

Finding the balance

I love the customer insights that good data analysis can bring, but these days I’m much more conscious of the tradeoffs that the uncertainty principle brings with it. Every insight comes with a hidden cost. So I treat every chance to observe user behaviour as a golden opportunity worth making the most of. Good user data is a terrible thing to waste.

Recently I’ve been finding that the best user data doesn’t come from tracking tools and analytics platforms; it comes from the company’s own databases.

Your own transaction records, user databases and CRM could be hiding a goldmine of user insights that don’t carry such a heavy analytics tradeoff. So instead of always looking to add more tracking, maybe try making better use of the data that you already have.