Or to be more accurate, the abundance of analytics and the paucity of insights.
“I’m out of ideas for what to add,” he said with a frown. “I have a feeling we’re eternally this close to the answer,” he said, holding his fingers a whisker apart, “but I just can’t imagine what we’re missing.”
“Can you show me what your analytics look like?” I asked. He nodded and led me into a conference room. It instantly looked familiar.
A Beautiful Mind
“A perfection of means, and confusion of aims, seems to be our main problem. I fear the day technology will surpass our human interaction. The world will have a generation of idiots.” ― Albert Einstein
The walls of the war room were a crazy quilt of Excel reports, tacked up to the wall in rows one after another, after another, after another. I couldn’t make heads or tails of the color-coding system, but one thing was instantly clear: that analytics team was working their tails off.
CEO: (sigh) “There are no insights in here, are there?”
EXEC: “No, ma’am. This is where insights come to die.”
There were reports on paid search and organic search. There were reports for content marketing, for email campaigns, for video, for banner ads. There were separate reports for paid social – lots of Facebook ads – and another for organic social. Data was broken out by day part, by site engagement, by day-by-day site traffic benchmarked against year-ago, traffic by source, traffic by content, top viewed pages, site performance by channel (online video, direct, search, etc.), top 5 exited pages, monthly display overview, monthly search overview, social overview and conversions by channel. There were white boards everywhere, with lots of flow chart arrows.
The Data Crack Den
I had a feeling that if I kept looking I’d eventually find 27 8×10 color glossy pictures with circles and arrows and a paragraph on the back of each one explaining what each one was to be used as evidence against us (maybe over there by the Group W bench – not to be confused with the Group M bench), but I had to stop. I was literally getting dizzy.
Luckily, it was right at that moment that I saw the first thing on the wall that was truly revealing: a three-page list of objectives and sub-objectives, segments and sub-segments… and extensive notes about potential new data sources.
I realized I had stumbled into a data crack den. This wasn’t about information any more. The team had rounded the corner into addiction.
As gently as I could, I asked, “Would you be willing to look at this another way?”
He said, “Of course, everything’s on the table.”
I explained that in my experience we almost never get to insight by starting with data. It’s like looking through the wrong end of the telescope. Data often has intriguing pieces of the answer, but at the outset we don’t need answers.
The first and most important step of any analytics process is figuring out what the right questions are.
That’s why when I think about analytics, I always start with questions first.
Cunniff’s “Three Things” Method
I have a simple method for wringing insights out of data. I don’t try to know everything. I don’t even try to know very much. Instead, I pick just three things I want to know.
Don’t worry if you’re not a data geek: everything I’m about to say is in plain English.
Step 1: Instead of starting with data, ask “if I could wave a magic wand and instantly know anything in the universe about my customer, what are the Three Things I’d Like To Know?” Note that this question is not “what data would I like to have?”, “what data is available?”, or “what are the 8,924,081 things I’d like to know?”
It’s “What are the Three Things I’d Like To Know?”
Step 2: Are my Three Things I’d Like To Know actionable?
For example, if I knew exactly how many haircuts each of my customers would have in 2017, what actions would I take for my brand today? If I don’t instantly know what I would do – or why I would do it – that is a clear signal that my “What I’d Like To Know” question will not lead to an actionable insight.
If you know exactly what you’d do if you had the answer, then you know you’re asking the right kind of question.
Step 3: Looking at my refined list of Three Things I’d Like To Know, I ask three diagnostic questions.
- Can the answers be known?
- How precise must the answers be?
- What is the minimal directional answer that is useful?
Step 4: NOW we move to data. Here again, there are three diagnostic questions.
- What data do we already have that might answer the Three Things I’d Like To Know?
- Look online AND offline – sometimes you’re lucky and you’ll find you’ve had the answer to one of those questions all along.
- What additional data might add useful dimension to those answers?
- Will paying (in dollars or staff time) to obtain this extra data improve our results enough to pay for the effort? If the team is less than 50% certain that it will, wait six months before deciding to invest in additional data.
Step 5: Now, let’s build a dashboard.
The secret to a good dashboard? Smaller is better. More lights on the dashboard are a good thing only if your business sells light bulbs for dashboards.
How do you keep a dashboard focused?
- Report ONLY on the data that relates to the Three Things I’d Like To Know.
- Do NOT include data that is unrelated. Capture it, store it for later, but do NOT report it.
- Do NOT include data that you wouldn’t know how to act on (e.g., if you have no clue what you would do if you learned that people hovered over a button for 10 seconds versus 1 second, do not report it).
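For teams whose dashboards are built in code rather than in spreadsheets, the rules above can be sketched in a few lines of Python: map each of the Three Things to the handful of metrics that answer it, then let everything else fall through without being reported. The question labels and metric names below are hypothetical examples, not a prescription.

```python
# A minimal sketch of a "Three Things" dashboard filter.
# All question labels and metric names are hypothetical examples.

# Each of the Three Things I'd Like To Know, mapped to the few
# metrics that actually help answer it.
THREE_THINGS = {
    "Which channel drives repeat purchases?": [
        "conversions_by_channel", "repeat_purchase_rate",
    ],
    "What content keeps visitors on the site?": [
        "top_viewed_pages", "time_on_page",
    ],
    "Is email still worth the effort?": [
        "email_open_rate", "email_conversions",
    ],
}

def build_dashboard(all_metrics: dict) -> dict:
    """Keep only the metrics that map to one of the Three Things.

    Everything else is still captured and stored for later,
    but deliberately NOT reported.
    """
    wanted = {m for metrics in THREE_THINGS.values() for m in metrics}
    return {name: value for name, value in all_metrics.items()
            if name in wanted}

# Example: a typical wall-of-reports metrics dump.
all_metrics = {
    "conversions_by_channel": {"search": 120, "social": 45},
    "top_viewed_pages": ["/pricing", "/services"],
    "button_hover_seconds": 4.2,        # not actionable -> excluded
    "monthly_display_overview": "...",  # unrelated -> excluded
    "email_open_rate": 0.31,
}

dashboard = build_dashboard(all_metrics)
# Only the metrics tied to the Three Things survive the filter.
```

The point of the sketch is the shape of the decision, not the code: the dashboard definition starts from the questions, and every metric has to justify its place by pointing back to one of them.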
Deep Analytics Require You To Take A Deep Breath
If you want more and better insights, you can get them. All you really need is the ability to take a deep breath, a willingness to ask simple questions, and a little bit of courage.
If I can help with any of that, just let me know.