So you’ve got forty squillion clicks, or engagements, or opens, or batted eyelashes, or whatever the hottest metric is this month. Are you basking in the warm glow of success … or is that just the heat from a flaming pile of money?
All right, we’re going to let you in on something big.
The secret ingredient.
The magic formula.
The seven herbs and spices.
Want effective online marketing?
Here’s how to make it happen. Drumroll please…
… you MEASURE EFFECTIVENESS.
What – you thought there was going to be some kind of newfangled tool, or a hot new social channel, or a genius data robot or something?
No. Getting good at online marketing means doing the following:
Step 1: Clearly define success.
When building any campaign, you need to work out specific KPIs for each channel. Then you need to make sure you're reporting on those agreed KPIs and ignoring any peripheral BS.
What’s best to measure will change depending on the brand you’re working on and the goals for a particular campaign or piece of advertising. For example, sometimes it’s entirely appropriate for a KPI to simply be ‘brand awareness’. In one campaign we ran recently, we decided to put 80% of the focus on awareness (what we good-naturedly call ‘Insta vanity’). We measured how many people viewed the posts, because that was what mattered most for that particular promotional phase.
If signups matter, measure signups. If clicks matter, measure clicks. If shares or sends matter, measure those. If it's a combination of things (as it usually will be), measure those things specifically and don't worry about the rest. More data doesn't necessarily mean more insight – often it simply means more confusion and less focus on what really matters.
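If it helps to see it spelled out, here's a rough sketch in Python of what 'report on the agreed KPIs and ignore the rest' can look like. The channel names, KPI names and numbers are all made up; the point is that the list gets agreed up front and the report only ever shows what's on it.

```python
# A minimal sketch of 'define success first, report only on that'.
# Channel names, KPI names and figures below are hypothetical.

AGREED_KPIS = {
    "instagram": ["post_views", "follower_growth"],
    "edm":       ["open_rate", "signups"],
    "search":    ["signups", "cost_per_signup"],
}

def report(channel: str, raw_metrics: dict) -> dict:
    """Keep only the KPIs we agreed to measure; drop the peripheral noise."""
    return {k: v for k, v in raw_metrics.items() if k in AGREED_KPIS[channel]}

# A raw analytics dump, vanity numbers and all.
raw = {"impressions": 450_000, "post_views": 38_000,
       "likes": 5_200, "follower_growth": 610}

print(report("instagram", raw))
# {'post_views': 38000, 'follower_growth': 610}
```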
Which leads us on to …
Step 2: Avoid ‘bullshit metrics’ and focus on ‘true data’.
If you’ve spent any time at all in the world of digital and analytics, you’ll nod your head knowingly when we mention the term ‘bullshit metrics’.
Bullshit metrics have been around since the days of dial-up – they change as the industry does, but they’re as old as the Internet. In the early days of banner ads, plenty of clients went away happy after being told their campaign had been ‘shown’ 45,000-odd times. Never mind that 18,000 of them were below the page fold, 20,000 were multiple showings, and only 16 people clicked on the damn thing. Ditto any agency that used to talk about ‘hits’, or focused on views rather than unique browsers back in the day.
The metrics may have changed, but there are still plenty of stats around that need to be carefully considered or looked at in a wider context. A Google Ads campaign with 500 clicks might look awesome for the budget, until you realise the ad's been shown half a million times, and 498 of those clicks were people with fat fingers who bailed before your site loaded.
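To put numbers on that example (the clicks and impressions are the ones above; the $1,000 budget is an assumed figure), the back-of-the-envelope maths works out like this:

```python
# Back-of-the-envelope maths for the 'cheap clicks' example above.
# The $1,000 budget is a made-up figure for illustration.

impressions = 500_000
clicks      = 500
fat_fingers = 498           # accidental clicks that bounced straight away
budget      = 1_000.00      # hypothetical spend, in dollars

ctr              = clicks / impressions        # 0.10% click-through rate
cost_per_click   = budget / clicks             # looks cheap: $2.00
genuine_clicks   = clicks - fat_fingers        # only 2 people actually stuck around
cost_per_genuine = budget / genuine_clicks     # ouch: $500.00 each

print(f"CTR {ctr:.2%}, CPC ${cost_per_click:.2f}, "
      f"cost per genuine click ${cost_per_genuine:.2f}")
```

Two dollars a click sounds great in a report. Five hundred dollars per visitor who actually stuck around is the number that matters.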
There's a huge number of great data-gathering tools out there. Some measure traffic, some keep track of your database, some monitor eDMs, and still others are concerned with ad sales and post interactions. If it's measurable, there's a tool to measure it. Whether what you get back is meaningful or not is another matter.
So how do you measure effectiveness in a truly multichannel age? There’s no one ‘right’ answer, but here’s how we do it.
Firstly, our clients have access to HubSpot, which means they can see everything we're running and how it's tracking. We think it's important to foster trust, and it's an investment that pays off in the long run as clients become more digital-savvy and understand what we're trying to do.
Our internal analyst, Mariya, also collates digital data across numerous different sources –
Yes, that's a lot of places to pull data from – but seeing them all is the only way to get a clear picture of success. Based on the results, Mariya compiles regular reports focusing on the metrics we've agreed are most important. She summarises what's been happening, gauges how well we've performed against our chosen KPIs, and tweaks the campaigns to optimise them.
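If you wanted to picture that collation step in code, it might look something like the sketch below. The data sources, figures and targets are all hypothetical (in reality each fetch is an export or API call from the relevant platform), but the principle holds: pull everything into one place, then judge it against the KPIs you agreed up front.

```python
# A rough sketch of collating data from several sources into one KPI report.
# The fetch_* functions and all figures below are hypothetical stand-ins
# for whatever each platform's export or API actually returns.

def fetch_crm():        return {"signups": 312, "open_rate": 0.41}
def fetch_social():     return {"post_views": 38_000, "follower_growth": 610}
def fetch_search_ads(): return {"clicks": 500, "impressions": 500_000}

SOURCES = [fetch_crm, fetch_social, fetch_search_ads]

# The KPIs and targets agreed up front for this campaign.
TARGETS = {"signups": 300, "post_views": 35_000, "open_rate": 0.35}

def compile_report() -> dict:
    combined = {}
    for fetch in SOURCES:
        combined.update(fetch())
    # Judge only the agreed KPIs against their targets; everything else is context.
    return {
        kpi: {"actual": combined.get(kpi), "target": target,
              "hit": combined.get(kpi, 0) >= target}
        for kpi, target in TARGETS.items()
    }

print(compile_report())
```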
You might argue it’s too expensive to put this much effort and resource into data measurement.
We'd argue it's far more costly not to properly analyse what you're spending and what you're getting in return.