Just in case you are one of the few people on the planet yet to catch the jingle of the Dumb Ways to Die campaign, it’s a public safety initiative recently launched by Metro Trains in Melbourne that is being hailed as an “internet sensation”.

According to Wikipedia, the campaign video was uploaded on 14 November and made public two days later. It was viewed 2.7 million times within 48 hours, and 4.7 million times within 72 hours. Within two weeks it had been viewed 28 million times and spawned 85 parodies. The campaign song was in the top 10 on the iTunes chart within 24 hours of its release.

But will all this online action translate into real-world behavioural change?

In a recent article for mUmBRELLA, Hugh Stephens, the director of a Melbourne-based social media consultancy, Dialogue Consulting, suggested that it is unwise to use “vanity metrics” as a marker for behaviour change.

But what do the wider public health crowd think of the campaign, and how do they think its impact should be measured?

[youtube]https://www.youtube.com/watch?v=IJNR2EpS0jw[/youtube]

***

Do marketers really understand public health?  

Hugh Stephens writes:

I recently posted a quite controversial article on mUmBRELLA titled “Stop using vanity metrics to measure behaviour change”. It has been fascinating to see how people have responded – both positively and negatively.

I have a fair bit of experience and knowledge in public health (but wouldn’t call myself an expert), and, as a result of years of education, have the words ‘evidence-based practice’ tattooed behind my eyelids.

And the work that marketers do (yes, myself included) often neglects the evaluation part of a campaign, which can be more expensive than the campaign itself – two-year longitudinal follow-up studies don’t come cheap.

If your major objective is behavioural change, measuring and reporting only on how many Facebook Likes or competition entries you get is ineffective, and misses the point. “Reach”, “impressions” and “views” are not equivalent to a change in behaviour, or an intention to change behaviour, and should never be treated as such.

You don’t see the Cancer Council lauding a success until the research has come out, often some years later, usually through publishing a cross-sectional or longitudinal study in a peer-reviewed journal. And many organisations (including some marketing agencies – after all, social marketing is a profession) do great, evidence-based work: just look at the ReachOut, TAC or RTA campaigns, sexual health campaigns and more.

In public health, a great deal of research is conducted. There are whole institutions set up to run, measure and evaluate public health campaigns, from the small and targeted through to mass-media national initiatives. Awards are given on the basis of long-term follow-ups and evaluations.

This is quite different from advertising and marketing, where awards are given for catchy, unique campaigns well and truly before they can be evaluated. (There was a very interesting comment on the article (#9) suggesting that the point of the campaign was simply to win awards, because “they don’t give awards for boring but effectively targeted ads which had an impact that can only be reliably measured several years later”.)

Dumb Ways to Die is a great, catchy, cute campaign. It’s ‘gone viral’, with millions of views and shares. Which is awesome. And it has certainly been highly successful in shifting the sentiment of conversation around Metro – indeed, some have commented that this may have been the primary objective all along. If that’s the case, hats off – well done, and hopefully the effect will be lasting.

But as someone with a strong research background, I’m unsure how campaigns like this (I’m not singling it out, but it is quite a topical example…) actually change consumer behaviours, or how they can be adequately evaluated to show that they do.

Sure, there is a lot of theory and rhetoric around the idea of ‘nudging’ people rather than telling them to do or not do something… but I feel that this is best applied to marketing and advertising, not public health. It’s more for when you’re in the supermarket and need to choose between two brands than when you’re running late for work and need to run across the crossing to catch your train (IMHO).

There is distinct overlap between marketing and public health – and both sides have much to learn from each other.

But in health, we always focus on cost versus benefit. How can we use the money available to change the maximum number of behaviours (note: not ‘tell or reach the most people’)? And how do you adequately measure that? Pre- and post-campaign studies are one way, but as the size of your target population grows, so do your budget and timeframes.

So I’m waiting with bated breath for the release of this year’s statistics on rail-related deaths, to see whether they are any different. Given that the advertising industry rarely publishes evaluations in peer-reviewed journals, that is probably the best opportunity I’ll get to eat my own words – or not.

In the meantime, I’ve stirred the advertising and marketing pot, and a lot of people have had both good and bad things to say about my opinions – which was exactly my objective. Hopefully this conversation will encourage marketers to be more transparent with their evaluations, and public health professionals to get more involved in marketing campaigns.

Weigh in with your thoughts in the comments below or on Twitter @hughstephens.

• Hugh Stephens is the Director of Dialogue Consulting, a social media consultancy based in Melbourne.