A downloadable PDF of our ARF presentation is available here.

Transcription

Presented at the ARF’s Audience Measurement Conference - June 14, 2016
Source: Nielsen Catalina Solutions, Copyright © 2016 Nielsen Catalina Solutions

Slide 1
Yes – advertising works! What we are going to address is: how different is it across media platforms?

Slide 2
We are awash with questions about advertising response and differences between
media.

Slide 3
So we pulled together an august team of industry experts to help us find answers in our norms database:
Jim Spaeth and Alice Sylvester from Sequent Partners; David Poltrack from CBS; Britta Cleveland from Meredith; and Tony Marlow from Yahoo!. Jasper Snyder from the ARF was also an advisor.
Once we had our team, we dug into the data – and boy, did we dig! I know this team was surprised when we launched this effort several months ago with literally hundreds and hundreds of graphs. We wanted, in fact needed, to understand what was required to make comparisons across media possible!

Slide 4
We started with 2,500 studies and culled them down to 1,400 that had complete, CPG-only data where we were measuring the incremental sales for the same brand that was advertised. The media costs also had to be actual media costs. This is very important since we are reporting ROAS, which reflects the cost of the media.
There are 11 years of data across 450 CPG brands.
All numbers reported are for groups of campaigns with at least 10 studies. Groups with between 10 and 20 studies are shown in a faded color.
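To make those selection rules concrete, here is a minimal sketch in Python; the table layout, column names, and example rows are hypothetical illustrations of the criteria described above, not NCS's actual pipeline.

```python
import pandas as pd

# Hypothetical study-level metadata illustrating the culling rules above.
studies = pd.DataFrame({
    "study_id":      [1, 2, 3, 4],
    "vertical":      ["CPG", "CPG", "Auto", "CPG"],
    "complete_data": [True, True, True, False],
    "same_brand":    [True, True, True, True],    # incremental sales measured for the advertised brand
    "actual_costs":  [True, False, True, True],   # actual, not estimated, media costs
    "media":         ["Linear TV", "Linear TV", "Display", "Magazines"],
})

eligible = studies[
    (studies.vertical == "CPG")
    & studies.complete_data
    & studies.same_brand
    & studies.actual_costs
]

# Report only media groups with at least 10 studies; show 10-20-study groups faded.
counts = eligible.groupby("media").size()
reportable = counts[counts >= 10]
faded = reportable[reportable <= 20]
print(reportable)
print(faded)
```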

Slide 5
Linear TV since 2009 – broadcast TV: national TV networks & cable networks
Magazines since 2012 – major publishing companies; only very large schedules
Online Display since 2004 / Online Video since 2008 – major publishers & portals; typically premium inventory with virtually no programmatic campaigns
Mobile since 2013 – in-app measurement; very small campaigns
Cross Media since 2013 – studies with multiple media

Slide 6
Objective: to make sure that the values we report actually reflect the differences between the media.

Slide 7
This has been a challenge!
The mix of media differs across the years, and a large part of those years was recessionary. (I will share what we see across years – it turns out the year isn't a discriminator and more often reflects the mix of studies included.)
Most importantly, how we classify media is changing. In this presentation we have tried to keep the categories clean, but they aren't clean and the boundaries are getting more and more blurred. (Magazine campaigns commonly include digital content – we put those into cross-media – and TV is often delivered across screens. We have worked hard to keep these clean, but the world is changing.)
Many of the drivers of the differences between media couldn't be controlled for – creative, for example.

Slide 9
Measuring lift starts with data. NCS connects exposure data – both big data and currency-quality "small" data – to CPG purchase data at the household level to create true single-source data. That means we have ongoing transactional data to understand which commercials are seen and how that changes what consumers buy across most major media.
We use big data and small, currency-quality data together to make smart data, using the small data to correct for the biases and holes in the big, convenient data. This is true for purchase as well as exposure data.
Please get the presentation to learn more about our data.
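As a rough, purely illustrative sketch of what single-source matching means mechanically – joining exposure records and purchase records for the same household – consider the following; the column names and values are hypothetical, and NCS's actual privacy-safe matching is far more involved.

```python
import pandas as pd

# Hypothetical household-level records; real matching is privacy-safe and far richer.
exposures = pd.DataFrame({
    "household_id": [101, 102, 103],
    "campaign_exposures": [6, 0, 14],      # ad exposures seen by this household
})
purchases = pd.DataFrame({
    "household_id": [101, 102, 103],
    "brand_dollars": [12.49, 8.99, 21.50], # brand purchases by the same household
})

# Single-source view: the same household's exposures and purchases side by side.
single_source = exposures.merge(purchases, on="household_id", how="inner")
print(single_source)
```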

Slide 12
ROAS (return on ad spend) is the incremental sales dollars generated for each dollar spent on advertising. (This is not ROI, since it does not include the brand's margins.)
Incremental sales per exposed HH removes the variable of cost and focuses on reached households.
Incremental sales per thousand impressions removes both the media cost and the reach and looks only at the value of each impression.
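To make the three measures concrete, here is a minimal worked sketch; the campaign totals (spend, incremental sales, reached households, impressions) are hypothetical values chosen only to show the arithmetic, not NCS results.

```python
# Hypothetical campaign totals, used only to show how each measure is computed.
spend = 1_000_000.0             # actual media cost, in dollars
incremental_sales = 950_000.0   # incremental dollar sales attributed to the campaign
reached_households = 2_500_000  # households exposed at least once
impressions = 40_000_000        # total impressions delivered

roas = incremental_sales / spend                       # incremental $ per ad $
sales_per_exposed_hh = incremental_sales / reached_households
sales_per_thousand_imps = incremental_sales / (impressions / 1000)

print(f"ROAS: ${roas:.2f} per ad dollar")
print(f"Incremental sales per exposed HH: ${sales_per_exposed_hh:.3f}")
print(f"Incremental sales per 1,000 impressions: ${sales_per_thousand_imps:.2f}")
```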

Slide 13
These three slides share the key findings for this study.
Notice how close most of the media are to each other. The two standouts are:
Digital Video – what we are seeing here is the impact of supply and demand. Digital
video is hot and there is very little inventory. That drives the price up and the ROAS
down.
The opposite is true for magazines. Magazine reading is engaging and provides context, but magazines are not currently seen as being as sexy as other media. The supply is also flexible, so supply and demand are not the driving forces. The total audience may also not be given as much credit as it deserves.
You might wonder how digital does as well as the other media. Remember, this is
predominantly premium inventory with virtually no programmatic included.
These are return on ad spend, so the budget matters. The average spend for linear TV was almost $10 million; magazines averaged almost $2 million; cross-media, just over $1 million; digital video, just over $500K; display, almost $500K; mobile, $300K. These budgets are at very different scales.

Slide 14
When we remove the cost of the media and look only at the incremental sales per reached HH, the pattern changes. Linear TV delivers the highest return. This is interesting because the average reach for these linear TV campaigns is 57%, while most of the other media deliver about one-tenth as much – between 3% and 8% reach. Magazines, being all large campaigns, have an average reach of 25%.
This isn't surprising – TV has sight, sound, and motion and delivers the highest return. It is expensive because reach is always expensive.
The display results are for our partners and don't include any of the larger "walled garden" campaigns.

Slide 15
When we remove reach and look at the return for each impression, mobile does very well. The mobile campaigns are small, sharing the lowest reach with digital video and having the lowest frequency: an average of 12 vs. between 15 and 18 for the other media.
This calls out the importance of reach. Average frequency: Mobile, the lowest at 12; Digital Video, 18; Display, 15; TV, 16; Cross Media, 16.
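The per-household and per-impression views are linked by frequency: impressions per reached household equal the average frequency, so for the same lift per household a lower-frequency campaign looks better per thousand impressions. A minimal sketch, using the average frequencies quoted above and a single hypothetical lift per reached household (the $0.30 figure is illustrative, not a measured value):

```python
# Average frequencies quoted on this slide; the lift per reached household is a
# hypothetical constant used only to isolate the effect of frequency.
avg_frequency = {"Mobile": 12, "Display": 15, "Linear TV": 16,
                 "Cross Media": 16, "Digital Video": 18}
lift_per_reached_hh = 0.30  # hypothetical incremental $ per reached household

for medium, freq in avg_frequency.items():
    # Impressions per reached household equal the average frequency, so:
    lift_per_thousand_imps = lift_per_reached_hh / freq * 1000
    print(f"{medium:13s} frequency {freq:2d} -> "
          f"${lift_per_thousand_imps:.2f} per 1,000 impressions")
```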

Slide 16
Each measure paints a different picture across media. We will return to these at the
end for discussion.

Slide 17
NCS has traditionally reported a brand’s ROAS as compared to brands in the same
category. Brands with shorter purchase cycles and higher weekly sales consistently
show higher return.
Baby and pet products are purchased more often and cost quite a bit, so it is easier to drive a higher return for them than for brands that are smaller and have longer purchase cycles.

Slide 18
This isn’t as clear across media.

Slide 19
And certainly not when we look at incremental sales per reached HH.

Slide 20
Or impressions.

Slide 21
Based on the work we have shared around the NCS-Neuro project, it is clear that incremental sales are influenced by the size, penetration, and purchase cycle of the brand. Here we clustered brands into three groups.
The Marquee brands are much larger brands. Mid-Size speaks for itself, and Infrequent Use brands are those with long purchase cycles and much more consistent levels of purchasing.
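The source does not describe how the clusters were formed, so as a purely illustrative sketch, here is one way a three-group segmentation on brand size, penetration, and purchase cycle could be run; k-means, the feature set, and the example values are all assumptions, not NCS's actual method.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical brand-level features:
# annual sales ($MM), household penetration (%), purchase cycle (days)
brand_features = np.array([
    [850, 62,  21],   # large, frequently purchased brand
    [310, 45,  30],
    [120, 28,  45],   # mid-size brand
    [ 90, 22,  60],
    [ 40,  9, 180],   # small brand with a long purchase cycle
    [ 25,  6, 240],
])

X = StandardScaler().fit_transform(brand_features)   # put features on one scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(labels)  # three groups, analogous to Marquee / Mid-Size / Infrequent Use
```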

Slide 22
The clusters have very different performance across all measures. Marquee is substantially higher than Mid-Size, which is substantially higher than Infrequent Use.
This is one finding that will improve NCS's work. We now know that brands should be compared to their cluster rather than, or in addition to, their category.

Slide 23
Unlike category, we see this same pattern when we look inside brands. (Faded cells reflect smaller cell sizes.)

Slide 24
And this continues for reach.

Slide 25
And impressions.

Slide 26
As promised, we looked across time. There is no consistent pattern; the results are much more reflective of the mix of campaigns measured than of the marketplace.
We dug deeply into this variable and didn’t see a consistent pattern either across
years or across years and media.

Slide 29
Slide up during the panel discussion: OK – apologies for the speed – but now we get to the interesting part with our panel.
Jasper, would you handle questions?
Key question raised:
Why are cross-media plans not higher?
NCS consistently finds that there is great value in synergy: households exposed to both media have higher incremental sales. However, the overlap between media – the households exposed to both – is usually small. Therefore, the impact of that synergy tends to be small when we look at the total campaign.
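A back-of-the-envelope sketch of that arithmetic: even when dually exposed households show much higher lift, a small overlap group moves the campaign-level average only slightly. All shares and lift values below are hypothetical, chosen only to illustrate the point.

```python
# Hypothetical cross-media campaign: shares of exposed households and their
# incremental lift per household. Values are illustrative only.
share_tv_only  = 0.55   # exposed to TV only
share_dig_only = 0.40   # exposed to digital only
share_both     = 0.05   # exposed to both (the overlap)

lift_tv_only   = 0.30   # hypothetical incremental $ per household
lift_dig_only  = 0.25
lift_both      = 0.60   # synergy: much higher lift for dually exposed households

avg_lift = (share_tv_only * lift_tv_only
            + share_dig_only * lift_dig_only
            + share_both * lift_both)

# Same plan with no synergy: overlap households behave like the better single medium.
avg_lift_no_synergy = (share_tv_only * lift_tv_only
                       + share_dig_only * lift_dig_only
                       + share_both * lift_tv_only)

print(f"Average lift with synergy:    ${avg_lift:.3f} per exposed household")
print(f"Average lift without synergy: ${avg_lift_no_synergy:.3f} per exposed household")
# Doubling the lift for a 5% overlap adds only about 5% to the campaign average.
```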

Slide 32
One way to explore the norms across media is to look at individual brands and compare the results.

Slide 33
This is a single brand during the same 8 weeks, across three portals and nine different tactics. While none achieves an ROAS above $1.20, you can see how different the return can be.

Slide 34
On the other hand, this brand has consistently high ROAS – above $2.00 and as high
as $3.40 across years and media. They are doing something right!