By Bill Goodwill
One of the most frequently asked questions we get when evaluating PSA campaigns is: "How are we doing?" The simple, but not very helpful, way to answer that question is to respond: "Compared to what?" While that is a simplistic response, it underscores the need to compare any non-profit PSA campaign against a standard.
Most organizations distributing PSAs want to know how well their campaign has performed based upon objective criteria. Essentially, there are two ways of providing this information.
First, an individual PSA campaign can be compared to previous ones launched by the organization. Or a campaign can be compared against others with similar characteristics, such as comparable quantities distributed. Obviously, no two campaigns are alike in terms of subject matter, timing, and creative quality. However, by examining usage data taken from a wide variety of campaign issues, some interesting patterns emerge.
Our newest PSA evaluation procedure, aptly titled the "Benchmark Report," compares any given client campaign to 64 national TV broadcast PSA campaigns we have distributed since 2010. We begin with that year because it is when most TV stations migrated to high-definition broadcasting and, as part of that process, each gained up to six sub-channels, many of which are used to air PSAs. This, in turn, has resulted in a very significant increase in our client campaigns from 2010 onward.
Using the benchmark data, our software computes an average of these campaigns, against which the client
campaign is compared. To keep the data uniform for comparison, we select similar tracking periods,
and we use only data resulting from the Nielsen SIGMA tracking service, which is the most accurate available.
The real value of this exercise is not just seeing how well any given campaign is performing against the standard; it is doing the analysis early, in the months just after the campaign was distributed. The reason should be obvious, but just in case it is not: if you wait until the campaign has run its course, it is too late to effect change.
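The benchmark comparison described above can be sketched in a few lines of code. This is purely illustrative: the airing totals and function names are hypothetical, and the real Benchmark Report draws on Nielsen SIGMA tracking data for 64 national TV campaigns.

```python
# Illustrative sketch of a benchmark comparison. All figures are hypothetical;
# the actual report uses Nielsen SIGMA tracking data across 64 TV campaigns.

def benchmark_average(campaign_totals):
    """Average usage total across the benchmark set of campaigns."""
    return sum(campaign_totals) / len(campaign_totals)

def compare_to_benchmark(client_total, campaign_totals):
    """Express a client campaign's usage as a ratio of the benchmark average."""
    return client_total / benchmark_average(campaign_totals)

# Hypothetical monthly airing totals for four benchmark campaigns.
benchmark = [1200, 950, 1400, 1100]
ratio = compare_to_benchmark(1320, benchmark)
print(f"Client campaign is at {ratio:.0%} of the benchmark average")
```

A ratio above 100% means the campaign is outperforming the benchmark set for that tracking period; a ratio well below it is the early warning signal the text describes.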
The graph above shows how benchmark data is tracked monthly for broadcast TV only, but we also have benchmark data for 21 radio releases, 11 print campaigns, and four outdoor campaigns. The graph depicts the dollar value from all sources, which is $5.7 million, using Nielsen tracking data, response cards, and PSA clippings as the data sources.
Examining your organization's PSA data using benchmarks
is not an idle exercise.
Knowing what to expect from a campaign in terms of response rate, market
penetration, and dollar value of exposure is the first
step towards taking corrective
action, if necessary.
When response rates or usage levels are less than average, an organization has objective data on which to base follow-up activity.
It is also important to monitor your progress as your campaign matures. If, for example, you wait until the campaign
evaluation is completed to do your analysis, it is too late to take any corrective action. You need to monitor and
correct weaknesses as they develop to truly use evaluation data in a meaningful way.
Also, you will get far different results, particularly from TV, the longer you monitor your campaign. As shown in this graph, a typical TV PSA will generate about $3 million in ad equivalency value over a 26-week period, but that total nearly doubles over 52 weeks. This fact has led some organizations to continue tracking their PSAs for a year and a half, and even up to two years in some cases.
Follow-up techniques vary in terms of cost and how easily they can be executed. In some cases they may consist of personal contact by community partners or telephone surveys.
While we have good data to tell us how
well a campaign is performing in broadcast television, for radio we must rely on bounce-back cards for evaluation data.
Data resulting from dozens of our client radio PSA campaigns
indicates that only about 17% of stations that receive a
PSA report usage. Unless some follow-up technique is used, any other exposure that occurs goes unrecorded.
One of our successful techniques is to send a
radio reminder postcard to non-responding stations. It is a two-part
postcard with a note to the public service director on the top with the same evaluation reply card that was originally
sent to the station on the bottom. The station simply completes the brief questionnaire and returns it for keypunching.
Its purpose is to obtain a more comprehensive accounting
of true PSA usage.
As the graph indicates, these postcards can have a dramatic impact upon reported usage rates and estimated PSA exposure values.
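The postcard's effect can be pictured as a simple before-and-after response-rate calculation. The roughly 17% baseline comes from the text above; the station counts and the number of postcard replies are hypothetical.

```python
# Before/after sketch of reminder-postcard impact on reported radio usage.
# Baseline ~17% response rate is from the article; other figures are hypothetical.

def response_rate(stations_reporting, stations_receiving):
    """Fraction of stations receiving a PSA that report usage."""
    return stations_reporting / stations_receiving

stations = 1000          # hypothetical stations that received the radio PSA
initial_replies = 170    # ~17% report usage without any follow-up
postcard_replies = 120   # hypothetical additional replies after the reminder card

before = response_rate(initial_replies, stations)
after = response_rate(initial_replies + postcard_replies, stations)
print(f"Reported usage before postcard: {before:.0%}, after: {after:.0%}")
```

The point of the exercise is that every reply recovered by the postcard converts previously unrecorded airplay into documented exposure value.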
PSA program planners need to know what to expect from a PSA campaign before even launching it, and what types of corrective actions they can use to overcome weaknesses. And remember a very important point: always thank the media that provide you with hundreds of thousands, and even millions, of dollars' worth of advertising support.
Following are several different indicators of PSA success that should be
monitored for each campaign by media type.
Number of cities and states in which PSAs are used
Number of media outlets using PSAs
Total number of PSAs used and response percentage for the campaign
Total estimated dollar value of exposure
Usage by spot length
Usage by daypart (TV SIGMA data)
Usage by market size
Where you are NOT getting PSA usage, geographically or in specific market areas
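Most of the indicators listed above can be tallied directly from individual usage records. The sketch below assumes a simple record layout; the field names and figures are illustrative, and real data would come from Nielsen SIGMA tracking and bounce-back cards.

```python
# Tallying per-campaign indicators from individual usage records.
# Record layout and all figures are hypothetical examples.

from collections import Counter

# Each record: (city, state, outlet, spot_length_seconds, dollar_value)
usage_records = [
    ("Chicago", "IL", "WAAA-TV", 30, 450.0),
    ("Denver",  "CO", "KBBB-TV", 15, 210.0),
    ("Chicago", "IL", "WCCC-TV", 30, 380.0),
]

cities = {(city, state) for city, state, *_ in usage_records}
states = {state for _, state, *_ in usage_records}
outlets = {rec[2] for rec in usage_records}
total_uses = len(usage_records)
total_value = sum(rec[4] for rec in usage_records)
by_spot_length = Counter(rec[3] for rec in usage_records)

print(f"Cities: {len(cities)}, States: {len(states)}, Outlets: {len(outlets)}")
print(f"Total PSAs used: {total_uses}, Estimated exposure value: ${total_value:,.0f}")
print(f"Usage by spot length (seconds): {dict(by_spot_length)}")
```

Geographic gaps, the last indicator in the list, fall out of the same data: any market that never appears in the records is a candidate for follow-up.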
Other Near Term Measurement Tools
As the campaign matures, we have two other tools we review monthly to give us a picture of how well the campaign is performing: an interactive map of the U.S. showing four different levels of PSA usage, and a network report. More details on each of these reports can be accessed at:
Merchandising Your Data
You paid a lot of money to track your PSAs, so why not merchandise your reports to all of those
who have anything to do with developing your brand image, including:
Your chapter networks
Your ad/PR agency staff
In our non-profit PSA workshops we always include a PowerPoint slide that says the following:
“PSA evaluation for the sake of data collection is a meaningless exercise.”
It is what you do with the data to effect a more positive outcome that is important.
Bill Goodwill has supervised the distribution and evaluation of over 700 national PSA
campaigns on behalf of 167 non-profit organizations and federal agencies.