‘Fake Surveys Suck!’

“Fake surveys suck!” This is a piece of comment mail I’ll never forget. Scrawled boldly across the survey in thick, black marker, the donor’s opinion of the organization’s effort to engage him couldn’t have been more succinct.
Or more true. So many variables determine a survey’s success … or its suckiness.
How many questions are ideal? Is asking a donor to rank items in priority order a good idea? What about “True/False” questions? And if the issue requires some explanation, should each question have an educational preamble to help donors decide their answers? Or is that more confusing, and could it make them feel stupid?
A review of a dozen survey packages I received in the past few months reveals mixed practices.
The number of questions ranged from six (the National Republican Senatorial Committee) to 32 (the Democratic Congressional Campaign Committee). Many surveyors grouped questions under numbered subheads to give the illusion of a shorter or simpler survey.
Three surveys with lists of issues or multiple-choice options directed, “Please fill in all that apply” or “You may check more than one.” Excellent, because that way there are no “wrong” answers, and it prevents donors from setting the package aside until they have time to think it over — the very last thing we want them to do!
Many surveys asked for “optional” or “voluntary” personal information such as age, profession and voter status, but Judicial Watch probed even deeper with an inquiry into my estate-planning intentions and my interest in receiving information on planned giving. A little odd mixed with a series of questions about corruption in Congress, illegal immigration and the likelihood of rampant White House wrongdoings if Hillary Clinton is elected president in 2008 — but if it doesn’t hurt response and will surface planned gifts for the organization, it’s a bonus.
Two workhorses
Because I’ve seen them repeatedly through the years, I believe two of the survey packages in this mix are long-standing controls, or at least reliable second-string contenders that have been updated and refreshed over time.
The first is the American Civil Liberties Union’s National Survey on Freedom, Justice and Equality. It’s vintage ACLU, and the outer envelope art hasn’t changed much in more than 10 years. I’m sure the letter copy has been rewritten time and again as the specific threats to various liberties have developed, but the core offer remains: a chance to “help shape a genuine debate about bedrock American values.”
ACLU’s survey has 10 questions. Each asks for an affirmation (or denial) of a preceding statement that begins, “I believe …” No question requires an educational preamble, nor are any of the questions “dumbed down” — on the whole, this survey appears carefully crafted precisely for its target audience. It’s still in the mail after all these years, anyway.
The Democratic Congressional Campaign Committee’s 2007 New Directions Survey is much longer and more varied in question format than ACLU’s. I think the most difficult question for respondents is the first one, which requests a ranking of 12 legislative priorities and the directive, “1 = most important.”
Asking donors to decide among 12 priorities seems arduous to me — picking what’s first and last might be easy, but respondents could get hung up in the middle with choosing which ones are five, six and seven. And anything that stalls out the compulsion to write a check is not good.
However, after the ranking request, the DCCC survey questions are easier to answer and mostly meaningful, on point with the copy platform and the Democrats’ intent to set the country in a new direction with donors’ input and financial support.
There are only a few throwaway questions, such as “Which political party do you trust most to …” with “Democrats” or “Republicans” as answers. Those could probably be omitted. Indeed, I wonder if the DCCC has tested fewer questions or if it’s proven 32 is the magic number for it — but here again, because this package or variations of it have been in the mail for many years, something about this survey clearly is working.
The sealed envelope
The shortest survey offer, from the National Republican Senatorial Committee, calls the device a ballot rather than a survey, but the donor-involvement intent is the same. The kraft outer has an oversized peek-a-boo window showing a canary envelope inside, and it’s the addressing on the sealed canary envelope that flies the package.
Trent Lott’s letter explains, “If for any reason you cannot participate, please complete the information on the outside of the sealed Strategy Ballot carrier and return it unopened so we may select someone else in your community to help determine Republican campaign strategies.”
Brilliant!
Not only can I not stand the thought of one of my loony neighbors voting in my place, but I’m also dying to find out what’s on the ballot, since Lott has made such a big deal about the sealed envelope and about returning it unopened if I’m not going to complete it. Pretty darned irresistible — and reinforced on every component of the package.
My only complaint: The actual NRSC Republican Strategy Ballot is a bit of a disappointment. It’s an 8.25-inch-by-6.75-inch, two-panel form jam-packed with copy on both sides. A larger size would have given the ballot a more appropriate level of importance, especially given the letter copy and the teasers on both of the envelopes.
So after a survey of surveys, what can I advise?
Make your rationale for conducting the survey clear and legitimate, make the questions germane to your organization’s area(s) of work, and avoid topics that require a lot of explanation. Don’t ask your survey to do a letter’s work in making the case. Test the optimum number of questions and the format. And most definitely, always remove any potential impediment to quickly completing the survey and securing the gift — because not all surveys are fake, and they surely don’t all have to suck.
Kimberly Seville is a creative strategist and freelance copywriter. She welcomes feedback about your experience with surveys. Reach her at kimberlyseville@yahoo.com.