As third-party cookies come to an end, brands are focusing on collecting more zero-party data directly from their customers and prospects so they can better understand their needs and wants.

Although larger surveys will play a role in that process, most brands will gain insights a few answers at a time through signup forms, profile pages, polls, and other forms of progressive profiling.

It sounds simple, but writing a good question and collecting reliable answers is harder than it seems. Things can go wrong at six points:

  1. Before you write a question
  2. When writing the question
  3. When writing answer choices
  4. Around the timing
  5. When analyzing the responses
  6. When repeating data collection

Let's talk about best practices and things to look out for during each of those six steps.

1. Before You Write a Question

It's possible to go wrong before you even begin your progressive profiling efforts. Consider the following two things.

Always start by understanding exactly what you want to learn from your audience

What's your objective? Why do you want a certain piece of information from them? How are you going to use or operationalize that data point? Does that data give you the insights you want?

That last question gets at a disconnect that many brands struggle with. The goal isn't to collect data. Not really. It's to gain insights you can use to drive the desired outcome. Yes, you need data to get insights, but the two aren't the same thing.

My favorite example of data vs. insight is from B2C marketers who ask customers about their gender. Most of the time that data is used to personalize message content about products for men or women. However, a person's gender doesn't tell you what kind of products they want to buy, because they could be buying primarily for someone else or they could be interested in products for the opposite gender or both genders.

So, the better question is the more direct one: Are you interested in products for men, women, or both?

Don't ask questions you won't act on

Simply asking a question sets an expectation that you'll use that information to make the customer experience better in some way—even if you're just sharing the results with the community.

If you don't do anything with it, however, that can lead to disappointment. And it can lead to lower response rates for future progressive profiling efforts.

Also, gone are the days when you'd collect information because you might need it in the future... at some point... maybe. You don't want the liability of retaining data you're not using, so don't collect it in the first place.

2. When Writing the Question

Once you're clear on your objective, then it's time to craft your questions. Here are some things to keep in mind.

Craft questions that are universally understood

To the degree that it's appropriate, avoid jargon or technical language. If it's needed, consider providing quick definitions in parentheticals.

If your audience is international, think about non-native speakers, who might struggle to understand some long words, colloquialisms, and cultural references.

And finally, use unambiguous time windows, such as saying the past 12 months instead of the past year, which some might interpret as the previous calendar year.

Provide any needed context before the question

The primary concern here is that some people, once they've read a question, will skip to the answer choices because they're in a hurry. (And everyone's in a hurry.)

Another reason is that in the absence of immediate context, people bring their own context to questions, which forces your post-question context to work harder to override the respondent's initial thinking.

Avoid context and introductory statements that might impose a bias on answers

For example, you shouldn't ask, Given the current state of the economy, do you think now is a good time to change supply chain management software providers? You'll get more accurate answers without that introductory clause.

Ask judgment-free questions

Marketers are great at asking leading questions in marketing copy, but you don't want to do that in polls and surveys if you want meaningful results.

Sometimes that means you need an introductory statement or clause that gives the respondent cover to answer truthfully about something that might otherwise make them look or feel bad.

For example, you might preface a question with a clause like Recognizing that you don't have full control over your program... to make it easier for respondents to answer truthfully.

Recognize that people are bad at remembering past behavior

People provide the most reliable answers about now and the recent past. When you're asking about the actions of their organization, things can get even hazier, because the respondent may be relatively new to their company. Consider asking about actions or behaviors from the previous 12 months, at most.

Avoid redundant questions

The more questions you ask, the lower your completion rate will be. So ask only as many questions as necessary. For example, I saw a recent B2B lead-gen form that asked for both the person's country and world region. If you get the person's country, you can figure out the region of the world, so that question was completely unnecessary.
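If you're wondering how to operationalize that, deriving region from country is a one-line lookup at processing time. Here's a minimal sketch in Python; the mapping is hypothetical and abbreviated, and a real implementation would use a complete grouping, such as the UN M49 regions or your CRM's own scheme.

    # Derive world region from the country answer instead of asking for it.
    # This abbreviated mapping is a hypothetical example; use a complete
    # lookup in practice.
    COUNTRY_TO_REGION = {
        "United States": "North America",
        "Canada": "North America",
        "Germany": "EMEA",
        "United Kingdom": "EMEA",
        "Japan": "APAC",
        "Brazil": "LATAM",
    }

    def region_for(country: str) -> str:
        # Fall back to "Unknown" for countries not yet in the lookup.
        return COUNTRY_TO_REGION.get(country, "Unknown")

    print(region_for("Germany"))  # EMEA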

3. When Writing the Answer Choices

Most likely, the vast majority of the polling or surveying you'll be doing will involve answer choices rather than open-ended questions. So consider the following when crafting those answer choices.

Make answering easy

While this has a lot to do with the questions you ask, the answer choices you provide also have a major impact on how easy a question is to answer.

For example, here's a recurring question I've asked marketers:

What percentage of your company's email marketing revenue is generated by automated and transactional emails?

  • Less than 20%
  • 20% to 50%
  • More than 50%
  • Not sure

Ranges make answering the question much easier, because the chances of knowing the exact percentage are low—and you absolutely don't want people to go hunting for information, because they probably won't come back.

Five- and three-point rating scales (e.g., Always, Sometimes, Rarely) generally produce the best results while keeping things easy.

Be careful when using subjective measures

Sometimes, beauty is indeed in the eye of the beholder. Other times, it's not. For instance, some people think that an email deliverability rate of 50% is good, but it's actually horrible.

So, if you asked brands about their email deliverability, you'd likely get very different distributions asking whether their inbox placement was Excellent, Good, or Poor versus Over 95%, 90%-95%, or Below 90%.

Provide an N/A option

Even if answering a question is optional, give people the option to not answer the question by selecting N/A, Not sure, or Don't know—or a combination of those, such as Not sure or don't know. Otherwise, they'll guess or put down the answer they think you want to hear, degrading the accuracy of your responses.

4. The Timing

The best time to ask your audience questions depends on several factors, but the most consequential is whether the responses are useful long-term or short-term.

Answers That Are Useful Long-Term—for Many Months to Years

These include demographic information, such as a prospect's company name or industry, and technographic information, such as details about their tech stack.

That type of information doesn't change often, so answers are useful over a long period of time. It also means you can collect it throughout the year, over the course of multiple campaigns.

Answers That Are Useful Only in the Short-Term—for a Few Weeks or Months

These include, for example, whether prospects are attending the MarketingProfs B2B Forum this fall and whether they're interested in attending an email marketing meetup. This kind of information is incredibly valuable.

However, such questions need to be asked close enough to the event that respondents know whether they're attending, but not so close that you don't have enough time to act on their responses.

5. When Analyzing the Responses

Get it all right up to this point... and you can still stumble when it comes time to interpret the results. Consider the following four issues.

A. Whether to Report N/A, Not sure, and Don't know Responses

Generally, such answers aren't meaningful, so it's best to remove this noise from your reported results.

However, it can be telling if, say, most or a plurality of respondents select those types of answers. That can signal that the technology, tactic, product, or whatever else you're asking about has low awareness, which can be interesting in and of itself.

It could also signal that your question is confusing.
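Here's a minimal Python sketch of that reporting approach, with made-up responses; the 40% flag threshold is an assumption you'd tune to your own data.

    from collections import Counter

    NON_ANSWERS = {"N/A", "Not sure", "Don't know"}

    def report(responses):
        counts = Counter(responses)
        na_share = sum(counts[k] for k in NON_ANSWERS) / len(responses)
        # Assumed threshold: a high non-answer share can signal low
        # awareness of the topic or a confusing question.
        if na_share >= 0.4:
            print(f"Note: {na_share:.0%} of responses are non-answers.")
        # Report shares with the non-answer noise removed.
        substantive = [r for r in responses if r not in NON_ANSWERS]
        if not substantive:
            return {}
        return {k: v / len(substantive) for k, v in Counter(substantive).items()}

    print(report(["Yes", "No", "Not sure", "Yes", "Yes", "Don't know"]))
    # {'Yes': 0.75, 'No': 0.25}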

B. Look for Opportunities to Simplify the Story

Just as you don't have to report your N/A answers, it's OK in some cases to roll together responses to tell a cleaner story.

For example, say you asked respondents to respond to a statement using a 5-point Likert scale of (1) Strongly Disagree, (2) Disagree, (3) Neither Agree nor Disagree, (4) Agree, and (5) Strongly Agree. In some circumstances, it might make sense to combine the two disagree answers and the two agree answers when reporting results.
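As a quick illustration, here's one way to do that roll-up in Python; the responses are made up, and the bucket labels are your call.

    # Collapse the 5-point Likert scale from the example above into
    # three reported buckets.
    ROLLUP = {
        "Strongly Disagree": "Disagree",
        "Disagree": "Disagree",
        "Neither Agree nor Disagree": "Neutral",
        "Agree": "Agree",
        "Strongly Agree": "Agree",
    }

    responses = ["Strongly Agree", "Agree", "Disagree", "Neither Agree nor Disagree"]
    print([ROLLUP[r] for r in responses])
    # ['Agree', 'Agree', 'Disagree', 'Neutral']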

C. Reporting Statistically Significant Results Across Segments

Particularly with demographic questions, brands often provide a long list of answer choices because they want more granular data. For example, they might ask about company size and provide lots of choices, such as 10 or fewer employees, 11-25, 26-50, 51-100, 101-200, 201-500, 501-1,000, 1,001-2,000, 2,001-10,000, More than 10,000.

Reporting those responses is fine; it gives your audience valuable perspective on your respondents. However, sometimes, brands then try to report how each of those groups answered other questions. Depending on your brand's audience and how the survey or poll was fielded, you might have a relatively small number of respondents in some of those buckets—too small for the results to be meaningful.

In many instances, it makes sense to combine some answer choices. For example, in past surveys, I've rolled together respondents by company size into two buckets: 500 or fewer employees and More than 500 employees. That dividing line created two groups that were fairly evenly powered and provided interesting insights about the differences between what smaller companies and larger companies were doing.
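In code, that's a simple bucketing step before you cross-tabulate against other questions. A minimal sketch in Python, reusing the hypothetical answer choices from the example above:

    # Roll granular company-size choices into two roughly even buckets.
    # Assumes every response is one of the listed choices.
    SMALL = {"10 or fewer employees", "11-25", "26-50", "51-100", "101-200", "201-500"}

    def size_bucket(choice: str) -> str:
        return "500 or fewer employees" if choice in SMALL else "More than 500 employees"

    print(size_bucket("26-50"))        # 500 or fewer employees
    print(size_bucket("1,001-2,000"))  # More than 500 employees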

D. Understanding Intent

Just as people are bad at remembering things that happened more than a year ago, they tend to be bad at predicting what they and their organization will do in the year ahead.

In my experience—and depending on what you're asking about and how much effort, cost, and buy-in is required—fewer than half of respondents follow through on what they say they'll do in polls and surveys.

Needless to say, that's totally fine. Your respondents are answering to the best of their ability. That said, when you report the results of such intent-related questions, you shouldn't overstate what the answers predict.

6. When Repeating a Question, Survey, or Poll

Collecting data on the same question over and over (e.g., every year) is powerful, but there are a couple of issues to be mindful of.

Be mindful of changing previously asked questions and their answer choices

Doing so will likely render historical answers useless for comparison or trend purposes. Even changing an introductory statement or clause can change the responses so much that you can't compare them to past responses.

That said, if you've identified a serious flaw in the wording of a previous question, don't hesitate to reword it so you get more reliable answers next time.

Similarly...

Be mindful of controlling your audience

For example, if you promoted a poll question to your email and social audiences last year, but then this year you work with partners to have them also share your poll question, the poll results could be materially different because of that audience change.

That's not to say that expanding your audience is bad, but don't disregard that change when analyzing results.

As Privacy Protections Strengthen...

Companies need more ways to collect information about their customers and prospects so they can understand their audience better and create more relevant experiences.

Asking your audience questions through forms, surveys, polls, and other progressive profiling mechanisms is a highly valuable way of staying close to them and serving them better.

More Resources on Survey Questions and Questionnaires

The Basics of Questionnaire Design

The Secret to Great Marketing Research: Ask the Right Questions

Two Survey Questionnaire Flaws to Avoid: Scary Matrix Questions and Unbalanced Scales

How to Improve Your Research ROI: Six Best-Practices




ABOUT THE AUTHOR


Chad S. White is the head of research for Oracle Marketing Consulting and author of four editions of Email Marketing Rules, as well as nearly 4,000 posts and articles about digital and email marketing.

LinkedIn: Chad S. White

Mastodon: @chadswhite

Twitter/X: @chadswhite