Survey Experience

Rethink Card Sorting
February 5, 2025

Don’t Just Design Surveys for the Data, Design for the People Giving It (That’s How You Get Better Data)

A poor survey experience, from missing answer choices to clunky logic, can feel alienating or frustrating to respondents, leading to higher drop-off and lower-quality data. A well-designed, inclusive experience builds trust and encourages people to stick with it.

Here are three questions we should be asking ourselves:

  • How do we ensure our answer choices feel inclusive and representative of all respondents?
  • What signals tell us when a question or survey flow might cause frustration or dropout?
  • How can we audit surveys from the respondent’s point of view to improve experience quality?

A Personal Wake-Up Call

As a market researcher, I’ve designed, fielded, and analyzed my fair share of surveys. We have a thorough testing process and do our best to ensure logical soundness. But it wasn’t until I recently took a casual survey for a friend (yes, I know I should’ve been screened out!) that I was reminded just how much the experience of a survey matters, and how directly it impacts data quality and completion rates.

As researchers, we're trained to focus on getting answers to our research questions. But in doing so, we sometimes overlook the process our respondents go through to give us those answers. We obsess over logic, structure, and analytical outputs, and unintentionally create experiences that feel rigid, confusing, or alienating.

There were two moments in that survey that stuck with me:

  1. When I couldn’t find the answer I wanted to choose.
    The list of options was long, but not in a helpful way: it reflected what the survey designer wanted to hear, not what I actually wanted to say. In the end, I selected something random just to move on. My data? Not representative.
  2. When the question flow felt like a one-way tunnel.
    The logic assumed I’d answered a certain way and didn’t allow for any detours. I started to feel like I was “in the wrong place.” I almost dropped out, and probably would have, if it weren’t a favor for a friend.

That experience made something clear: we don’t just want any data. We want good data: authentic, representative, decision-ready data. And the path to that starts with designing a better survey experience, not just better questions.

Let’s Break Down Those Three Questions

  1. How do we ensure our answer choices feel inclusive and representative?

When we write answer lists, it’s easy to unintentionally build them around what we expect people to say, or what we want to analyze. But inclusive answer choices are about giving people the tools to accurately describe their reality, not about funneling them into tidy categories.

  • Are we offering enough flexibility (e.g., “Other – please specify” when needed)?
  • Are the labels free of judgment or bias?
  • Do the options reflect a wide range of lived experiences?

Even small tweaks, like adding “I’m not sure” or “It depends”, can go a long way toward helping respondents feel seen.
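To make that check concrete, here is a minimal sketch of an answer list with flexible options built in. The question wording, field names, and option labels are illustrative assumptions, not any survey platform's actual format; the idea is simply that every list gets at least one escape hatch before fielding.

```python
# Hypothetical question structure (illustrative only, not a real platform's API):
# an answer list that includes flexible options so respondents aren't forced
# into categories that don't fit their reality.
question = {
    "text": "How do you usually commute to work?",
    "options": [
        "Drive alone",
        "Carpool",
        "Public transit",
        "Bike or walk",
        "Work from home",
        "It depends",               # acknowledges variable routines
        "Other – please specify",   # open text captures what we missed
    ],
    "allow_other_text": True,
}

# Pre-field sanity check: does every answer list offer at least one
# flexible option?
FLEXIBLE = {"Other – please specify", "I’m not sure", "It depends",
            "Prefer not to answer"}
assert any(opt in FLEXIBLE for opt in question["options"]), \
    "Answer list has no flexible option"
```

A check like this is easy to run over a whole questionnaire during testing, alongside the usual logic review.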

  2. What are the signals that tell us a question or flow is frustrating?

Survey dropout is the clearest signal, but there are other indicators too:

  • High “prefer not to answer” or “Other” rates
  • Unusually fast or slow completion times
  • Feedback from soft-launch or preview groups

Pay attention to these red flags. If respondents are confused, bored, or boxed in, they’ll either abandon the survey or push through in a way that gives you messy, misleading data.
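The red flags above can be turned into simple checks on fielded data. The sketch below assumes a plain list of respondent records with made-up field names (`completed`, `duration_s`, `answer`); your export format will differ, but the metrics carry over.

```python
# Hedged sketch of red-flag metrics: dropout rate, "Other" selection rate,
# and suspiciously fast completions ("speeders"). Field names are
# illustrative, not from any particular survey platform.
from statistics import median

responses = [
    {"completed": True,  "duration_s": 240, "answer": "Public transit"},
    {"completed": True,  "duration_s": 45,  "answer": "Other"},  # very fast
    {"completed": False, "duration_s": 60,  "answer": None},     # dropped out
    {"completed": True,  "duration_s": 300, "answer": "Other"},
    {"completed": True,  "duration_s": 260, "answer": "Drive alone"},
]

completed = [r for r in responses if r["completed"]]
dropout_rate = 1 - len(completed) / len(responses)

# A high "Other" rate suggests the answer list is missing real options.
other_rate = sum(r["answer"] == "Other" for r in completed) / len(completed)

# Flag respondents who finished far faster than the typical completion time.
typical = median(r["duration_s"] for r in completed)
speeders = [r for r in completed if r["duration_s"] < 0.5 * typical]

print(f"dropout: {dropout_rate:.0%}, Other: {other_rate:.0%}, "
      f"speeders: {len(speeders)}")
# → dropout: 20%, Other: 50%, speeders: 1
```

None of these numbers is damning on its own; the point is to compare them across questions and soft-launch waves so the outliers surface early.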

  3. How can we audit surveys from the respondent’s point of view?

This one’s simple, but often skipped: take your own survey. Or better yet, watch someone else take it.

  • What’s the emotional journey like? Are there points of friction or fatigue?
  • Do they pause at any questions? Frown? Scroll up and down looking for an option that isn’t there?
  • Does the tone of the survey feel human and respectful?

Also, get feedback from people who don’t live and breathe surveys. Sometimes we’re too close to the instrument to notice the friction points.