Garbage In, Garbage Out: 7 Survey Mistakes That Can Derail Your Website Redesign

Why Surveys are a Valuable Website Strategy Tool

One of the tools we love to use in our website redesign discovery process is surveys for both internal and external stakeholders. When your website strategy depends on survey data, the quality of your questions can make or break the redesign.

Ask the wrong questions, and you’ll collect misleading insights, sometimes called “garbage data.” And when you build strategy on garbage data, the result is a new website content strategy that doesn’t actually serve your users.

At Aten, we’ve helped organizations from airports and nonprofits to universities and libraries redesign their websites based on evidence-driven insights. Surveys can be a powerful part of that process, but they are precision instruments that require care to craft.

I'll share seven common mistakes to avoid when writing survey questions and show you how to create surveys that deliver meaningful, actionable data.

When (and When Not) to Use Surveys

Surveys are incredibly powerful when used in the right way and in the right context. They yield quantitative insights, while methods like interviews yield qualitative data. Here are a few simple rules of thumb.

Use surveys when you want to:

  • Measure how often something happens
  • Understand preferences and priorities
  • Explore audience demographics

Choose interviews, focus groups or other qualitative methods when you need to:

  • Uncover why users behave a certain way
  • Collect rich, contextual stories

Seven Principles for Writing Effective Survey Questions

Surveys can only produce useful insights if the questions are designed well. Here are seven common mistakes to avoid—and better ways to ask.

Avoid Double-Barreled Questions

A double-barreled question asks about two things at once but only lets respondents give one answer.

Example: “It is important to be able to see my flight time and gate information on the website.”

The problem? If someone agrees, you don’t know if they care about flight times, gate info, or both. That makes your survey data noisy and unreliable.

The Fix:

  • Split into two separate questions, each addressing one variable.
  • Or, use a rank-order scale to let users show priorities.

Better alternatives:

  • “How important is it to see flight time information on the website?”
  • “How important is it to see gate information on the website?”

Avoid Leading Questions

Leading questions push respondents toward a specific answer—biasing your data before you even collect it.

Example: “How easy is it to navigate the Tampa International Airport website?”

The problem? This assumes the site is easy to navigate, which influences the response.

The Fix:

  • Frame questions neutrally. Instead of “how easy is it to …,” offer both directions, e.g., “how easy or difficult is it to …”
  • Offer a balanced Likert scale (easy → difficult).

Better alternative:

  • “Overall, how easy or difficult was it to navigate the Tampa International Airport website?”
    • Very Easy
    • Somewhat Easy
    • Neither Easy nor Difficult
    • Somewhat Difficult
    • Very Difficult
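Once responses come in, a balanced scale like this can be tallied symmetrically. Here is a minimal sketch of how that might look; the option labels match the scale above, but the sample responses and the “net ease” metric (% easy minus % difficult) are illustrative assumptions, not a standard methodology:

```python
from collections import Counter

# Balanced five-point scale: symmetric around a neutral midpoint.
SCALE = ["Very Easy", "Somewhat Easy", "Neither Easy nor Difficult",
         "Somewhat Difficult", "Very Difficult"]

def summarize(responses):
    """Return per-option counts plus a simple 'net ease' score:
    the percentage choosing an 'easy' option minus the percentage
    choosing a 'difficult' option."""
    counts = Counter(responses)
    total = len(responses)
    easy = counts["Very Easy"] + counts["Somewhat Easy"]
    difficult = counts["Somewhat Difficult"] + counts["Very Difficult"]
    net_ease = 100 * (easy - difficult) / total
    return {option: counts[option] for option in SCALE}, net_ease

# Hypothetical sample of five responses
sample = ["Very Easy", "Somewhat Easy", "Somewhat Easy",
          "Neither Easy nor Difficult", "Somewhat Difficult"]
counts, net = summarize(sample)
```

Because the scale is balanced, the summary can honestly report lean in either direction instead of only measuring degrees of “easy.”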

Avoid Loaded Questions

Loaded questions are like leading questions but add an emotional bias. They often include words like “new,” “improved,” or “best,” which nudge users to respond positively.

Examples:

  • “The new airport website was redesigned by the best team in the world. Did you love our new website?”
  • “In the past 3 months, how have you enjoyed our new and improved airport map?”

The Fix:

  • Strip out emotional language.
  • Anchor the question to a specific timeframe or behavior. Doing so aids accurate recall and grounds respondents in a specific time and place.

Better alternatives:

  • “In the past 6 months, how many times have you used our airport website to find flight information?”
  • “When you used the airport map, how easy or difficult was it to find your destination?”

Avoid Ambiguity

Ambiguous questions are too broad, leaving you with vague answers that don’t help your redesign.

Example: “What do you think of our website?”

This could mean design, navigation, content, or usability—and you won’t know which one users are referring to.

The Fix:

  • Focus on specific features or tasks.
  • Pair scale-based questions with optional short-text responses.

Better alternative: “On a scale of 1–5, how satisfied were you with finding parking information on the site?”

Avoid Absolutes

Absolute questions force users into all-or-nothing answers, which rarely reflect reality.

Example: “Do you always park at the airport?”

The problem? This doesn’t capture the nuance of different travel habits. This is especially important in environments such as air travel, where our habits can change depending on weather, traffic or a host of other circumstances.

The Fix: Use frequency-based response options that capture variability.

Better alternative:

  • Almost every time I fly
  • Most times
  • About half the time
  • Less than half the time
  • Almost never

This gives you usable insight for design decisions: you can see how often, in general, users rely on the feature.

Avoid Confusion

Confusing questions happen when you assume your scale or language is obvious to respondents.

Example: “On a scale of 1–10, please rate your experience of the Security tab on the airport website.”

But what does “10” mean? And what aspect of security—ease of use, staff behavior, sense of safety?

The Fix:

  • Define clear scales (e.g., difficult → easy).
  • Be specific about the feature you want evaluated.

Better alternative: “On a scale from difficult to easy, how would you rate finding security wait time info on the Tampa International Airport website?”

Always Test Your Survey

Even small surveys can go wrong if you don’t test them first. Respondents may interpret your questions differently than intended, leading to misleading results.

The Fix:

  • Run a pilot survey with a small group.
  • Revise based on feedback before rolling it out widely.

At Aten, we test every survey to ensure clarity, neutrality, and alignment with redesign goals. That way, you get insights you can trust.

Bringing It All Together

Well-crafted survey questions lead to dependable insights. Paired with qualitative methods like interviews and focus groups, they provide the what, the why, and the how often—the full picture you need for a strong content strategy.

The difference between a survey that produces garbage data and one that fuels a successful website redesign often comes down to how the questions are written and tested.

We help organizations translate user research into actionable web design decisions—whether it’s through surveys, workshops, or audience interviews.

Ready to Build Surveys That Deliver Real Insights?

If you’re planning a website redesign, don’t risk making decisions on bad data. Our UX strategy team has helped organizations like Tampa International Airport, the Smithsonian Institution, and the City of Oxnard, California design research-backed surveys that uncover what their web audiences truly need. 

Get in touch to start shaping your research-driven redesign.
