Surveys Stink: 7 Mistakes That Make for a Crappy Survey Experience

For the past six months or so, I’ve been “volunteering” as a survey panelist through SurveyMonkey Contribute.  I’ve used SurveyMonkey quite a bit for my personal and UX-related survey needs, so when my account manager explained how my audience panel would be sourced, I decided to give it a try myself.  Unlike paid survey sites, SurveyMonkey Contribute compensates panelists only by donating a small amount to the panelist’s chosen charity for each survey taken (cue the soft, squishy feelings).  They also offer a chance to win a $100 Amazon gift card, but after nearly 50 surveys, I can promise you, the odds are not in your favor.  The real reason I contribute, though, isn’t the feel-good factor or the possibility of scoring Amazon loot.  It’s that all of that time taking surveys will help me avoid the poor design decisions that SO MANY people make when crafting a survey.  And as a user experience pro, being able to design and deploy a survey that will get valid and reliable results is a critical skill.

Lucky for you, all that time I’ve wasted (er, spent) taking surveys can help you be a better survey designer, too.  Below are the most common survey design mistakes I’ve seen in my many hours of survey responding… avoid them and you’re well on your way to crafting a survey that works.

1. Too many questions

I understand the temptation to get the most bang for your buck when surveying your audience: they’re there, you’ve got them, might as well ask everything you want to know, right?  Wrong.  As surprising as it may sound, survey respondents have lives, too.  A 50-question survey is unacceptable.  I guarantee you, folks are Christmas-treeing the hell out of that thing (filling in answers at random just to get it over with).  Bye-bye, reliability.

2. Not including a progress indicator

SurveyMonkey actually advises against progress bars, so perhaps that’s why I rarely see one.  But letting respondents know how far along they are in the survey is just good UX.  And if you leave one out and make any of these other mistakes, chances are good that your respondents are bouncing… that’s what I do!

3. Requiring answers on all questions

Sorry folks, but sometimes I just don’t understand what you’re trying to ask me.  Or I don’t have an opinion.  Or it doesn’t apply to me at all.  So pretty please, give me an out.  If you can’t include an “other” or “N/A” option, you’d better not make the question required.

4. Too many open-ended questions

I totally understand and appreciate your desire for qualitative data in your results… having respondents put things into their own words is important and valuable in solving problems.  But using an open-ended question where a multiple-choice or matrix question will do?  You’re wasting my time, man.

5. Leading questions

Oh man, this is a biggie.  And while many survey respondents might not even catch on, leading questions can confuse and frustrate respondents.  And leave you with invalid results.  Learn about preventing leading questions (and other survey no-no’s) from the smart folks at SurveyMonkey.

6. Restrictive multiple choice answers

I don’t know how many times I’ve taken a survey with multiple-choice questions that don’t offer the answer I want to give.  Spend some time ensuring that your answer options are comprehensive.  And then add an “other” option for good measure.

7. Illogical question order

This may be the result of inappropriate use of the “randomize” setting, but I’ve seen quite a few surveys where questions seem out of order or reference a topic that was fully explored two pages ago.  Randomizing questions can help ensure reliable results by protecting against survey fatigue, but you must use it with care.  It’s not helpful to force respondents to jump from one topic to another and back again.  Group your questions by topic and randomize within each group if necessary (see the sketch below).
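If you happen to assemble your surveys programmatically, here’s a minimal sketch of that grouping idea in Python.  The topic names and question text are hypothetical placeholders; the point is simply that the topic groups stay in a fixed order while questions are shuffled only within each group.

```python
import random

# Hypothetical questions, grouped by topic.  Group order stays fixed;
# only the questions inside each group get shuffled.
question_groups = {
    "Pricing": [
        "How satisfied are you with our pricing?",
        "How does our pricing compare to alternatives you considered?",
    ],
    "Support": [
        "How quickly was your last support request resolved?",
        "How would you rate the helpfulness of our support team?",
    ],
}

def build_question_order(groups):
    """Return a flat question list: topics in order, questions shuffled within each topic."""
    ordered = []
    for topic, questions in groups.items():
        shuffled = questions[:]      # copy so the original list is untouched
        random.shuffle(shuffled)     # randomize only within this topic
        ordered.extend(shuffled)
    return ordered

if __name__ == "__main__":
    for question in build_question_order(question_groups):
        print(question)
```

Many survey platforms that support randomization let you apply it per page or block, which gets you the same effect without writing any code.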

Need help planning your next survey or other customer research activities? Contact me.