Instruct Respondents How to Answer Questions

Let’s turn our attention now to the various places where a survey project can help respondents answer questions. Artful instructions also make the analyst’s job easier when it comes to cleaning, standardizing, and aggregating answers. If people can willy-nilly fill in whatever they see fit, the analyst will have a fit (yes, a willful pun) and the quality of the information collected will degrade. We will review these instruction methods by starting with the broadest scope and proceeding to the most targeted.

At the beginning, as when a member of your population reads your invitation email, you can add a bit of overall guidance. For example, you might alert them that the questionnaire has six multiple-choice questions, each of which asks for a single selection and includes an ‘Other’ option. That may not seem instructive, but it sets the stage for them to anticipate the form their answers will take, i.e., no “choose all that apply” questions.

Once the survey is opened, a section – a grouping of related questions, such as about the law firm or about the law department – might lead off with its own directions. For example, if the first four questions gather demographic information, you might consider explaining what’s coming: “We start with important background facts about you that will enable us to sort responses into separate groups and compare those groups.”

Within a section, and depending on the hosting software you have licensed, you may be able to insert a text element – a note that is not itself a question. As an example, before a series of questions that ask about compensation, you might explain that answers should contain only digits, with no signs, symbols (such as “$” or “CDN”), or decimal points. If the instructions in that text element apply only to the next three questions, you could state that as well.
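To see what the analyst faces without such an instruction, here is a minimal sketch in Python (the sample answers are hypothetical) of the cleanup that stray symbols force:

```python
import re

# Hypothetical raw compensation answers from respondents who ignored
# (or never saw) a digits-only instruction.
raw_answers = ["185000", "$185,000", "CDN 190000", "185,000.00"]

def clean_compensation(text):
    """Keep only digits and a decimal point, then drop the decimal portion."""
    digits = re.sub(r"[^\d.]", "", text)   # strips "$", "CDN", commas, spaces
    return int(float(digits)) if digits else None

print([clean_compensation(a) for a in raw_answers])
# [185000, 185000, 190000, 185000]
```

A clear instruction up front spares the analyst from writing, and debugging, this kind of scrubbing code for every numeric question.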

Each question might also travel with its companion instruction. The question itself might shape how it is to be understood: “Please check below all the hardware and equipment you use when you work at home, e.g., laptop, headphones, dual monitors, ergonomic seating, special lighting. By ‘at home,’ we mean out of the office but not at a client site.”

Some survey software lets you provide instructions separate from and after the question itself, a capability that goes by various names, such as question sub-text. Here you have more real estate to nudge answers toward the format or kind of detail that you seek. For a question from a technology maturity survey that asks respondents to list their applications, the sub-text might elaborate: “Ideally, provide the name of the vendor of the software as well as the name of the package. If you don’t know either, describe the function of the application.”

The answer space for a question – the blank where respondents type their answer – might contain formatting clues. For example, if you’re asking for an allocation of percentages of time spent on various tasks over a period, the answer space might display a percentage symbol (“%”) or abbreviation (“pct.”) on the right side. That cue indicates both what to fill in (numbers only) and what not to fill in (a percentage symbol or abbreviation).
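When the software cannot enforce that format, the analyst can at least flag violations afterward. A minimal sketch, assuming one respondent’s hypothetical allocation answers arrive as strings:

```python
def check_allocation(answers):
    """Return a list of problems: stray symbols, or a total that is not 100."""
    problems = [f"Task {i}: non-numeric answer {a!r}"
                for i, a in enumerate(answers, start=1)
                if not a.isdigit()]          # catches "%", "pct.", blanks
    if not problems and sum(int(a) for a in answers) != 100:
        problems.append(f"Total is {sum(int(a) for a in answers)}, not 100")
    return problems

print(check_allocation(["40%", "35", "25"]))  # flags the stray "%"
print(check_allocation(["40", "35", "20"]))   # flags a total of 95
print(check_allocation(["40", "35", "25"]))   # [] means the answer passes
```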

Instructions are particularly important for rating questions because you want to avoid someone inadvertently reversing the scale. Your instruction might say, “This question uses 1 to rate the least desirable method of outside counsel cost control and 10 to rate the best method. Be sure to follow that rating order in your answers.” You might add that no rating can be used twice, if that is an applicable parameter. At the end of the rating portion, you might add a confirmatory text box that reminds them with something like “Please confirm that you followed the 1-low to 10-high rating scheme. If you didn’t, please revise your answers.” Another way to confirm their thinking is a question, “What did you select as your best method?”, so that you can corroborate the two answers. One further note: keep your rating scales the same throughout the survey, or at least keep 1 as the least favorable selection.
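That corroborating question also lets the analyst detect a reversed scale after the fact. A minimal sketch, with hypothetical method names and one respondent’s answers:

```python
# Hypothetical answers: ratings of four cost-control methods on a 1-10 scale
# plus the separate "What did you select as your best method?" question.
ratings = {"budgets": 9, "fixed fees": 4, "auctions": 2, "discounts": 6}
stated_best = "auctions"

# The method rated highest should match the stated best; if instead it
# matches the lowest rating, the respondent likely reversed the scale.
highest = max(ratings, key=ratings.get)
lowest = min(ratings, key=ratings.get)

if stated_best == highest:
    print("Consistent: ratings and stated best agree.")
elif stated_best == lowest:
    print("Likely reversed scale: stated best received the lowest rating.")
else:
    print("Inconsistent answers; follow up with the respondent.")
```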

Instructions are not only guide rails for format and content. An instruction, even if thought of as a mere alert or piece of information, might also allay concerns or improve the user interface.

• They might reassure respondents, such as by explaining why a question is obligatory. Or, for instance, when you ask for email addresses from respondents who are not your employees: “We will use the email address only to send your report or to ask about a possibly mistaken or overlooked answer.” This statement may ease worries about being spammed or having the email address sold.

• They might improve the user interface. Navigation instructions tell a user how to move to the next page, return to a previous page, or jump to a different question. Error messages, typically on required questions, should be read as prescriptive instructions. At the end of the questionnaire, a closing message might tell the respondent what to do next or what choices they have.

Notwithstanding this array of instructional tools, a diligent survey designer might create a frequently-asked-questions page and place a link to it at several points in the survey instrument. Respondents could click the link to learn how to deal with a problem. Creating such an FAQ page makes sense only if there are many points of clarification and you plan to use the survey repeatedly, or if you are a serial surveyor. Perhaps that is only a future aspiration.