Report the Full Methodology of Your Project

In my view, every report of survey results should include a thorough methodology description. Whether the sponsor is a law firm or a legal vendor, if the survey went outside the organization for participants, the report should describe the project openly and transparently.

More specifically, the goals of a methodology section in a survey report are twofold. First, to let readers determine whether your handling of the survey sample's data was evenhanded, well thought out, and carefully applied. Translated: your findings are trustworthy. Second, to lay out what you did so fully that someone could, at least in theory, replicate your steps and produce nearly the same results. Translated: your procedures were sound. The first goal emphasizes fairness; the second advances reproducibility. The methodology section should therefore be thorough enough to build the credibility of your survey findings and to permit Popperian falsifiability of your conclusions.

At a minimum, a responsible sponsor should explain how many people or organizations it invited to take part, how it identified that group, and what it did to secure their participation. For example, a law firm that surveys corporate law departments about attitudes and actions regarding sanctions might write, “We sent email invitations to the General Counsels of the top 1,000 U.S. companies by revenue and followed up two weeks later with a second email invitation.” The next sentence usually states how many complete responses the project obtained, which, together with the number invited, discloses the participation rate, and notes any unusual distribution in the respondent group. The distribution disclosure helps readers gauge how representative the results are of the survey population.
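The arithmetic behind the participation rate is simple, as this minimal sketch shows; the figures below are hypothetical, not drawn from any actual project.

```python
# Hypothetical figures: 1,000 invitations sent, 142 complete responses received
invited = 1000
complete_responses = 142

participation_rate = complete_responses / invited
print(f"Participation rate: {participation_rate:.1%}")  # Participation rate: 14.2%
```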

Often the methodology appears at the end of the report, in an appendix. Readers assume the sausage making was reputable, but they can flip to the back to confirm it.

When describing the methodology of an online survey project, strive to include all major decisions made by the sponsor and the analyst. “Major” decisions are those that affected, or could have affected, the findings meaningfully. Therefore, you want to explain in detail what you did and why. Here are several examples of desirable material to cover in the methodology (a brief code sketch after the list illustrates a few of them):

• Whether the analyst filled in any missing data (a procedure known as imputation, which offers multiple methods to choose from)

• Currency conversions, primarily the exchange rate

• Disambiguation, such as assigning the free-text comments of respondents who picked “Other” to the closest existing selection item

• Thresholds for discarding partial responses (although a workaround for these decisions is to include an “N =” statement of the number of participants for each table or graphic)

• Removal of duplicates and the reasoning for doing so

• Standardization, such as normalizing job titles

• The rationale behind and construction of any index variable. If you weighted any of the components in the index, that procedure also needs to be detailed

• Sources of any external data that you added to supplement what the participants provided. For example, if you know each company’s headquarters country, you might add its ranking from a corruption index

• The software you used during the survey project.
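To make these decisions concrete, here is a minimal sketch, in Python with pandas, of how a few of them might be applied and therefore need to be documented. Everything in it is hypothetical: the responses.csv file, the column names (organization, q1 through q5, revenue_eur, title), the 80% completeness threshold, the exchange rate, and the index weights are invented for illustration, not taken from any actual project.

```python
import pandas as pd

# Hypothetical export of survey responses from the survey tool
responses = pd.read_csv("responses.csv")

# Removal of duplicates: keep the first submission per organization
responses = responses.drop_duplicates(subset="organization", keep="first")

# Threshold for discarding partial responses: require at least 80% of questions answered
question_cols = ["q1", "q2", "q3", "q4", "q5"]
answered_share = responses[question_cols].notna().mean(axis=1)
responses = responses[answered_share >= 0.8]

# Imputation: fill remaining missing numeric answers with each column's median
# (one of several possible methods)
responses[question_cols] = responses[question_cols].fillna(responses[question_cols].median())

# Currency conversion: record the exchange rate and its source/date in the methodology
EUR_TO_USD = 1.08  # hypothetical rate as of the survey close date
responses["revenue_usd"] = responses["revenue_eur"] * EUR_TO_USD

# Standardization: normalize job titles to a small set of categories
responses["title"] = responses["title"].str.strip().str.lower().replace(
    {"gc": "general counsel", "chief legal officer": "general counsel"}
)

# Index variable: weighted combination of components; the weights should be disclosed
weights = {"q1": 0.5, "q2": 0.3, "q3": 0.2}
responses["risk_index"] = sum(responses[col] * w for col, w in weights.items())
```

Each of these steps, and each specific value chosen (the duplicate rule, the 80% cutoff, the exchange rate, the index weights), is exactly the kind of decision the methodology section should record.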

If you do not include a comprehensive methodology section in your report, at least keep careful notes of the important decisions made during the project. Those notes will help you answer questions about what was done and why. They will also encourage you to consider each crucial step of your survey project.
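Short of a full methodology section, even a lightweight running decision log can serve this purpose. Here is a minimal sketch with invented entries; the field names and the methodology_log.csv output file are assumptions for illustration only.

```python
import csv
from datetime import date

# Hypothetical running log of major project decisions
decision_log = [
    {"date": date(2024, 3, 1),
     "decision": "Dropped responses answering fewer than 80% of questions",
     "rationale": "Too little data to impute reliably"},
    {"date": date(2024, 3, 2),
     "decision": "Converted EUR revenue at 1.08 USD per EUR",
     "rationale": "Reference rate on the survey close date"},
]

# Write the log to a CSV file so it can be consulted when questions arise later
with open("methodology_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "decision", "rationale"])
    writer.writeheader()
    writer.writerows(decision_log)
```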