By Susan M. McMillan, Ph.D.
Some of the most frequent research-related inquiries we get from schools and districts are about surveys. Perhaps this is because there are so many uses of surveys in K-12 educational settings. Surveys are used to:
- Measure school climate, social and emotional learning (SEL), and student and family engagement
- Evaluate courses and programs
- Assess Title I needs
- Find community preferences regarding:
  - facilities planning
  - bond and referendum planning
  - boundary-change processes
  - any major policy change, such as school start times or class sizes
- Determine the need for childcare, meals, and technology services in your community
This blog post provides a guide to planning successful surveys, with links to existing education-related surveys.
How to Begin
A good way to start planning a survey is to think of it as a carefully focused research project. Spend time thinking about how your survey will be organized and administered so that the results will include actionable insights. The goal of “collecting data” isn’t specific enough and needs to be refined.
Your initial planning phase should result in a solid set of answers to the following questions [1]:
- What is your research or evaluation question?
- What do you need to measure to answer the question?
- Will the respondents be able to answer the questions? Do they have:
  - The information you need to solicit?
  - The necessary maturity level?
  - The reading skills (if the survey is written)?
- How will the information from the survey be used?
It is best to keep surveys clearly targeted on your research or evaluation question. However, if you find that your survey is experiencing "mission creep" (more and more topics get piled in because you want to ask as much as possible at one time), you should:
- Organize the survey by topic
- Make the transitions between topics clear, with a short introduction to each section
- Keep respondent fatigue in mind, especially for younger students
Note that if you are planning to survey minors about potentially controversial topics such as political or religious beliefs, risky behavior, or mental health issues, your survey administration plan should follow district and federal requirements for parent notification [2].
Use Existing Surveys When Possible
Writing a new survey instrument can be challenging, so the Harvard University Program on Survey Research advises that you pay attention to how other people have already measured the same concept you're trying to measure [3]. The K-12 education "space" offers many existing surveys you might consider using.
If you are looking for a school climate or SEL assessment, the American Institutes for Research (AIR) publishes a table of currently available survey instruments [4]. The table includes the website, publisher information, age ranges covered, constructs measured, time estimate, cost, and setting for each listed assessment.
The US Department of Education [5] provides surveys to measure school climate, with separate questionnaires for students (grades 5 through 12), instructional and non-instructional staff, and parents. The measures include:
- Engagement: cultural and linguistic competence, relationships, school participation
- Safety: emotional safety, physical safety, bullying/cyberbullying, substance abuse, emergency readiness/management
- Environment: physical environment, instructional environment, physical health (only for staff), mental health, discipline
Commonly used surveys for measuring student perceptions of teaching are described in Asking Students About Teaching [6], a report for the Bill & Melinda Gates Foundation. The report details the methodology, with plenty of tips for ensuring valid and reliable results when conducting a survey, and includes an appendix with survey questions.
The Centers for Disease Control and Prevention (CDC) has developed a well-established survey to assess the prevalence of health behaviors among middle and high school students and to monitor progress toward meeting health program goals. The Youth Risk Behavior Surveillance System (YRBSS) questionnaires [7] are free to use, and the surveys are well documented.
Ten Tips for Writing Survey Questions
Sometimes you really need to put together your own survey, particularly if you want to understand community needs, such as childcare or technology, or preferences about local issues, such as school start times or whether your voters are likely to support a bond issue.
Survey questions are composed of a stem (text that sets up the situation or question) and response options (answer choices), and sometimes additional instructions. If the survey question stems are unclear, the response options aren’t quite right, or the directions are confusing, the survey results will be difficult to interpret.
The Harvard Program on Survey Research describes three primary goals for an ideal survey question [8]:
- It measures the underlying concept it is intended to tap
- It doesn’t measure other concepts
- It means the same thing to all respondents
To fulfill the primary goals, follow these 10 tips for writing good survey questions.
1. Use simple and concrete language and define terms as necessary. For example, if you ask about "the future," you will need to specify what time frame you want people to consider: next month? Next semester? Next school year?
2. Avoid double negatives. They can cause confusion for respondents. "Do you favor or oppose not closing Jones Elementary School?" is confusing and would be better phrased as, "Do you favor or oppose keeping Jones Elementary School open?"
3. Avoid double-barreled questions. Consider this example: "Do you agree or disagree that schools that fail to attract enough students should be closed and teachers lose their jobs?" [9] Respondents might agree with schools being closed but disagree with teachers losing their jobs, or vice versa. Interpreting the results would be problematic because you would have no way of knowing which part of the question each respondent answered. It is best to split this type of compound question into two separate survey questions.
4. Match the response options with the stem. Examples of poor matches are endless, but imagine trying to answer a yes/no question using a rating scale of stars (1 star to 5 stars), or rating the quality of a service using an agree-disagree scale. Stem and response mismatches are frustrating for respondents, and they make interpretation of results difficult, if not impossible.
5. Give clear instructions. If you want people to think about their actions or feelings "in the past," provide a time frame for reference. If you want people to "check all that apply" or "choose one," include those instructions. For open-ended questions, explain what type of response people should provide: an issue or problem, a month, a number of days, etc. [10]
6. Make sure the multiple-choice response options cover all possibilities without overlapping. In more technical terms, the response options should be exhaustive and mutually exclusive. It is very easy to create gaps and/or overlapping response categories. Try to imagine every possible answer. If you suspect that some people will have truly unique answers, include an "other" response option. Pay close attention to any responses with ranges, such as age, grade level or education level, and time frames (a simple check for gaps and overlaps is sketched after this list). If respondents can't find a response option that works for them, they will either skip the item or provide an inaccurate response.
7. Design questions to mitigate the "primacy effect." People tend to choose the first option they see; this is called the primacy effect. If you're using a Likert-type scale (agree-disagree) for response options, list the least agreeable options first, from top to bottom or left to right [11]. For multiple-choice questions, list the response options in a natural order if there is one (e.g., small to large), or in alphabetical order.
8. Give a full range of response options, not just favorable or unfavorable. If you provide response options that reflect only positive or only negative reactions, your results will be skewed. Your decision-making will be based on potentially incomplete data.
9. Keep questions neutral and avoid leading and/or emotionally charged language. "Do you agree that taxes are already too high?" is leading and more emotionally charged than a question phrased as, "Do you support a tax increase of x amount to fund a new high school?" The same person might answer "yes" to both (perhaps grudgingly), but the second question would provide you with more specific information about voter support.
10. Write each survey question so the results provide specific information about your research question. To understand this tip, think about an example involving student and staff perceptions of safety at school. Depending on your specific research question, you could ask a very broad survey question such as "Do you feel safe at school?" and find the percentage of respondents who say "Yes."
If you intend to use the results to boost safety within school buildings, you might wonder why the people who said "no" responded as they did. It may be more helpful to determine where people feel more or less safe at school, and that would require more specific questions: "Do you feel safe in the hallways?" "Do you feel safe in the bathrooms?" Or, for more nuance, you could ask respondents to rate how safe they feel in various places within the school.
If you wonder whether the answers differ by respondent gender, race, or ethnicity, you will need to include demographic survey questions; a sketch of that kind of breakdown follows this list. For each item you include on your survey, evaluate how the responses will contribute to an answer for your research question.
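If your survey platform can export responses as a spreadsheet, the demographic breakdown described in tip 10 takes only a few lines of analysis. Below is a minimal sketch in Python with the pandas library; the file name and column names ("survey_responses.csv", "gender", "feels_safe") are hypothetical stand-ins for whatever your export actually contains.

```python
import pandas as pd

# Each row is one respondent's answers, exported from your survey tool.
# (Hypothetical file and column names; substitute your own.)
responses = pd.read_csv("survey_responses.csv")

# Percentage of "Yes"/"No" answers to "Do you feel safe at school?"
# broken out by a demographic question, here gender.
breakdown = pd.crosstab(
    responses["gender"],
    responses["feels_safe"],
    normalize="index",  # convert counts to within-group proportions
) * 100

print(breakdown.round(1))  # percent of each group answering Yes/No
```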
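Returning to tip 6, a quick script can catch gaps and overlaps in numeric range options before your respondents do. This is only a sketch, under the assumption that the options are inclusive integer ranges (ages, numbers of days, and so on); the function name and example brackets are made up for illustration.

```python
def check_ranges(options, low, high):
    """Check that (start, end) inclusive ranges, listed in order,
    are exhaustive over [low, high] and mutually exclusive."""
    problems = []
    if options[0][0] > low:
        problems.append(f"gap before {options[0]}")
    for (a, b), (c, d) in zip(options, options[1:]):
        if c <= b:
            problems.append(f"overlap: ({a}, {b}) and ({c}, {d})")
        elif c > b + 1:
            problems.append(f"gap between {b} and {c}")
    if options[-1][1] < high:
        problems.append(f"gap after {options[-1]}")
    return problems or ["ranges are exhaustive and mutually exclusive"]

# Classic mistake: "18-25" and "25-35" both claim age 25.
print(check_ranges([(18, 25), (25, 35), (36, 50)], low=18, high=50))
```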
Pilot Test Your Survey
A pilot test is the survey equivalent of the "measure twice, cut once" maxim in carpentry. It is a formal way to test the survey administration process, discover whether respondents are interpreting your survey questions as anticipated, and determine whether the responses are providing answers to your research question. A pilot study may seem unnecessary and time-consuming, but it is essential for ensuring that your survey results are accurate and meaningful.
Whether your survey will be administered online, on paper, or via interview, the pilot test will reveal any problems with the process. Perhaps the email invitation link works for only some people, not all items are being shown online, or an item was accidentally deleted from the printed version. You may discover that the survey takes far too long for respondents' comfort, and they skip the final third of the items. Ensuring that the administration process is working as planned provides evidence that your final survey results will be accurate.
Having people with "fresh eyes" take your survey will help you identify any problems with questions that may have crept into your item-writing process. Typos are easy to miss, and even seasoned experts accidentally slip in a double-barreled question or forget a possible response to a multiple-choice question. You can also ask respondents to identify any questions they found confusing so you can improve the wording or response options. Using the pilot test results to fix or eliminate problems with individual questions will help reduce measurement error.
Analyze your pilot survey data to determine whether the responses are truly meeting your need for actionable information. Sometimes people find that their survey results produce no useful insights into their primary research question. If that is the case, your pilot data should provide clues about how to re-write your questions to get more specific and relevant information from your survey.
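One concrete way to mine pilot data for these problems is to compute the skip rate for each item: a spike on a single question often signals confusing wording, while rising skip rates toward the end of the survey suggest respondent fatigue. The sketch below assumes a simple export ("pilot_responses.csv" is a hypothetical name) with one row per respondent, one column per question, and blank cells for skipped items.

```python
import pandas as pd

# One row per pilot respondent, one column per question;
# blanks (NaN after loading) mark skipped items.
pilot = pd.read_csv("pilot_responses.csv")

# Percentage of respondents who left each item blank, in survey order.
skip_rates = pilot.isna().mean() * 100

print(skip_rates.round(1))  # look for spikes and end-of-survey drop-off
```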
Let us know if you need further assistance with your survey project. We’re here to help!
Footnotes
[1] https://www.sheilabrobinson.com/2020/03/04/asked-and-answered-rating-scale-survey-questions/ and https://www.pewresearch.org/methods/u-s-survey-research/questionnaire-design/
[2] https://www2.ed.gov/policy/gen/guid/fpco/ppra/parents.html
[3] https://psr.iq.harvard.edu/files/psr/files/PSRQuestionnaireTipSheet_0.pdf
[4] https://www.air.org/sites/default/files/SEL-Ready-to-Assess-Act-2019-rev.pdf
[5] https://safesupportivelearning.ed.gov/edscls and https://safesupportivelearning.ed.gov/sites/default/files/EDSCLS%20Questionnaires.pdf
[6] http://k12education.gatesfoundation.org/resource/asking-students-about-teaching-student-perception-surveys-and-their-implementation/
[7] https://www.cdc.gov/healthyyouth/data/yrbs/questionnaires.htm
[8] https://psr.iq.harvard.edu/files/psr/files/PSRQuestionnaireTipSheet_0.pdf
[9] https://www.sheffield.ac.uk/polopoly_fs/1.597637!/file/likertfactsheet.pdf
[10] https://www.pewresearch.org/methods/u-s-survey-research/questionnaire-design/
[11] https://www.sheilabrobinson.com/2020/03/04/asked-and-answered-rating-scale-survey-questions/