Training Evaluation Series Part 1: Survey Design Basics

This fall RU Training will offer a series of articles addressing advanced topics in training evaluation. I hope these posts will benefit our current students and our alumni. This series assumes a basic knowledge of both Kirkpatrick’s Model and the Phillips ROI Methodology.

The first article addresses survey design basics. Surveys are an essential tool throughout the training process. Before training design begins, surveys allow us to collect information about our participants and the organization where they work to guide the design process. During training, survey data helps us gauge how training is going, allowing for necessary changes to ensure that we are meeting the needs of our learners. After training, we use surveys and questionnaires to evaluate training both immediately when training ends and later as a follow-up to measure transfer.

The adage “if you can’t do it right, don’t do it at all” applies directly to survey data. A well-designed survey measures what it is supposed to measure and produces stable results over time; that is quality data. A poorly designed survey can do more harm than good: inaccurate questions yield inaccurate data, which can take you and your organization down the wrong path toward decisions that are both incorrect and costly. In short, no data is better than bad data.

In this article, I will review the basics of survey design and guide you to resources that will help you master the craft of survey design.

The Basics of Survey Design

  • First, consider the type of information you want to collect. The three basic kinds of information captured by surveys are knowledge, attitudes, and behavior. To gather knowledge, ask factual questions; to measure attitude, ask questions that capture opinion; and to measure behavior, ask questions that address both knowledge and attitude and collect data over time (before training, when training ends, and when the trainee is back on the job) to identify the effects of training.
  • There are two types of questions you might ask: open-ended questions and close-ended questions. Open-ended questions allow participants to express themselves freely but are time-consuming to “grade,” while close-ended questions can be limiting but lend themselves better to objective measurement. A simple rule of thumb for choosing between the two: if you know the specific information needed to answer a question, close-ended responses are preferred (Converse and Presser, p. 33). If, however, you are not sure what the range of possible responses to a question is and need to conduct a preliminary exploration of a topic, open-ended questions will work better. Note also that previous research shows respondents are more willing to offer sensitive information when a survey uses open-ended responses.
  • Choose possible responses carefully when using close-ended questions. Most close-ended questions give respondents a choice or a range of choices, and you want to capture all possible options clearly and without bias. For example, Bradburn (2004) cautions against vague categories such as never, rarely, occasionally, and often because there is no way to know what “often” means to each participant. A common choice for survey data collection is the Likert-type scale, where participants are asked to rate each item. In general, a 5-point Likert scale is an appropriate choice and would include response options such as “strongly agree,” “agree,” “neutral,” “disagree,” and “strongly disagree.” Because Likert scales attach numbers to these options, they are easy to tabulate and report back.
  • When it comes to questionnaires, appearance counts. Make sure your questionnaire looks professional, is grammatically correct, is easy to read, and contains clear instructions. It’s always a good idea to have others proof your questionnaire and provide feedback. Also, ask only the questions that are necessary; participants find overly long questionnaires daunting.
    Remember: garbage in, garbage out. The quality of your data depends on the clarity of each question. A thoughtful design captures both the experiences of program participants and the change the program produced.
  • Confidentiality is key to getting honest answers. Most surveys are collected anonymously using an online tool like SurveyMonkey or Qualtrics (available to students through the RU Library). It is important to communicate to participants that all data will be collected and reported anonymously.

To revisit our three types of information: when testing knowledge, use close-ended factual questions that directly measure course objectives; when measuring attitude, use a five-point Likert scale; and when measuring behavior, use a combination of both question types and administer the same questionnaire before training, right after training, and at one-month intervals following training.
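For readers who tabulate their own results rather than relying on a survey tool, the Likert scoring described above can be sketched in a few lines of code. This is a minimal illustration, not part of any particular tool; the response labels and sample data are hypothetical.

```python
# Minimal sketch: tabulating responses to one 5-point Likert item.
# The labels and sample responses below are illustrative only.

from collections import Counter

SCALE = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def tabulate(responses):
    """Return the frequency of each option and the mean score for one item."""
    scores = [SCALE[r] for r in responses]
    counts = Counter(responses)
    mean = sum(scores) / len(scores)
    return counts, mean

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]
counts, mean = tabulate(responses)
print(counts)          # frequency of each response option
print(round(mean, 2))  # mean score on the 1-5 scale
```

Attaching a number to each option is what makes Likert items easy to report back: the same mean can be computed before training, right after training, and at follow-up, and the three values compared directly.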

Add a comment to this post to tell us about your experience with survey design—particularly what you think makes a survey work or not work effectively.

References and Resources

The books that follow are useful resources and will help you learn more about survey design. The first is more advanced, but the second book, by Converse and Presser, is a quick read.

Bradburn, N., Sudman, S., & Wansink, B. (2004). Asking Questions. San Francisco, CA: Jossey-Bass.

Converse, J., & Presser, S. (1986). Survey Questions: Handcrafting the Standardized Questionnaire. Newbury Park, CA: Sage.

Comments

  • Great article that outlines the factors necessary for good survey design. Some of these items almost seem like no-brainers, such as making sure the questions make sense. However, the emphasis on how the questions are designed is crucial, because questions can be worded in a way that persuades the reader to answer a particular way. I used SurveyMonkey a couple of times, and it would flag me for creating questions that might swing the reader one way or another. Has anyone else used SurveyMonkey, and if so, what were your thoughts on it?

  • Great post! This makes me rethink all the surveys I’ve taken. It also reminds me of metrics: if you don’t do anything with the data points you collect, they are not worth collecting. I see the value of employee satisfaction surveys, and this has me considering using them within my department. Thanks!

  • This article is great and super helpful. While I lack survey design skills, I have taken plenty of surveys following a training. My favorite arrangement is having the survey broken into sections, with each section containing a short block of close-ended questions followed by an open-ended question or comment field. It helps me focus and be a little more objective rather than rushing through just to complete the survey.

  • I used to be a program director at a nonprofit, and as the overseer of several programs it was my job to monitor their effectiveness and expansion. The most cost-effective way to do this was through surveys. They were heavily utilized throughout the organization, and I often found myself drafting and tweaking them over time as I ran trainings. The data collected became indispensable for constantly improving our programming and training effectiveness. There was certainly a learning curve at first. I was by no means a master at drafting surveys and often didn’t even know what I wanted to measure or what I was looking for. It took time and practice to hone these skills, collect data more effectively, and ultimately apply that data to make quality changes to the programs. I am still learning and building on this foundational experience today.

  • Great article! It seems a very simple point, but I truly appreciate the thinking about how appearance counts for effective surveys. I certainly find that when great tools are poorly designed, it negatively impacts the success of the tool! I was recently at a training where the post-workshop evaluation was pretty clearly cut and pasted from another event; it not only had the wrong date but also asked for feedback on content that was not covered. That impacted my willingness to spend time filling in the survey, since it was clearly incorrect. I also second the encouragement for a reasonable length! We often try to do way too much! Brief, well-written, logically designed tools are so wonderful!

  • Great article! This can definitely serve as a “best practice” for collecting feedback after a training program. I believe the best surveys combine open- and close-ended questions so that participants have a chance to add detail regarding certain areas of a program. I think anonymous surveys capture more honest feedback than those that force self-identification. When administering or completing questionnaires, I prefer the Likert format, because it’s quicker while still allowing a pretty accurate gauge of the experience.


  • Factors that contribute to survey designs that work are:
    1. Timely. The survey should be administered shortly after the training program ends so the participant’s experiences can be easily recalled.
    2. Objective and participant-focused, not content- or facilitator-focused. Questions address the learning objectives outlined in the course curriculum, the classroom environment, and content delivery.
    3. Convenient for the participant. Online surveys allow the participant to take the survey when it’s convenient, in a location that may be private (their office or home), and without interfering with job duties or responsibilities that could otherwise act as a detractor.
    4. Simple, uncomplicated, and not rushed. Less is more when it comes to the number of survey questions; limit them to between 5 and 15 in total. Indicate the average time it will take to complete the survey before the participant begins it.
    5. Say what you mean, mean what you say. Keep the language and sentence structure simple, use words that will not be confusing, avoid sentences that can be misinterpreted, and make sure the participant will understand how to answer each question.
    6. Categorized to limit bias or personal opinion. Classroom environment, content, and facilitation style can each influence how the participant responds to survey questions. Asking specific questions and grouping them into their own categories may lessen that influence or bias.
    7. Provide space for additional comments, especially if the participant responds to a survey question unfavorably. Detailed comments help explain how the participant feels and why they responded the way they did, and they provide positive and negative feedback that otherwise would not be shared confidentially.

    Factors that contribute to ineffective survey designs are:
    1. Excluding the instructional design team (if different from the survey design team) or the facilitation team at the onset of survey question design. Oftentimes, the survey questions developed miss the mark on the actual content being delivered, which can lead to a breakdown or inaccuracy in survey results.
    2. Creating surveys that are opposite of all factors that contribute to survey designs that work.

  • Survey design is something I don’t have a ton of experience with, so this article was a huge help in understanding the basics of developing one successfully. Being on the receiving end of surveys, I do see how useful and how harmful they can be. This is, I think, from the reaction stage, of course, as I don’t typically have a hand in analyzing the results. I’m personally more drawn to surveys that follow a Likert format. Having clear quantitative and qualitative options is essential to well-rounded data. If open-ended questions are included, I would always push for a maximum of two so that we don’t fall too far into the narrative; at that point, interviews might make more sense. Most importantly, I think a survey should be concise and, if at all possible, no more than one page, even if you have to administer the second half down the line.

  • One thing I like to see on surveys is space for comments. If the survey is really short something simple at the end might be enough, but I think when a survey has multiple sections, it’s nice to see a comment option after each section. I also like it when there are some slightly more guided questions, prompting people to comment about a specific issue or topic. In my limited experience, I’ve found the qualitative comments to be just as useful, if not more so, than the quantitative data.

  • My experience with survey design has been extensive and positive. I have dealt with open-ended questions, which give me a wide range to gauge what is actually going on and to express my opinion.

  • Thanks for sharing Mike. I think both are excellent ways to make the most of level 1 assessment.


  • Hi Kathleen – great post. I always try to include two specific questions in any survey I design (especially level 1). The first is a predictive question to try to gauge what a level 3 evaluation will show. For example, “How likely are your work behaviors to change as a result of this program?”

    The second is a reverse question to make sure the participant is paying attention to the survey and not just clicking through: “How dissatisfied are you with the program?”

    • Mike, your suggestion to add a reverse question is genius! Thanks for sharing.

      I’d never thought about avoiding vague options like “often” and “rarely” in favor of “strongly agree” and “agree,” but it makes complete sense. This language is not one and the same. Particularly when communicating the data, the use of clear language will help stakeholders take the results seriously.
