DATABASE SYSTEMS CORP. Custom Call Surveys


Phone Survey Design

Automated phone surveying is an economical and highly accurate method of collecting information by phone. Automated phone surveys offer many advantages; studies have indicated that they are more accurate than surveys conducted by live operators. Database Systems Corp. (DSC), incorporated in 1978, has been a leading developer of computer software and phone systems. Using this technology, DSC provides phone surveys and custom phone applications for a wide variety of industries and government agencies.

The following article describes phone surveys and phone survey techniques.

Contact DSC to learn more about our phone survey development and automatic survey outsourcing services.

Basics of Survey and Question Design

This is an introduction for federal government program managers on how to design surveys and survey questions to collect customer feedback.

Initial design considerations

  • Before you design your survey
  • Survey design
  • Question design

Common survey question types and examples

  • Multiple choice
  • Rank order scale
  • Rating scale
    • Likert scale
    • Semantic differential scale
  • Open-ended questions

Common question design pitfalls

  • Asking two questions at once
  • Leaving out choices
  • Leading questions
  • Built-in assumptions

Tips for technology-based surveys

  • Skip logic/conditional branching

    Initial design considerations

    Before you design your survey

    • Clearly articulate the goals of your survey. Why are you running a survey? What, specifically, will you do with the survey results? How will the information help you improve your customer's experience with your agency?
    • Make sure that each question will give you the right kind of feedback to achieve your survey goals.
    • When in doubt, contact a statistician or survey expert for help with survey and question design.

    Survey design

    • The opening should introduce the survey and explain who is collecting the feedback and why. You should also include some reasons for participation, and share details about the confidentiality of the information you are collecting.
    • The introduction should set expectations about survey length and estimate the time it will take someone to complete.
    • Opening questions should be easy to answer, to increase participant trust and encourage them to continue answering questions.
    • Ensure questions are relevant to participants, to reduce abandonment.
    • To minimize confusion, questions should follow a logical flow, with similar questions grouped together.
    • Keep your survey short and to the point - fewer questions will deliver a higher response rate.
    • If you have sensitive questions, or questions requesting personal information, include them towards the end of the survey, after trust has been built.
    • Thank your participants after they've completed the survey.
    • Test your survey with a small group before launch. Have participants share what they are thinking as they fill out each question, and make improvements where necessary.

    Question design

    • Keep questions short and easy to read. The longer and more complex the questions, the less accurate feedback you'll get. This is particularly true of phone surveys.
    • Keep questions easy to answer, otherwise participants may abandon the survey, or provide incorrect information (e.g., giving the same answer/value for all questions, simply to get through the survey).
    • Keep "required" questions to a minimum. If a participant can’t or doesn’t want to answer a required question, they may abandon the survey.
    • Use a consistent rating scale (e.g., if 5=high and 1=low, keep this consistent throughout all survey questions).
    • For rating scales, make sure your scale is balanced (e.g., provide an equal number of positive and negative response options).
    • Label each point in a response scale to ensure clarity and equal weight to each response option.
    • For closed-ended questions, include all possible answers, and make sure there is no overlap between answer options.
    • Use consistent word choices and definitions throughout the survey.
    • Avoid technical jargon and use language familiar to participants.
    • Be as precise as possible to avoid word choice confusion. Avoid words like “often” or “rarely”, which may mean different things to different people. Instead, use a precise phrase like “fewer than three times per week.”
    • Try to construct the questions as objectively as possible.
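    The rating-scale guidelines above (a balanced scale with a label on every point) can be checked mechanically. Below is a minimal Python sketch; the label wording is an assumption for illustration, not a prescribed scale:

```python
# Hypothetical 5-point satisfaction scale: every point labeled, with a
# neutral midpoint and equal numbers of negative and positive options.
SCALE = {
    1: "Very dissatisfied",
    2: "Dissatisfied",
    3: "Neither satisfied nor dissatisfied",
    4: "Satisfied",
    5: "Very satisfied",
}

def is_balanced(scale):
    """True if every point has a label and an odd point count leaves a
    single neutral midpoint with equal options on each side."""
    every_point_labeled = all(scale.get(p) for p in range(1, len(scale) + 1))
    has_neutral_midpoint = len(scale) % 2 == 1
    return every_point_labeled and has_neutral_midpoint

print(is_balanced(SCALE))  # True
```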

    Common survey question types and examples

    Multiple choice questions

    Questions with two or more answer options. Useful for all types of feedback, including collecting demographic information. Answers can be "yes/no" or a choice of multiple answers. Beware of leaving out an answer option, or using answer options that are not mutually exclusive.

      Example 1: Are you a U.S. Citizen? Yes / No

      Example 2: How many times have you called our agency about this issue in the past month?

      • Once
      • Twice
      • Three times
      • More than three times
      • Don't know/not sure

    Rank order scale questions

    Questions that require ranking answer choices by a specific characteristic. These questions can provide insight into how important something is to a customer. They work best in online or paper surveys; ranking a list of options is difficult to do over the phone.

    Rating scale questions

    Questions that use a rating scale for responses. This type of question is useful for determining the prevalence of an attitude, opinion, knowledge or behavior.

    There are two common types of scales:

    Likert scale

    Participants are typically asked whether they agree or disagree with a statement. Responses often range from “strongly disagree” to “strongly agree,” with five total answer options. Each option is ascribed a score or weight (1 = strongly disagree to 5 = strongly agree), and these scores can be used in survey response analysis. For scaled questions, it is important to include a neutral category (e.g., “Neither agree nor disagree”).
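    The scoring described above is straightforward to automate. A minimal sketch, assuming the standard 5-point labels (the label-to-weight mapping is the only assumption):

```python
# Map each Likert label to its weight, 1 = strongly disagree ... 5 = strongly agree.
LIKERT_SCORES = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neither agree nor disagree": 3,  # the neutral category
    "Agree": 4,
    "Strongly agree": 5,
}

def mean_likert(responses):
    """Average the numeric weights of a list of Likert responses."""
    scores = [LIKERT_SCORES[r] for r in responses]
    return sum(scores) / len(scores)

responses = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree"]
print(mean_likert(responses))  # 4.0
```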

    Semantic differential scale

    In a question using a semantic differential scale, the ends of the scale are labeled with contrasting statements. The scales can vary, typically using either five or seven points.

    Open-ended questions

    Questions where there are no specified answer choices. These are particularly helpful for collecting feedback from your participants about their attitudes or opinions. However, these questions may require extra time or can be challenging to answer, so participants may skip the questions or abandon the survey. In addition, the analysis of open-ended questions can be difficult to automate, and may require extra time or resources to review. Consider providing extra motivation to elicit a response (e.g., “Your comments will help us improve our website”) and ensure there is enough space for a complete response.

      Example: What are two ways we could have improved your experience with our agency today? We take your feedback very seriously and review comments daily.

    Avoid these common question design pitfalls

    Asking two questions at once (double-barreled questions)

      Example: How satisfied are you with the hours and location of our offices? [ 1=very dissatisfied, 5=very satisfied]
    You won't be able to tell whether the participant is responding about the hours or the location, so ask these as two separate questions.

    Leaving out a response choice

      Example: How many times in the past month have you visited our website? [ 0 1-2 3-4 5 or more]
    Always include an option for "not applicable" or "don’t know", since some people will not know or remember, and if they guess, their answer will skew the results.

    Leading questions

    Based on their structure, certain questions can “lead” participants to a specific response:

      Example: This agency was recently ranked as number one in customer satisfaction in the federal government. How satisfied are you with your experience today? [ 1=very dissatisfied, 5=very satisfied]
    The first statement influences the response to the question by providing additional information that leads respondents to a positive response, so you should leave that text out.

    Built-in assumptions

    Questions that assume familiarity with a given topic:

      Example: This website is an improvement over our last website. [ 1=strongly disagree, 5=strongly agree]
    This question assumes that the survey participant has experience with the earlier version of the website.

    Tips for technology-based surveys

    Skip logic or conditional branching

    When creating technology-based surveys, skip logic can be helpful. Skip logic enables you to guide participants to a specific follow-up question based on a response to an earlier question. This technique can be used to minimize non-relevant questions for each participant, and to filter out survey participants. For example, if you only want U.S. citizens to fill out certain parts of your survey, anyone who answers “no” to the question “Are you a U.S. citizen?” can be skipped ahead to the next relevant section.
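    The citizenship example above can be sketched as a branching table: each question names the next question to ask for each possible answer. This is a minimal illustration (the question IDs and structure are hypothetical), not DSC's IVR implementation:

```python
# Each question maps an answer to the ID of the next question to ask.
SURVEY = {
    "citizen": {
        "text": "Are you a U.S. citizen? (yes/no)",
        "next": {"yes": "citizen_section", "no": "general_section"},
    },
    "citizen_section": {"text": "Citizen-only questions...", "next": {}},
    "general_section": {"text": "Questions for all participants...", "next": {}},
}

def next_question(current_id, answer):
    """Return the ID of the next question, based on the participant's answer."""
    branches = SURVEY[current_id]["next"]
    return branches.get(answer)  # None means the survey path ends here

print(next_question("citizen", "no"))  # general_section
```

Because the routing lives in data rather than code, adding or reordering branches only changes the table, which makes the flow easy to review against the survey design.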

    Call Us Today

    Contact DSC to learn more about our IVR survey phone systems and services.
