
60 good survey questions: Types, examples, and tips


To collect meaningful and actionable survey insights, you need a solid foundation: good survey questions. This may seem obvious, but it’s pretty important to your overall survey success.

Well-written survey questions solicit accurate and actionable insights that can guide product development, direct marketing campaigns, and improve customer journeys.

Learn how to write good survey questions for your next research project.

Good survey questions are clear, neutral, unbiased, and relevant to your research objective.

Survey questions are crucial to any survey, as they help gather valuable insights and data from respondents. Crafting effective survey questions is essential for obtaining accurate and actionable information.

Good survey questions are relevant to your goals and objectives. For example, some help you understand customers' satisfaction with your company, products, or service. The Net Promoter Score® (NPS) question, “On a scale of 0 to 10, how likely is it that you would recommend this company to a friend?” is an excellent survey question if you’re measuring customer satisfaction and loyalty.

However, an employee NPS (eNPS) survey question would be a better fit if your objective is to measure employee engagement. 
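As a quick illustration of how responses to the NPS question are typically turned into a score, promoters (ratings of 9 to 10) are counted against detractors (0 to 6), while passives (7 to 8) only count toward the total. Here is a minimal Python sketch using hypothetical ratings:

```python
def nps_score(responses):
    """Compute a Net Promoter Score from a list of 0-10 ratings.

    Promoters (9-10) add to the score, detractors (0-6) subtract from it,
    and passives (7-8) only affect the denominator.
    """
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return round(100 * (promoters - detractors) / len(responses))

# Hypothetical sample of responses to the NPS question
ratings = [10, 9, 8, 7, 9, 6, 10, 3, 8, 9]
print(nps_score(ratings))  # 30 (50% promoters - 20% detractors)
```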

Combining qualitative open-ended questions with quantitative closed-ended questions can enhance survey insights by providing detailed personal feedback and measurable data. By understanding the various types of survey questions and how to use them effectively, you can create surveys that yield valuable insights and drive informed decision-making.

Your research objective is the difference between a “good” and a “bad” survey question. Here are 60 good survey question examples organized by situation, including customer satisfaction, employee engagement, and post-event feedback.

Good customer service survey questions can accomplish two things: solicit honest customer feedback and reinforce your company brand, because they’re clear and straightforward. Here are 12 customer service questions to jumpstart your customer support survey.

  1. How satisfied are you with the overall customer service experience?
  2. How did you contact our customer support team?
  3. How easy was it to reach our customer service?
  4. How satisfied are you with the response time of our customer support team?
  5. How would you rate the professionalism of our customer support team?
  6. How knowledgeable did you find our customer service representatives?
  7. How well did our customer support team resolve your issue?
  8. What could our customer support team have done better?
  9. How likely are you to recommend our customer support to others?
  10. How likely is it that you would recommend our company to a friend or colleague?
  11. Do you have any other comments or suggestions for our customer support team?
  12. Please provide any additional comments or suggestions you may have.

Employee-centric survey questions deserve the same care as customer-facing questions. Employees are just as likely as customers to abandon a survey or straightline their answers.

These responses can dramatically impact the accuracy of your employee engagement and satisfaction data and mislead employee initiatives. Avoid this outcome with these Likert scale and dichotomous employee survey questions to gauge employee engagement and satisfaction.

  1. I am satisfied that I have opportunities to apply my expertise.
  2. My organization is dedicated to my professional development.
  3. I am satisfied with my overall job security.
  4. How easy is it to get help from your supervisor when you want it?
  5. How understanding are your coworkers?
  6. How reliable is your supervisor?
  7. When I speak up at work, my opinion is valued.
  8. How well does your supervisor facilitate your professional growth?
  9. How realistic were the expectations that were set for you?
  10. Overall, how fairly were you treated?
  11. How involved are employees in setting the company’s objectives?
  12. How well do the members of your department work together to reach a common goal?

Good UX surveys use various question types to understand the customer experience with your product. Likert scale questions gather quantitative data, while open-ended questions allow you to understand customer sentiment.

Use these UX survey questions to evaluate user experience and product usability. Keep in mind that you may want to accompany the survey question with high-resolution images.

  1. How would you rate the overall user experience of our product?
  2. How would you describe our product in one word or sentence?
  3. How easy was it to navigate through our product?
  4. How visually appealing do you find our product?
  5. What features do you find most useful? Select all that apply.
  6. What do you like most about our product?
  7. How responsive is our product?
  8. What improvements would you suggest for our product?
  9. How likely are you to recommend our product to a friend or colleague?
  10. How satisfied are you with the user interface of our product?
  11. How satisfied are you with the ability to collaborate with other users on the website?
  12. How often do you use our product?

Post-event feedback questions are deployed immediately after an event. For this reason, questions are often closed-ended and concise: respondents will likely complete the survey on mobile devices, and short, simple questions encourage high response rates.

Consider these survey questions after an event to collect feedback from participants.

  1. Overall, how would you rate the event?
  2. Overall, were you satisfied or dissatisfied with the event?
  3. How could future events be improved? Select all that apply.
  4. Was the event length too long, too short, or about right?
  5. How well did the event meet your expectations?
  6. How likely are you to recommend this event to a friend/colleague?
  7. Prior to the event, how much of the information that you needed did you get?
  8. How useful was the information presented at the event?
  9. How helpful was the staff at the event?
  10. How likely are you to attend the event again in the future?
  11. How would you rate the vendors at the event?
  12. How clearly was the information presented at the event?

Closed-ended questions are among the most accurate ways to gather quantitative information. These questions limit a respondent’s options to pre-selected multiple-choice answers, making them easier to analyze. You aren’t restricted to closed-ended questions, though. In many cases, as with the NPS survey question, it is best practice to follow up with an open-ended question to understand the respondent’s answer.

Here are examples of quantitative survey questions for a variety of research objectives.

  1. How likely is it that you would recommend this company as a place to work to a friend?
  2. How likely is it that you would recommend this company to a friend or colleague?
  3. How easy was it for you to complete this action?
  4. Overall, how would you rate your purchase experience today?
  5. How would you rate this employee’s performance?
  6. When was the last time you used this product category?
  7. Will you be attending the event?
  8. Overall, how well does our website meet your needs?
  9. Overall, how would you rate your purchase experience today?
  10. How would you rate each aspect of your experience?
  11. How likely are you to purchase any of our products again?
  12. In a typical week, how often do you feel stressed at work?
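Because closed-ended answers come from a fixed set of options, they are straightforward to tally, which is part of what makes them easy to analyze. Here is a minimal Python sketch using hypothetical responses to a purchase-experience rating question:

```python
from collections import Counter

# Hypothetical responses to "Overall, how would you rate your purchase experience today?"
responses = ["Excellent", "Good", "Good", "Fair", "Excellent", "Good", "Poor"]

counts = Counter(responses)
total = len(responses)
for option, count in counts.most_common():
    print(f"{option}: {count} ({100 * count / total:.0f}%)")
```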


Different types of survey questions, such as multiple choice, rating scale, and open-ended questions, each serve unique purposes and can provide a comprehensive understanding of the respondents’ opinions and experiences. 

Consider the following question types as you scrutinize your survey questions and their ability to gather the information you need to succeed. Nominal questions, for instance, present respondents with multiple answer choices that do not overlap, ensuring clarity and precision in responses.

Dichotomous questions are binary “yes” or “no” survey questions, restricting respondents to two straightforward answer options. This type of survey question is quantitative and can be followed up with open-ended questions.

For example, a human resources professional may ask a new hire if they received a benefits package presentation.


Ranking questions ask respondents to organize answer choices in order of preference. Ranking questions can be a fun and interactive activity. They can also gather insights for niche needs.

A product manager may use a ranking question to ask customers to rank product features. Survey responses can help product managers prioritize feature updates.

A semantic differential scale question asks respondents to rate their attitude toward a topic. The two scale endpoints are opposites.

For example, if a product usability question asks, “How easy was it to use this product feature?” the two answer endpoints would be “very easy” and “very hard.”

Matrix questions group several questions into rows that share the same set of response options. Likert scale questions or rating scale questions work well in a matrix format.

For example, a customer experience professional may ask customers to rate, on a scale of “very satisfied” to “very dissatisfied,” their satisfaction with the product, customer support team, and onboarding.


Dropdown questions are closed-ended, multiple-choice questions that allow for a long list of responses. Keep in mind that a long list of dropdown answers may not display well on mobile devices.

Slider questions are interactive, quantitative survey questions. Respondents answer questions on a numerical scale, making it easy for survey makers to aggregate and analyze data.

Image choice questions allow respondents to select images as answers. This type of question works great when you want respondents to evaluate visual qualities. For example, you can use this question for logo testing and user interface testing.

Constant sum questions require respondents to distribute a fixed number of points or percentages across answer options so that they add up to a total sum.

For example, a survey question may ask, “Using 100 points, please assign points to each factor based on how important it is to you when using our product.”
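Constant sum responses are typically validated so the allocations actually add up to the required total. A minimal Python sketch of that check, assuming a 100-point budget and hypothetical factor names:

```python
def validate_constant_sum(allocations, total=100):
    """Return True if the point allocations add up to the required total."""
    return sum(allocations.values()) == total

# Hypothetical allocation across product factors
answer = {"Price": 40, "Ease of use": 35, "Customer support": 25}
print(validate_constant_sum(answer))  # True: 40 + 35 + 25 == 100
```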

Side-by-side questions allow you to ask multiple questions in a condensed format. Like matrix questions, this type of question will enable you to evaluate several aspects of a subject in a straightforward format.

Star rating questions are another way to let respondents evaluate a statement on a visual scale. The scale comprises stars, hearts, thumbs, smiles, or other niche visuals that help survey makers measure respondent sentiment. No matter the visual, each is assigned a weight so a quantitative score can later be aggregated.
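As a rough illustration of that weighting, here is a minimal Python sketch that treats each star as one point and averages a set of hypothetical ratings:

```python
# Hypothetical 5-star ratings, where each star carries a weight of one point
star_ratings = [5, 4, 4, 3, 5, 2, 4]

average = sum(star_ratings) / len(star_ratings)
print(f"Average rating: {average:.1f} out of 5")  # Average rating: 3.9 out of 5
```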


Want to hear from survey respondents in their own words? You’ll need an open-ended question requiring respondents to type their answers into a text box instead of choosing from pre-set answer options.

Since open-ended questions are exploratory, they invite insights into respondents’ opinions, feelings, and experiences. Good open-ended questions will often dig into all three and serve as follow-ups to previous closed-ended questions. 

Multiple choice questions are the most popular survey question type. They allow your respondents to select one or more options from a list of answers you define. They’re intuitive, help produce easy-to-analyze data, and provide mutually exclusive choices. Because the answer options are fixed, your respondents have a more effortless survey-taking experience.

Likert scales are a specific type of rating scale. They’re the “agree or disagree” and “likely or unlikely” questions that you often see in online surveys. They’re used to measure attitudes and opinions. They go beyond the simpler “yes/no” question, using a 5- or 7-point rating scale that goes from one extreme attitude to another.

For example:

  • Strongly agree
  • Agree
  • Neither agree nor disagree
  • Disagree
  • Strongly disagree
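If you later want to analyze Likert responses quantitatively, one common approach (though not the only one, since the data is technically ordinal) is to code each scale point as a number and summarize the results. A minimal Python sketch, assuming a 5-point agreement scale and hypothetical responses:

```python
# Map each scale point to a numeric code (5 = most positive)
LIKERT_CODES = {
    "Strongly agree": 5,
    "Agree": 4,
    "Neither agree nor disagree": 3,
    "Disagree": 2,
    "Strongly disagree": 1,
}

# Hypothetical responses to a single Likert question
responses = ["Agree", "Strongly agree", "Neither agree nor disagree", "Agree", "Disagree"]

scores = [LIKERT_CODES[r] for r in responses]
print(f"Mean score: {sum(scores) / len(scores):.1f}")  # Mean score: 3.6
```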

You already know that good survey questions are clear, neutral, unbiased, and relevant. But the question remains—how do you write good survey questions? Here are five tips you can employ right now to write good survey questions for accurate data collection.

Avoid leading questions that encourage respondents to give a specific answer.

You know the type. “Since you love our product so much, how likely are you to recommend it to a friend?” Or “Tell us about how the conference changed your professional life.”

These survey questions assume the respondent’s experience, inserting bias into the question.

Speak a language your respondents understand, and save the jargon for colleagues.

Think of it this way: How do you expect respondents to answer to the best of their ability when they don’t understand the question? Of course, respondents have the option to follow up or do their own research. However, they are more likely to fill in a random answer or abandon the survey, negatively impacting your data accuracy.

Simplify your survey question language. And if you must use industry-specific language, define it for the respondents.

  • Pretesting: Test your survey questions with a small group of respondents to ensure they are clear and effective. Pretesting helps identify any issues or biases in your questions before the survey is distributed to a larger audience. This step is crucial for refining your questions and improving the overall quality of your survey.
  • Refining: Based on the feedback you receive from pretesting, refine your survey questions to ensure they are clear, specific, and neutral. Make any necessary changes to improve the clarity and effectiveness of your questions. This iterative process helps create a more reliable and accurate customer satisfaction survey.

Double-barreled questions, which ask for a respondent’s opinion on two things at once, are a recipe for confusion and flawed data.

For example, “How would you rate our customer service and product reliability?” combines a customer service question with a product reliability question. It is not clear which one the respondent should answer.

This may lead the respondent to skip the question, provide an answer that doesn’t reflect their true opinion, or abandon the survey altogether. 

To avoid double-barreled questions, refer to your research objective and ensure that each question addresses only one topic. In addition, distribute your survey to colleagues before sending it to your target audience. They can help identify issues that may not be evident to you.

  • Neutrality: Ensure that your survey questions are neutral and do not influence respondents’ answers. Avoid using leading questions or assumptions that may bias respondents’ responses. For example, instead of asking, “How much do you love our product?” you could ask, “How satisfied are you with our product?” Neutral questions help gather unbiased and honest feedback.
  • Clarity: Ensure that your survey questions are clear and easy to understand. Avoid using jargon or technical terms that may confuse respondents. For example, instead of asking, “How would you rate the UX of our product?” you could ask, “How easy is it to use our product?” Clear questions help respondents provide accurate answers, leading to more reliable data.
  • Specificity: Make sure your survey questions focus on a particular aspect of customer satisfaction. Vague questions can lead to ambiguous answers that are difficult to analyze. For instance, instead of asking, “How do you feel about our service?” you could ask, “How satisfied are you with the response time of our customer service team?” Specific questions provide more precise insights into customer satisfaction.

Good survey questions are critical to market research. The right questions will solicit actionable data that directs product development, marketing campaigns, and CX initiatives to success.

The good news is that you don’t have to be a survey expert to write survey questions. SurveyMonkey has 400+ expert-written survey templates and prebuilt forms that you can customize to your specifications.

NPS, Net Promoter & Net Promoter Score are registered trademarks of Satmetrix Systems, Inc., Bain & Company and Fred Reichheld.
