8 Real-life Customer Survey Examples
Are you looking for customer survey examples to improve how you collect customer feedback?
Let’s face it: sometimes the quickest way to learn how to design customer surveys is to study brands that have already nailed their feedback surveys.
So, in this article, you’ll find 8 inspiring real-life customer survey examples from companies big and small, and see exactly how they do it.
But let’s take a step back, and gain some info on the types of customer surveys you’ll see in this article.
The 3 Types Of Customer Satisfaction Surveys
Customer satisfaction surveys, as the name suggests, are surveys used to figure out whether or not the customers are satisfied with your product or service (and why). However, there’s no one survey type to accurately uncover each aspect of customer satisfaction at each point of the customer journey. So instead, you’ll have to run multiple types of surveys.
That said, these three are the most popular survey types for understanding user satisfaction levels:
1. CSAT
A customer satisfaction score (CSAT) survey uncovers how satisfied customers are with your product, service, or the overall experience you provide. Typically, these surveys are sent to gauge customers’ attitudes towards your business as a whole or to measure satisfaction at crucial interaction points (such as customer service, sales chat, or billing).
For example, here’s a CSAT survey to understand customer satisfaction with your product in general:
Once people start responding to this survey, the CSAT survey tool calculates the CSAT score in real time, so by the end of the survey you’ll know exactly how satisfied or unsatisfied your customers are.
If you use a manual method (like sending a Google Form to collect customer feedback) instead of a dedicated tool (like Refiner), this is the formula to calculate your CSAT score:
CSAT score (in %) = (Number of satisfied customers (i.e., people who choose 4 or 5 on the 5-point CSAT scale) / Total number of survey responses) x 100
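If you’re crunching those numbers yourself (say, from a Google Form export), a small script can apply the same formula. Here’s a minimal sketch; the response values in it are made up purely for illustration:

```python
# Manual CSAT calculation from 1-5 scale responses.
# The response values below are invented purely for illustration.

def csat_score(responses):
    """Percentage of respondents who chose 4 or 5 on the 5-point scale."""
    satisfied = sum(1 for r in responses if r >= 4)
    return satisfied / len(responses) * 100

responses = [5, 4, 3, 5, 2, 4, 5, 1, 4, 5]
print(f"CSAT score: {csat_score(responses):.0f}%")  # CSAT score: 70%
```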
Related Read: What is a good CSAT score?
2. NPS
NPS (or Net Promoter Score) surveys measure customer loyalty by asking how likely customers are to recommend your product or service. [NPS is a registered trademark of Bain & Company, Inc., Satmetrix Systems, Inc., and Fred Reichheld.]
Just like CSAT survey questions, you can ask different NPS questions at different points to see the impact of specific interactions on customer loyalty.
And of course, you can send a general NPS survey, like this one:
To calculate the Net Promoter Score manually, use this formula:
NPS = [(Number of promoters – Number of detractors) / (Total number of respondents)] x 100
Here, promoters are the happy and loyal customers who choose 9 or 10 on the NPS scale. On the opposite side are the detractors, who show their unhappiness by choosing between 0 and 6.
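Again, if you’re calculating this manually, here’s a minimal sketch that applies the formula to a made-up list of 0-10 scores:

```python
# Manual NPS calculation: promoters score 9-10, detractors 0-6.
# The scores below are made up for illustration.

def nps(scores):
    """Net Promoter Score for a list of 0-10 responses."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) / len(scores) * 100

scores = [10, 9, 7, 8, 6, 10, 3, 9, 8, 10]
print(f"NPS: {nps(scores):.0f}")  # NPS: 30
```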
Find out more: How to evaluate NPS
3. CES
Customer effort score (CES) surveys reveal how easy or difficult it is to use/interact with your product or service and its features. These surveys are generally used to make UI/UX improvements.
Here’s a common CES survey question companies ask (typically through an in-app widget) after a user interacts with a new feature:
To find the customer effort score (CES), use this formula:
CES = Total sum of responses / Number of responses
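And once more for manual calculation: CES is simply the average response. The 1-7 scale and sample values below are assumptions for illustration; use whatever scale your CES survey actually offers.

```python
# Manual CES calculation: the average of all effort-score responses.
# The 1-7 scale and the sample values are assumptions for illustration.

def ces(responses):
    """Average effort score across all responses."""
    return sum(responses) / len(responses)

responses = [6, 7, 5, 6, 4, 7]
print(f"CES: {ces(responses):.1f}")  # CES: 5.8
```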
The 5 Types Of Customer Satisfaction Survey Questions (With Examples)
All the customer survey examples you’ll see below use one of the following five question types, or a combination of them (along with the three survey types mentioned above):
1. Binary-scale question
These are yes/no questions, wherein users have to choose from two opposite options to answer.
Example Question: Are you satisfied with our customer service?
Example Options: 👍 or 👎
2. Rating (or ordinal) question
The user has to give a rating on a scale of 1-5 or 1-10 to answer a rating question.
For example, Amazon sends a rating question to understand more about their sellers:
3. Likert scale question
A Likert scale is a five- or seven-point scale with options ranging from one extreme attitude to the opposite extreme. Users choose the option that best describes their attitude toward the question asked.
For example, here, users can choose anything between “Very unsatisfied” and “Very satisfied”, depending on how much they like the product:
4. Multiple-choice question
In multiple-choice questions, you give users several predefined, labeled options and can let them select more than one as their answer.
For example, this survey question asks the user’s job role for customer segmentation and gives multiple options:
5. Open-ended question
Open-ended questions give users the needed space to express their opinion in-depth.
For example, let’s say someone gives a rating of three or less on your “How satisfied are you with the recent customer support interaction?” question. Then, you can add a follow-up open-ended question like, “How can we make your customer support experience better?” and give the user some space to type their answer.
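Most survey tools expose this kind of branching as “conditional logic” or “skip logic” settings, but the underlying rule is simple. Here’s a minimal sketch of it; the function name and the “3 or lower” threshold on a 1-5 scale are illustrative assumptions, not any specific tool’s API:

```python
# Illustrative branching rule for a follow-up question; not a real tool's API.

def followup_for(rating: int) -> str | None:
    """Return an open-ended follow-up question for low satisfaction ratings."""
    if rating <= 3:  # unsatisfied on a 1-5 CSAT scale
        return "How can we make your customer support experience better?"
    return None  # happy respondents skip the follow-up

print(followup_for(2))  # -> the open-ended follow-up question
print(followup_for(5))  # -> None, no follow-up shown
```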
33 (Specific) Customer Satisfaction Survey Questions To Ask Your Customers
Below are some of the questions you can consider asking while designing your customer surveys. I’ve grouped them by goal to make things easier for you.
Note: Refer to our “How to Survey Your Users” article in case you don’t understand the context behind the questions. I’ve discussed them in-depth there.
To make customer segmentation easier:
- What is your industry?
- How big is your team?
- What is your job title?
- Which describes your role at your current job best?
To identify high-value accounts:
- How many hours do you save per week using our product?
- How many colleagues are using our product?
- Who else (job titles) do you think would benefit from our product?
- What product did our solution replace?
To see if your product has achieved product-market fit:
- How would you feel if you could no longer use our product?
- What type of people do you think would most benefit from [Product Name]?
- How can we improve [Product Name] for you?
- What is the main benefit you receive from using [Product Name]?
Follow-up questions if your product hasn’t reached product-market fit yet:
- What would make you change your answer?
- What features are missing from our product?
- If you had to replace our product with an alternative, what would that be and why?
To determine the ideal price for your product:
- At what price point would you consider our product too expensive?
- At what price point would you feel that you’re getting the best value for money from our product?
To collect feature feedback:
- Which of the following product features are most important to you?
- How important is [Feature Name] to you?
To find UX gaps and determine CES:
- What were you trying to accomplish with our product today? Did you manage to do it?
- On a scale of 1-5, how easy was it to accomplish your goal?
- Was there anything that you didn’t manage to accomplish today?
To find your most effective marketing channel:
- How did you hear about us?
- Would you tell your connections/friends about our product?
If yes, follow up with “How would you describe it? What channel would you use?”
To collect reviews:
- How likely are you to recommend our product to a friend or colleague?
- On a scale from 0-10, how happy are you with our product?
- How does our product help you in your day-to-day work?
To prevent inactive users from churning:
- Hey [user], I’ve noticed you haven’t tried [product] yet. Can you tell us why?
- What can we do to make our product/your experience better?
- Does [product] live up to your expectations? If not, can you tell us why?
To understand why users stopped using your product:
- What’s the primary reason for canceling your account/subscription?
- Why did you initially sign up for our product? What happened since then?
- How can we make our product better? What features would make you give it a try?
Whenever you use any of the above questions, pick the survey type and question type that fit it best.
Tip: Use the question type that’s easiest to answer to get the maximum possible number of responses. For example, if a multiple-choice question can get you a precise enough answer, don’t use an open-ended question.
6 Inspiring Real-life Customer Satisfaction Survey Examples
1. Amazon
Unless you live under a rock, you know what Amazon is and how it collects reviews and feedback.
However, you might not know of the Amazon Associates Program – Amazon’s affiliate program that incentivizes influencers and professional affiliate marketers to bring sales to Amazon. It’s one of the largest affiliate programs in the world, and Amazon continuously improves it with feedback surveys like this:
Step 1: A page describing everything about the feedback survey:
Step 2: An NPS survey question to determine the affiliate’s loyalty towards the program:
Step 3: A follow-up question to understand the affiliate’s reason for giving that score on the NPS scale:
Step 4: A CSAT question to gauge user satisfaction:
Step 5: A rating-scale question to get more details behind users’ satisfaction or dissatisfaction:
Step 6: A question to see if Amazon can provide better resources and help content to affiliates:
The screenshots above cover only about half of the questions in the feedback form; there are many more pages in the survey gathering the details needed to improve the Amazon Associates Program.
2. Livestorm
Livestorm is a video engagement platform used to run online meetings, webinars, and virtual events. It’s one of the many happy Refiner users and has automated its user feedback collection, as shown in this chart:
Here’s one example of how Livestorm uses surveys: if a customer has been on a paid plan for more than 15 days, Livestorm sends them an NPS survey right after a webinar ends (i.e., at exactly the right moment). Then, if the customer gives a score above 8, Livestorm asks them to leave a review on G2 and/or Capterra, because promoters are the most likely to leave a review, and a positive one at that.
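Here’s a rough sketch of that trigger logic, just to make the flow concrete. All the names (Customer, send_nps_survey, ask_for_review) are made up for illustration; in practice, Refiner’s targeting and follow-up settings handle this without custom code:

```python
# Illustrative sketch of the Livestorm-style review workflow described above.
# Customer, send_nps_survey, and ask_for_review are placeholder names, not a real API.
from dataclasses import dataclass

@dataclass
class Customer:
    email: str
    days_on_paid_plan: int

def send_nps_survey(customer: Customer) -> int | None:
    """Placeholder: the survey tool collects the 0-10 score asynchronously."""
    ...

def ask_for_review(customer: Customer, platforms: list[str]) -> None:
    """Placeholder: e.g., email the customer links to the review sites."""
    ...

def on_webinar_ended(customer: Customer) -> None:
    # Only customers who have been on a paid plan long enough get surveyed,
    # right after a webinar ends.
    if customer.days_on_paid_plan > 15:
        score = send_nps_survey(customer)
        # Only promoters (9 or 10) are asked to leave a public review.
        if score is not None and score > 8:
            ask_for_review(customer, ["G2", "Capterra"])
```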
3. Upwork
Upwork is one of the top freelancing platforms. And I’d argue one of the reasons it’s at the top is that it acts on its users’ feedback, collected through surveys like this:
In this particular survey, Upwork asks freelancers on its platform a series of rating-scale questions to understand what they value most about a job. For example, the ‘How valuable are guaranteed hours per week?’ question will likely get a high rating because guaranteed work matters a lot to freelancers looking for stability.
4. Namecheap
Namecheap is a web hosting provider known for its affordable hosting and helpful customer support. At the end of each customer support session, Namecheap sends this feedback survey to collect a CSAT score and an NPS rating from the user:
The survey also has a comments box to understand why the user gave the rating they did.
5. SaaStock
SaaStock holds conferences, events, and meetups for people involved in B2B SaaS. It’s one of the many Refiner users that nail customer surveys.
At the end of each meetup, SaaStock sends all guests this NPS survey to measure attendees’ loyalty and the likelihood that they’ll bring another guest to the next conference:
And, of course, there’s a follow-up question to collect feedback on how they can make the attendees’ experience better.
6. Apple
Apple regularly collects customer feedback to stay ahead of the competition and improve its products. For example, in 2014, Apple sent a 38-page survey to iPhone users to understand actual customers’ opinions of Android. The survey included typical segmentation questions, CSAT and NPS questions, and in-depth questions like this to get more insights:
A Couple Of Examples Of Bad Customer Surveys
1. Using a (bad) CSAT question for an NPS survey
Twitter user Mike Sharp shared this actual customer survey he received:
The survey asks the customer, “How satisfied are you with our team member?” This simple survey has three major mistakes:
- Firstly, the question itself is vague and poorly worded.
- Secondly, it follows a typical CSAT question format, yet the answer scale runs from 0 to 10, which is characteristic of NPS surveys.
- Lastly, while asking users to answer a CSAT question on a 0-10 scale isn’t ideal, it’s technically workable. The real problem is that the scale labels, “Not at all likely” to “Extremely likely”, make no sense for a satisfaction question.
2. Bad UX
Twitter user Anthony Schulzetenberg shared this survey by ShopRunner:
Unlike the previous example, the question type, the question quality, and the answer options are perfect.
However, the way it’s formatted is not ideal from a user experience viewpoint. As Anthony aptly pointed out in his tweet, ‘Net Promoter Scores (NPS) are often a central metric for company success. The presentation of this one-question survey impacts how customers respond. This example = 😬’.
All options (0-10) should sit on a single line whether the user opens the survey on a desktop or a mobile device. For example, here’s how Refiner-generated surveys look on mobile:
Conclusion
- In this article, I shared examples of what good and bad customer surveys look like.
- I also shared some general customer survey questions you can ask your customers.
Just based on these two, you can create your first few customer surveys.
However, there are many best practices to consider and A/B tests to conduct for making your customer surveys ideal in every aspect.
For now, though, take inspiration from the customer survey examples above to build your first survey or improve your current ones. Good luck!