Are you looking for best practices for running customer satisfaction surveys?
Let’s face it – when you implement any strategy for the first time, you often miss the mark. There’s always something you could have done better – or something you shouldn’t have done at all.
The good news? You don’t have to try to figure out what that something is on your own. You could simply use other people’s experiences.
Here at Refiner, we know a few things about customer satisfaction surveys. So if you are new to feedback surveys and want to know how to run them more effectively, you can learn from us.
This guide will certainly get you started. You’ll discover the most important best practices to build and run effective customer satisfaction surveys.
But let’s get something out of the way first.
Why You Absolutely Must Follow Customer Satisfaction Survey Best Practices
Before we get to the good stuff, let’s briefly cover why following those best practices is so important. And let’s do it in a slightly unconventional way – by focusing on what happens when you fail to follow them.
And there are two major negative outcomes you can expect:
Low response rate
Known fact – a typical customer survey gets a response rate of between 10% and 30%.
So, if you send a well-designed customer survey to 1000 customers, only 100-300 people will actually respond.
However, if you fail to create an effective feedback survey, you might only get a 5% response rate or lower. And that, as I’m sure you can imagine, isn’t enough data to help your SaaS reach its customer survey goals. Here’s why:
- You won’t be able to determine the overall customer satisfaction level, because mostly only your biggest fans or harshest critics will bother filling out your surveys.
- You won’t be able to pinpoint and prioritize dissatisfaction factors. The thing is, only 1 out of 26 unhappy customers complain. You design surveys to get the rest to share their opinions. However, you won’t even get most of them to respond with poorly-designed surveys, let alone get their thoughts.
- You won’t know which features to add if you’re running the survey for product development.
- Your customer support team won’t have enough data to make changes, and so on.
Inaccurate or poor answers
You can increase the response rate with good survey design, timing, and presentation. But even if you do, the higher number of responses doesn’t mean that you’ll get better responses.
Why? Because the quality of responses depends on other customer satisfaction survey best practices (for example, the survey length or crafting effective questions).
So even if you get many responses, a lot of them may be unhelpful if you don’t follow the best practices for getting insightful answers.
In short, if you follow the below best practices, you’ll get a high response rate and actionable answers.
So, let’s see what those best practices are.
10 Customer Satisfaction Survey Best Practices
1. Choose the right survey type
There are many different types of surveys. Each of them can help you uncover different aspects of customer experience.
So, how do you know which one to run? Well, it all depends on your survey goals.
Let me explain that with the three most common customer survey types: CSAT, NPS, and CES.
A customer satisfaction score (CSAT) survey helps you understand how satisfied customers are with your product, service, or overall experience. You can use this survey type to measure customer satisfaction at crucial customer interaction points (for example, after customer service, billing, or sales interactions).
Here’s an example of a CSAT survey:
Depending on what respondents choose, the CSAT survey tool calculates a CSAT score using this formula:
CSAT score = (Number of satisfied customers (i.e., people who selected option 4 or 5 on a 5-point scale) / Number of survey responses) x 100
And depending on the score, you’ll know the customer satisfaction levels.
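If you want to double-check the math, here’s a minimal Python sketch of the CSAT formula above. The sample ratings are made up purely for illustration:

```python
def csat_score(ratings):
    """Compute a CSAT score (%) from 1-5 ratings.

    Respondents who chose 4 or 5 count as satisfied.
    """
    satisfied = sum(1 for r in ratings if r >= 4)
    return satisfied / len(ratings) * 100

# 6 of 8 respondents chose 4 or 5 -> CSAT of 75.0
print(csat_score([5, 4, 4, 3, 5, 2, 4, 5]))
```

In practice, your survey tool computes this for you; the sketch just makes the “satisfied / total × 100” logic concrete.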
Find out more: What is a good CSAT score?
NPS (or Net Promoter Score) surveys help measure customer loyalty and customers’ likelihood to recommend your product or service to a friend or colleague. NPS is a registered trademark of Bain & Company, Inc., Satmetrix Systems, Inc., and Fred Reichheld.
Here’s an example of an NPS survey:
Depending on what respondents choose, the NPS software calculates NPS or Net Promoter Score using this formula:
NPS = [(Number of promoters – Number of detractors) / (Total number of respondents)] x 100
Here, promoters are the respondents who choose 9 or 10, and detractors are those who choose between 0 and 6. Respondents who choose 7 or 8 (passives) count toward the total but toward neither group.
This score determines your customers’ loyalty levels.
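As a quick sanity check, the NPS formula can be sketched in Python like this (the sample scores are hypothetical):

```python
def nps(scores):
    """Compute Net Promoter Score from 0-10 ratings.

    Promoters: 9-10. Detractors: 0-6. Passives (7-8) count
    in the total but in neither group.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return (promoters - detractors) / len(scores) * 100

# 4 promoters, 2 detractors, 10 respondents -> NPS of 20.0
print(nps([10, 9, 9, 10, 8, 7, 7, 5, 6, 8]))
```

Note that NPS can be negative (more detractors than promoters), which is exactly the case where immediate action is needed.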
Find out more: How to evaluate NPS
Customer effort score (CES) measures the ease of using/interacting with your product or service and its features.
Here’s a common CES question asked after a customer uses a feature in SaaS:
Depending on what respondents choose, the CES software calculates CES or Customer Effort Score with this formula:
CES = Total sum of responses / Number of responses
The final score uncovers how easy or difficult the users find interacting with your product or service.
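And here’s the CES calculation as a short Python sketch – a plain average of the responses, shown with made-up values on a 1-7 scale:

```python
def ces(responses):
    """Compute Customer Effort Score: the average of all
    responses, typically on a 1-7 agreement scale."""
    return sum(responses) / len(responses)

# Average of six responses on a 1-7 scale
print(round(ces([7, 6, 5, 7, 4, 6]), 2))
```

A higher average means customers find the product or feature easier to use.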
2. Choose the right survey questions
Like in the examples above, the first survey question shouldn’t demand much of the customer’s time, so that as many people as possible start filling out the survey.
Pick one of these frictionless question types for your first question:
- Binary-scale question: The user has to choose between two options (Yes or No, 😀 or 😠, 👍 or 👎, etc.).
- Rating (or ordinal) question: The user has to give a rating on a scale of 1-5 or 1-10.
- Likert scale question: A Likert scale is a five- or seven-point scale on which users choose a response between two extreme attitudes. For example, one end of the scale can be ‘highly unsatisfied’ and the other ‘highly satisfied.’
Once you attract users with a frictionless question and get their initial response (necessary for the survey score), ask follow-up questions to understand that response. For that, you can use one of these question types:
- Multiple-choice question: Give users multiple options to choose from. For example, if someone responds positively to a customer service-related CSAT question, you can follow up with “What did you like most about the customer service?” and offer options like ‘fast response,’ ‘fast resolution,’ ‘agent’s attitude,’ etc.
- Open-ended question: This question type gives users the freedom to express their thoughts in-depth.
3. Construct clear and specific questions
All your questions should be clear and specific. For example, “How would you rate the support you received?” is a) a clear CSAT question that anyone can understand and b) a specific question that demands only one answer (the customer support was good, average, or bad).
Besides, you must not ask more than one question at a time. Otherwise, it becomes hard for respondents to answer. For example, “How would you rate your customer service experience, and would you recommend us to others?” packs two questions into one, so customers can’t quickly process and answer it.
Here are a few other elements of a good question:
- There’s no bias in the question. For example, “How much did you like our customer support?” implies that the customer liked your customer service experience.
- The question doesn’t make assumptions about the respondent. For example, “How would you rate the support you received?” doesn’t assume the customer talked to a support agent. It’s only sent after a support session.
- There are no grammatical errors in the question.
4. Limit the number of survey questions
The survey length heavily impacts both the response rate and the quality of responses. It’s simple logic: almost no one is willing to spend ten-plus minutes filling out a long survey. Even if you tie an incentive to completing the survey, most respondents won’t give full and honest answers – they’ll just try to get it over with.
Rhonda G. Kost and Joel Correa da Rosa researched the impact of survey length on completion rate, response rate, and reliability. They sent three types of surveys – ultrashort, short, and long – and found:
- Response rates were 64% (Ultrashort), 63% (Short), and 51% (Long), respectively.
- Completion rates were 63%, 54%, and 37%, respectively.
5. Send surveys at the right time
Sending your surveys at the right time can positively impact the response rate.
Of course, send action-triggered surveys right after the action. For example, send the survey asking about the customer support experience right after the support query is resolved.
However, send general surveys at times when customers will notice them without feeling interrupted. For example, if you send surveys via email, sending them at the right time increases open rates – and thus response rates.
6. Send surveys frequently
There are two reasons to send your surveys frequently:
1. People want to fill out the survey, but they forget because of other priorities.
For such people, send the same survey after some time.
For example, here’s an email from Namecheap asking to rate their services (on Nov 11):
And here’s a reminder email a week later:
2. People’s opinions change (if you make response-based changes).
Therefore, you should send surveys frequently (every quarter, 6 months, or year) to gather the freshest opinions of your customers.
7. Send surveys through the right channel
Where you send your survey also impacts the response rate. Choose from these options based on the survey type and the action involved:
- In-app popup: You can display the survey in an in-app widget. For example, if you are a SaaS business, you can pop up a CES survey for customers who use your new feature.
- Email: Email is perhaps the most popular option to send customer surveys. For example, you can send email surveys after a customer calls your support team.
- Live chat: You can also send the surveys via live chat (after a live chat conversation, in most cases).
- Survey pages: And, of course, you can create separate survey landing pages and share links with your customers.
8. Offer survey respondents a bonus (optional)
If you follow all the above-mentioned best practices, you’ll likely get a great response rate and helpful responses.
However, sometimes even the best practices don’t cut it, or a specific customer segment simply won’t respond. In such cases, you can try offering an incentive to those who complete the survey. For instance, you can offer an extended free trial, a freebie, or even a monetary reward.
Note: Make sure respondents know you’re asking for their honest opinion and that the reward is not a bribe to say positive things.
9. Follow up with respondents
As soon as possible, follow up with survey respondents and close the feedback loop. Unfortunately, this step is often ignored, which leads customers to think they wasted their time filling out your survey.
Here are a few options to follow up, close the feedback loop, and ensure that the customers feel valued:
- Thank the customer for taking the time to fill out your survey.
- Apologize to customers who aren’t satisfied with your product or service.
- Acknowledge the problem and share your action plan to solve the problem.
- Follow up again once you’ve taken the said action.
10. Put your findings into action
The last and the most crucial customer satisfaction best practice is to take action based on survey responses. Otherwise, there’s no point in running the surveys.
Here’s how you can put your survey results into action:
Step 1: Determine warning levels
First, find out whether the problem requires immediate action. For example, if the CSAT survey score is 70%+, the warning levels are pretty low. Therefore, you can act passively on customer problems as you find them while looking through the survey responses. On the contrary, if CSAT is below 40%, the warning levels are high, and customer problems need immediate attention – or you risk losing those customers.
Step 2: Look for patterns
Of course, you can’t go customer by customer and make changes based on each piece of feedback. Instead, you need to look for patterns and find common problems that, if solved, will lift the survey score in the next iteration.
There are two ways to look for patterns:
- Look for patterns across customer segments: You look for patterns across your customer segments based on their geographic, demographic, technographic, firmographic, value-based, or usage-based data.
- Look for patterns across responses: If there’s a significant problem, there will be common complaints about it. Prioritize problems by how frequently their keywords appear in the responses.
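To illustrate the keyword-frequency idea, here’s a minimal Python sketch. The responses and the keyword list are hypothetical, and a real analysis would likely use stemming or a text-analytics tool rather than plain substring matching:

```python
from collections import Counter

# Hypothetical open-ended survey responses
responses = [
    "The dashboard is slow and exports keep failing",
    "Export to CSV failed twice this week",
    "Love the product, but the dashboard is slow",
]

# Problem keywords you want to track (an assumption for this sketch)
keywords = ["slow", "export", "crash", "pricing"]

counts = Counter()
for text in responses:
    lower = text.lower()
    for kw in keywords:
        if kw in lower:
            counts[kw] += 1

# Most frequent problems first
for kw, n in counts.most_common():
    print(kw, n)
```

The resulting ranking gives you a rough priority order: fix the problems most customers mention first.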
Step 3: Make an action plan and act
Consult all the relevant teams, consider everyone’s opinions, and make an action plan. Then, act and solve the problem. And don’t forget to update the customer about it to gain those loyalty points.
Running customer surveys is the best way to uncover how customers feel about your brand. Simply put, here’s how customer surveys work:
Step 1: You ask customers a question
Step 2: The customers answer it
Practically, it’s not that simple, though. You need to take care of a lot of things to get the maximum number of actionable answers. And hopefully, this guide helped you figure out what those things are.
Now, only one thing remains: try all of the best practices above. Good luck!