In-app Survey Response Rates (Benchmark Data 2022)
Are you planning to launch an in-app survey but aren’t sure whether it’s worth the effort? Wondering what sort of response rates to expect?
Well, below you’ll find more information about that than you might have hoped for.
You see, I know that one of the biggest objections to conducting surveys of any kind is the potentially low response rate. Nobody wants to spend time designing a survey, coming up with questions, building the widget, and setting up various display conditions, only to collect a handful of responses at most.
The thing is, not all survey types perform the same. Some surveys might deliver fewer responses than others, true. But there are also others that perform incredibly well.
In-app surveys are a relatively new thing in SaaS. Sure, we’ve been seeing them popping up for a while. But I notice a growing interest around them now.
And so, together with the team, we empirically evaluated their potential.
You can see our research findings below. (Spoiler, in-app surveys are awesome!)
You’ll find out what the typical response and completion rates of in-app surveys are. We’ll also show you what effect the length of a survey has on those metrics, and discuss whether placement can affect the results you get.
Before we get to that, though, let’s cover some basics, shall we?
What Exactly is an In-app Survey?
In simplest terms, an in-app survey is a survey that you embed directly in your page or app.
Instead of sending a survey via email and presenting it outside of your product, you trigger the survey widget for users who are in your app at that very moment.
The beauty of these surveys is that they allow you to ask product-related questions in context, while the person is actually using your app. You’ll soon see how much difference this one factor alone makes to a survey’s performance.
But let’s not get ahead of ourselves…
Why are in-app surveys gaining such popularity?
I probably already answered that question above when talking about being able to survey customers in context. But there are other reasons worth considering:
- Laser-focused targeting. Let’s face it: with this type of survey, it’s virtually impossible to reach someone who isn’t a customer. As a result, with in-app surveys, you can be absolutely sure that whatever insights you collect represent your actual customers’ opinions.
- Simpler customer feedback. In-app surveys help your users too. They can respond to a survey quickly without breaking their process or having to go to another screen or app to do so.
- Speed of collecting feedback. Again, because the survey happens in an app, you collect feedback almost instantly. The customer doesn’t have to make a conscious decision to switch from what they were doing to complete the survey.
What can you use in-app surveys for?
In-app surveys, or microsurveys, as they’re also called, are simply a method to ask product users anything you want. For that reason, you can use them to measure customer satisfaction with a CSAT survey, run an NPS survey in-app to measure how likely customers are to recommend you to others, or collect feedback on any other issue or factor that you want to research further.
This company, for example, runs an in-app survey to evaluate their users’ experience with a particular feature – the main dashboard within the tool.
Another popular use case for in-app surveys is user research. For example, our customer Livestorm uses in-app surveys to better understand who their users are and what they are looking to achieve with their tool.
But great as in-app surveys might seem after reading this, do they really work?
Let’s find out.
In-app Surveys Response Rates Research – Methodology
To make our research as statistically significant as possible, we decided on certain ground rules for evaluating in-app survey performance.
For this research, we evaluated the performance of the last 500 in-app surveys created with Refiner that passed the following criteria:
- The widget had to have generated at least 100 survey views during the original survey run,
- The survey had to collect at least 50 responses during its run,
- The survey could not use logic jumps to dynamically adjust its length for each individual user.
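As a sketch, the selection criteria above can be expressed as a simple filter. The field names here are illustrative assumptions, not Refiner’s actual data schema:

```python
# Hypothetical survey records; field names are illustrative, not Refiner's schema.
surveys = [
    {"id": 1, "views": 250, "responses": 80, "uses_logic_jump": False},
    {"id": 2, "views": 90, "responses": 60, "uses_logic_jump": False},    # too few views
    {"id": 3, "views": 400, "responses": 30, "uses_logic_jump": False},   # too few responses
    {"id": 4, "views": 1200, "responses": 300, "uses_logic_jump": True},  # excluded: logic jumps
]

def qualifies(survey):
    """Apply the three ground rules from the methodology above."""
    return (
        survey["views"] >= 100
        and survey["responses"] >= 50
        and not survey["uses_logic_jump"]
    )

sample = [s for s in surveys if qualifies(s)]
print([s["id"] for s in sample])  # only survey 1 passes all three criteria
```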
In total, the 500 surveys we evaluated generated 7,963,216 views and 1,344,251 survey responses.
A few more notes to clarify different terminology we used in the research:
- A survey is a “web app survey widget” launched within a web application where users are identified by a unique ID.
- A survey view refers to a survey displayed for a user in the app.
- The response rate is the percentage of survey views where the user answered at least one question in the survey.
- The completion rate is the percentage of survey views where the participant answered all questions in the survey.
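Put concretely, the two metrics are simple ratios over survey views. This is a minimal sketch with example numbers of our own; the function names are not part of any specific tool:

```python
def response_rate(views, responses):
    """Share of survey views where at least one question was answered."""
    return responses / views * 100

def completion_rate(views, completions):
    """Share of survey views where every question was answered."""
    return completions / views * 100

# Illustrative example: 1,000 views, 250 partial-or-full responses, 230 full completions
print(f"{response_rate(1000, 250):.2f}%")    # 25.00%
print(f"{completion_rate(1000, 230):.2f}%")  # 23.00%
```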
Our Results
Part 1. Response and completion rates
Let’s tackle the big question first – Do in-app surveys generate any satisfactory results?
Our research findings suggest a resounding yes.
Take a look at the data below. Surveys in our sample set of 500 generated a whopping 25.25% average response rate. Completion rates fared just slightly below that, at 23.04%.
This close tie between response rates and completion rates makes sense. In-app surveys are usually short, with 2-3 questions. There is little friction due to length, so the majority of users who start a survey complete it entirely.
But these were averages. Let’s dig deeper into the data to see the exact breakdown of response rates.
The majority of web-app survey widgets we evaluated in our sample set had response rates somewhere between 15% and 25%. That said, many of them generated response rates above 30% as well.
Part 2. Survey length
Let’s focus on length now. It’s a known fact that the longer the survey, the fewer responses you can expect to receive. Not every participant will want to, or be in a position to, dedicate the time and effort to completing it, after all.
Unsurprisingly, our research also found that completion rates drop with each additional question in the survey.
Now, most of the survey widgets we reviewed for this research had no more than three elements. As you can see from the results below, surveys with 1-2 questions achieved the highest completion rate by far. From then on, the completion rates began to drop, with an interesting exception of a 4-question survey.
(Worth noting: Refiner has a logic-jump feature that allows companies to personalize the survey flow and dynamically change the survey’s length. To avoid any inconsistency in the data, we excluded all surveys using that feature from this research.)
Part 3. Survey placement
The final factor we evaluated is the effect of survey placement – the exact location where you display it in the app – on the response rate.
Our initial hypothesis was that the more prominent the survey is, the more likely customers are to respond to it.
We evaluated the performance of the most popular in-app survey placements to uncover which one converted the most.
And here are our findings.
As you can see, the central modal performed best, generating a 39.9% completion rate, while surveys placed at the top right of the screen fared the worst. It’s also worth noting that the second-best performing placement, the top-center slide-in, was also the least used survey placement in our data set. This fact may have contributed to its higher performance, and further research might be required to fully establish its actual conversion rates.
And there you have it…
… a research-based answer to whether in-app surveys work, and what sort of results you can expect from them.
What’s left now is to go on and start collecting product feedback with in-app surveys.
Good luck!