Because Thriday is an all-in-one platform, the onboarding experience needs to be taken seriously or customers won’t know how to get the most out of it.
People are joining Thriday without even really understanding how powerful it is. Some people are signing up not realizing that they get access to a full business transaction account, so there are all these things the feedback loop can help us improve.
To make the experience as smooth as possible, Justin decided to measure customer effort: first, right after registration, and then after each key feature (invoicing, bookkeeping, and so on) is used for the first time.
With invoicing, for example, a survey is triggered the first time they've used that feature. And then if they use invoicing again once 30 days are up, they'll get that survey again.
The goal of the recurrence? Being able to track improvements over time.
Hopefully a person that gave us a four will give us a five next time, because we’ve improved that part, or they’ve figured out how to use it properly. We ask the same people with the same surveys multiple times, but obviously with delays in between so we don’t annoy them.
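The trigger logic described above (survey on first use, then again only after a cooldown so people aren't annoyed) can be sketched roughly like this. The function name, the 30-day cooldown constant, and the shape of the timestamps are assumptions for illustration, not Thriday's actual implementation:

```python
from datetime import datetime, timedelta

# Assumed cooldown between repeat surveys for the same feature.
SURVEY_COOLDOWN = timedelta(days=30)

def should_trigger_survey(last_surveyed_at, now):
    """Decide whether to show the effort survey for a feature.

    last_surveyed_at: datetime of the last survey for this feature,
    or None if the user has never been surveyed on it.
    """
    if last_surveyed_at is None:
        return True  # first use of the feature: always survey
    # Repeat use: survey again only once the cooldown has elapsed.
    return now - last_surveyed_at >= SURVEY_COOLDOWN
```

Asking the same question on a fixed cadence is what makes the scores comparable over time: a four that becomes a five after a product change is a measurable improvement, not an anecdote.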
With all these surveys, the team can pinpoint exactly what needs improving instead of relying on vague feedback.
Our initial benchmark across the board for the customer effort score is to have a score above four for each of the features. But at a high level, I can look at that each day or each month as an aggregate right across all of the features and know that we're above that benchmark. And we are also able to look at it month over month. We can also do that at a feature level, because some features are more used than others, which skews the overall score.
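The skew Justin mentions is easy to see with a toy example: a heavily used feature can pull the overall average above the benchmark even while a lightly used feature is struggling. The scores and feature names below are made up for illustration; only the above-four benchmark comes from the source:

```python
from statistics import mean

BENCHMARK = 4.0

# Hypothetical month of effort scores (1-5), keyed by feature.
scores = {
    "invoicing":   [5, 4, 5, 4, 5, 4, 5, 4],  # heavily used
    "bookkeeping": [3, 3, 4],                  # lightly used
}

# Naive aggregate across every response: dominated by invoicing.
all_responses = [s for feature in scores.values() for s in feature]
overall = mean(all_responses)  # above 4, looks healthy

# Per-feature averages surface the weak spot the aggregate hides.
per_feature = {name: mean(vals) for name, vals in scores.items()}
below_benchmark = [n for n, avg in per_feature.items() if avg < BENCHMARK]
# bookkeeping averages ~3.3, well under the benchmark
```

Tracking the per-feature averages alongside the aggregate is what lets a team spot which specific feature to queue up for improvement work.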
When a score dips below the benchmark, the team sees it very quickly and can take corrective action right away.
If we see something that needs some work, then we’ll go on and do a couple of sprints on adding some more functionality or ironing out some little bugs or whatever it is we discover from customer feedback. And then once we’ve implemented those changes, we can then go and see how it changed the customer effort score to know we improved the feature in a measurable way.
While the current focus is on collecting feedback, the team already knows what the follow-ups will look like.
We have a plan in place as well for the people that give us really good feedback. We’re going to automate a follow-up to ask them if they can introduce us to any other small business owners as a referral play. But at this stage, it’s just collecting that feedback, improving the product based on the feedback, letting people know that we are working on the things that they need improving.