You know by now that creating and distributing customer satisfaction surveys can bring enormous benefits to your business.
In addition to measuring and determining specifics regarding customer satisfaction, customer surveys can:
- Allow you to improve specific elements of your product or service
- Improve your customer’s end-to-end experience with your company
- Improve your customer’s perception of your brand as a whole
Unfortunately, there’s no guarantee that your customers will respond to your survey. In fact, there’s a pretty good chance that most of the people you send your survey to aren’t going to respond.
Let’s look at some of the numbers:
- According to Pew Research Center, the response rate of telephone surveys conducted in 2012 was a mere 9% – a drastic decline from 1997, when the response rate was 36%.
- Average email survey response rates vary widely, ranging anywhere from 10% to 40%. The norm appears to be somewhere in the area of 20%.
- Surveys conducted through SMS and text message have the highest response rate, at around 50% (contributing to this is the fact that SMS open rates are close to 100%; it seems not many people can resist the urge to check a simple text message).
A quick side note: it’s difficult to measure the response rate of surveys housed on a company’s website, as they aren’t being “delivered”; rather, they’re simply being “presented.”
Survey response rates also vary widely across industries. This is due to a number of factors, such as a specific audience's receptiveness to surveys and the preferred medium through which companies in a given industry communicate with their customers.
However, Deborah Eastman of Satmetrix reports that:
B2B companies see response rates of 23-32%, while B2C companies tend to see response rates of only 13-16%.
While a number of factors can affect the response rate of your customer surveys, the fact is that any given survey could see a response rate of as low as 2% or as high as 85%.
The response rate for your next survey depends largely on how you go about creating, distributing, and collecting it.
Calculating Survey Response Rate & Completion Rate
Before we go into how, exactly, you can increase your survey response rate, let’s talk about the difference between response rate and completion rate – as well as why it’s important to keep both of these numbers as high as possible.
Survey Response Rate
Calculating survey response rate simply requires you to divide the number of people who responded to your survey by the number of people you actually sent the survey to, then multiply that number by 100.
If you sent a survey to 50 people, and 20 of them responded, your survey response rate would be 40%: (20/50) x 100 = 40%
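The calculation above can be expressed as a small helper function; the example numbers match the ones just given:

```python
def response_rate(responses: int, surveys_sent: int) -> float:
    """Return the survey response rate as a percentage."""
    if surveys_sent <= 0:
        raise ValueError("surveys_sent must be greater than zero")
    return responses / surveys_sent * 100

# 20 responses out of 50 surveys sent
print(response_rate(20, 50))  # → 40.0
```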
Recall that we mentioned the difficulty in determining the response rate of a survey housed on a website. This is because even if you know how many people have visited your site, there’s no way to tell how many of them actually saw or noticed your survey.
On the other hand, it’s easy to keep track of how many people you sent your survey to via email – as well as how many of those individuals responded.
A low response rate can also lead to inaccurate or unreliable results.
First of all, the smaller the sample size, the more difficult it is to extrapolate the results to your entire customer base. For example, if you have a customer base of 500 and only 25 people respond to your survey, the data you get back isn’t going to tell you much about your customer base as a whole.
Secondly, a low response rate may lead to non-response bias. In short, this means the responses you do receive may differ systematically from the responses you would have received from the customers who didn't respond.
Whatever the case may be, the lower your response rate, the less usable the data you do collect actually becomes.
Survey Completion Rate
Calculating the survey completion rate is just as simple as calculating the response rate. The difference is that the completion rate deals only with the individuals who engaged with your survey in one way or another and ignores those who were sent a survey but declined the opportunity to fill it out.
Survey completion rate is calculated by dividing the number of completed surveys (as in, surveys returned with every answer filled in) by the number of surveys returned as a whole, then multiplying by 100.
So, if you receive 50 surveys, but 10 of them are missing one or more answers, your completion rate would be 80%: 40/50 x 100 = 80%
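The completion-rate calculation can be sketched the same way, using the example figures above:

```python
def completion_rate(completed: int, returned: int) -> float:
    """Return the completion rate: fully answered surveys
    divided by all surveys returned, as a percentage."""
    if returned <= 0:
        raise ValueError("returned must be greater than zero")
    return completed / returned * 100

# 50 surveys came back, 10 of them missing one or more answers
print(completion_rate(40, 50))  # → 80.0
```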
A low completion rate will also contribute to inaccurate or unreliable responses and to non-response bias. Unanswered questions naturally shrink the sample size for those specific questions (rather than for the survey as a whole), but this can be just as detrimental to your analysis of the collected data.
A low completion rate may also be a sign that something is wrong with your survey.
Assuming that the individuals who began filling out your survey were actually interested in doing so (compared to those who ignored or declined your request in the first place), it's also safe to assume that those who abandoned your survey ran into one or more issues that caused them to do so.
Both your survey response rate and completion rate are important to pay attention to, as they can each provide hints that you need to improve some aspect of your survey.
Whether it's the actual content of the survey, your delivery of the survey, or something else entirely, these factors will ultimately get in the way of your ability to collect accurate, reliable information about your customers' experience with your brand.
Improving Survey Response Rates
Now that we understand exactly why having (at the very least) a decent survey response rate is so important, we can talk about what you can do to actually maximize the amount of – and quality of – the responses you receive.
Each of the following sections is structured the same way:
- First, we’ll discuss a problem or issue that’s typical of surveys that generate low response rates
- Then, we’ll detail a few things you can do to mitigate these issues
- Finally, we’ll discuss some of the pitfalls to avoid when attempting to fix these problems
Let’s get started.
You Created Your Survey As An Afterthought
This probably doesn’t apply to you, but, for the sake of argument, let’s say you really don’t understand just how powerful a tool customer surveys can be.
Perhaps you’ve thrown together some vague questions regarding your customers’ “level of satisfaction with the service they received” from your company. Maybe you’re just sending out a survey to act as if you care about your customers when in actuality you’re just paying lip service. Or, maybe you really do want to know what your customers think of your company – but you have no idea how to ask them.
First things first: get into the mindset that you absolutely will be asking your customers about their experiences with your company, and that you will use what you learn to improve specific aspects of your product or service.
Once you’ve done this, you’ll be much more equipped to ask your customers the questions that matter. In other words, you’ll be able to begin developing each one of your surveys knowing that the results will go toward improving a specific aspect of your business.
Additionally, make sure your audience knows (again, specifically) why you’re conducting the survey. By keeping your customers apprised of the goings-on within your company, you’ll make them feel valued – and will forge a deeper connection with them in the process.
However, in doing so, avoid putting too much pressure on your customers when asking them to complete the survey. If customers feel like the stakes are too high, they may either provide unreliable answers to certain questions – or may not respond to the survey at all.
Your Survey Targets The Wrong Audience
Another mistake that can be relatively easy to make is creating a single survey and sending it out to your entire customer base.
While it’s understandable that you’d want to get your survey in front of as many of your customers as possible, doing so might actually be detrimental to your purpose for sending out the survey in the first place.
Just as you might target different customer segments and personas with separate marketing campaigns, you should also develop individual surveys with a specific customer segment in mind. Piggybacking off of what we said in the last section, this will ensure that the survey questions you ask actually matter to the customers being asked – making their responses all the more valuable.
Once a survey is ready to be delivered, you have two options to make sure only the right audience completes it:
- Designate which customer segments you want to send the survey to yourself
- Create a pre-survey questionnaire to determine whether a respondent falls within the targeted segment. You can then use disqualifying logic to ensure only those who fit the criteria are able to submit a response
Another potential problem could arise if a respondent – despite falling into the correct customer segment – didn’t want to be sent a survey in the first place. To avoid such circumstances, you might want to solicit permission from your customers before you send out a survey. While you might categorize those who didn’t give permission as “non-respondents,” you can at least take comfort in knowing that, had these individuals felt obligated to respond, their answers likely wouldn’t have been valid anyway.
By the same token, you should also be careful not to assume that every response you receive is valid. Say, for example, you receive a survey in which every answer is maximally positive (or negative); while such a response might be genuine, there's also a decent chance that the respondent simply went "down the line" without really paying attention to each question being asked.
There are a few ways to avoid including invalid responses in your overall data. First, look for outliers, such as the example mentioned above. Second, consider measuring the time it took for customers to respond; a below-average response time is probably a sign the customer rushed through the survey. Lastly, add the option for respondents to provide open-ended responses to specific questions as they choose.
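As an illustration only (the field names and the 50%-of-average time threshold here are hypothetical, not taken from any particular survey tool), the first two checks above could be sketched like this:

```python
from statistics import mean

def flag_suspect_responses(responses):
    """Flag responses that look uniform ("straight-lining") or rushed.

    Each response is a dict with hypothetical keys:
      'answers' - list of numeric ratings (e.g. a 1-5 scale)
      'seconds' - time taken to complete the survey
    """
    avg_time = mean(r["seconds"] for r in responses)
    flagged = []
    for r in responses:
        uniform = len(set(r["answers"])) == 1   # every answer identical
        rushed = r["seconds"] < 0.5 * avg_time  # far below average time
        if uniform or rushed:
            flagged.append(r)
    return flagged

sample = [
    {"id": 1, "answers": [5, 5, 5, 5], "seconds": 30},   # uniform and fast
    {"id": 2, "answers": [4, 2, 5, 3], "seconds": 210},
    {"id": 3, "answers": [3, 4, 2, 4], "seconds": 180},
]
print([r["id"] for r in flag_suspect_responses(sample)])  # → [1]
```

Flagged responses shouldn't be discarded automatically; they're simply candidates for a closer look before you include them in your analysis.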
Your Survey Is Too Long
Be realistic – even the most loyal of customers aren’t going to want to (or be able to) spend an excessive amount of time answering dozens of questions.
And even if they do power through and complete your survey (despite losing interest halfway through), their responses may, again, not be valid or valuable to your organization.
In general, your surveys should err on the side of brevity.
As we’ve said before, you should only ask questions that actually matter to a specific respondent’s persona. Eliminate redundant questions, or those which, when answered, won’t provide much in terms of value. The quicker you can get to the point, the more likely your customers are to respond – and the more likely it is that their responses are valid.
Additionally, when delivering or presenting your survey to customers, give them a ballpark idea of how long completing the survey will take. This will weed out any respondents who, it turns out, don’t want to take the time to complete your survey (who probably would have provided unreliable responses had they done so). It will also allow those that do want to respond to pick a time in which they’ll be able to focus on nothing but completing the survey.
Another option is to implement a progress bar within your survey, showing respondents how many pages they have yet to go (or the percentage of the survey they’ve completed thus far).
One thing not to do is be brief for the sake of being brief. While you do want to get to the point as quickly as possible, you don’t want to exclude questions that could provide valuable insight into your customer’s experience with your service. In other words, when “trimming the fat,” as mentioned above, make sure you’re not accidentally cutting out the good stuff, too.
When To Send Your Survey To Customers
Another aspect of survey delivery relating to time is when you deliver it – both in general terms and in relation to specific customer engagements.
Delivering your survey at the wrong time can be detrimental for two reasons:
- It could end up getting lost in your customers' email inboxes
- Your customers’ responses may be unreliable due to the lapsed time between use of your services and survey completion
If you plan on delivering a survey to an entire audience segment at once, it’s essential that you know when these individuals will be most responsive to such (similar to how you might plan the best times to send out a weekly newsletter).
This optimum time varies by industry and customer persona; some customers might be most willing to respond at 9AM on Monday, while others might be more receptive at noon on Wednesday. You'll need to figure out which time works best for your customers.
Pro Tip: You can see when people are most active on your website using Google Analytics by following these simple steps:
- Log in to Google Analytics at http://analytics.google.com
- Go to the profile that contains your web site
- On the dashboard you’ll see a heat map showing how much traffic your website gets by day and time of day
- Find the busiest day and time of day and send your survey at that time
Here’s an example from our Google Analytics account:
As you can see, around 9am on Tuesday appears to be when our website gets the most visitors, so that would be the perfect time for us to send a survey. People will be in front of their computer and probably at work – the perfect environment from which to answer a survey.
If you want to deliver surveys to specific customers after they’ve engaged with your company (e.g., they’ve purchased a product and you want to ask them about their experience with it), you again need to determine the optimal time to do so.
Doing so requires you to consider how much time your customers typically need to get good use out of your product (and to begin seeing the desired results). The trick is to reach them after they've had ample time to use your product or service, but while the experience is still fresh in their minds.
However, you don’t want to send such a survey too soon after they’ve made a purchase (and haven’t had time to use your product to its fullest extent). If they haven’t yet seen results from using it, sending a survey prematurely might give them the idea that they should have seen results by this point – leading them to provide responses that aren’t exactly positive.
Or, they might complete the survey as best they can at the current time, only to leave out important information that they otherwise wouldn’t have had they received the survey a week later.
Your Customers Have No Reason To Complete Your Survey
This goes along with what we said earlier about ensuring your customers know your purpose for conducting your survey.
If your customers have no reason to complete your survey, they aren’t going to do it. You need to be clear about what your respondents will get in return (either immediately or eventually) for taking the time to fill out the survey in question.
Such “rewards” can come in two forms: intrinsic or extrinsic.
For reasons we’ll discuss in a moment, it’s better to focus on informing respondents about the intrinsic rewards of completing a survey. For example, the information they provide you will help you improve certain aspects of your products or services – in turn resulting in a better experience for your customers. With this information in mind, respondents will understand that it’s in their best interest to answer survey questions as honestly and completely as possible.
You might also decide to offer extrinsic rewards – such as coupons, discounts, and other deals – to customers who take the time to respond to your survey. Needless to say, this might make some customers more likely to complete the survey, as they’re guaranteed to get something in return for their efforts.
The downside to providing extrinsic rewards is, of course, that some individuals might complete the survey just to receive their reward. Those who approach the survey with this mindset may or may not actually take the process seriously – meaning you might end up with a bunch of invalid or otherwise unusable responses. You can avoid such instances by providing an incentive that, while extrinsic in nature, isn’t of any extreme value.
The Logistics Of Your Survey Aren’t Optimal
Lastly, if the logistics of your survey are off, it becomes difficult – sometimes impossible – for your customers to complete it.
Earlier on in this article, we discussed the importance of delivering your survey via the correct channel(s). It makes perfect sense: if you aren't making your survey available through your customers' preferred medium, they either aren't going to want to complete it or they aren't even going to notice it in the first place.
Furthermore, your survey needs to be easily understandable and easy to complete. If, for example, the answer choices provided are ambiguous (“Does 5 mean ‘strongly agree’ or ‘strongly disagree’?”), your respondents will immediately be confused – and, again, will either provide unreliable answers, or not provide any answer at all. Or, if a glitch (such as a page load error) arises, respondents probably aren’t going to spend much time trying to fix the issue.
So, to be clear, before you send out your survey, make sure that:
- You know which channels to deliver it by
- The survey’s instructions are clear and understandable
- You’ve tested your survey for bugs and other glitches
One thing to keep in mind, though, is to not provide so much guidance that your respondents become bored. While you definitely want to be sure your customers understand the meaning behind certain answer choices, you don’t want to spend too much time explaining things like how to fill out the survey and how to submit it once they complete it. Your customers are smart; give them what they need to know, then let them take care of the rest.
The Design Of Your Survey Isn’t Appealing
As consumers we’ve been trained by companies like Facebook, Google and Airbnb to almost expect beautiful design in the software, tools and websites we use.
If your survey is poorly designed, is hard to read, isn’t responsive for mobile devices or just generally looks stale, this will have a huge negative impact on both your completion and response rates.
The easiest way to make sure the design of your survey appeals to your customers is to use survey software with a focus on design and user experience. You can use survey software like SurveyKing or SurveyMonkey to create and send surveys with an appealing design.
Conducting surveys of your customer’s experience with your products or services is an efficient and effective way to know what your company is doing well and how you need to improve.
But before you’re able to collect and analyze the data that will help you make these improvements, you need to be sure that your customers actually respond to your surveys. You also need to be sure the answers you receive are honest, accurate, and reliable.
Keep this in mind as you create your next customer survey and you’ll hopefully see an increase in your survey response rate immediately.