Question

Topic: Research/Metrics

How To Deal With Item Non-response

Posted by Anonymous on 350 Points
We recently conducted a customer satisfaction survey in which we asked our customers to rate our various departments on specific services offered, as well as overall satisfaction.

Some respondents did not give ratings on some of the items (specific services). One item had a non-response rate as high as 44%.

How do we deal with this so that we can draw objective conclusions from the survey?

RESPONSES

  • Posted by CarolBlaha on Member
    Come on-- just because you put a survey out there, what is their reason to respond?

    Think of it from the customer's point of view-- and that is very revealing. Why aren't the customers responding? How top-of-mind are you for your customers?

  • Posted by Levon on Accepted
    You can tell a lot by a customer not answering a question. It might be something the customer does not want to discuss with you, or the question might not have made sense to the person taking the survey, so they didn't know how to answer it. Was it a poorly written question? If the answer is yes, rewrite it for the next survey.
  • Posted by Paul Linnell on Accepted
    Hi Sheila

    Perhaps the people who didn’t respond to some of your questions didn’t have enough experience of those departments to offer a response. This is good: it is better that they don’t respond to those questions than give you an arbitrary tick (usually in the middle box), which would give you a very suspect result. It is often safer to provide a “Don’t know” or “Not Applicable” option for this type of question.

    Anyway, for each question, you should focus your analysis on those answers you did receive and ignore the “null” responses.

    I hope that helps a little.

    Best regards

    Paul Linnell
  • Posted by Susan Oakes on Accepted
    Hi Sheila,

    Perhaps have a look at their other responses and see whether any trends emerge among those who didn't answer all the questions. You can do the same for the customers who answered all the questions.

    Although doing the above may not be statistically correct, you may be able to get at least some guidance.

    Best of luck
    Susan
    M4B
  • Posted by saul.dobney on Accepted
    If the customer is not giving you ratings, then they are probably telling you that they either have no experience of the service, or that it is not relevant to them. Many satisfaction studies ask questions that are important for the business, but that are completely irrelevant to the customer - I have experience of getting mad with surveys that failed to ask me the questions I wanted to answer.

    The key part of a satisfaction study is that it is a measurement of dissatisfaction so that you can fix things. So one view is that the customer should rate you on the areas the customer wants to rate you on, rather than the things you think you want to be rated on.
  • Posted by adammjw on Member
    Sheila,

    You got the best answers you could expect, given the scarcity of info you provided.
    I would only add that perhaps you should have customized your survey based on heavy or light use of specific services. Light users or non-users should be given different options to choose from than heavy users.
    If the population is not large, you could consider directly contacting the clients who didn't answer and asking them the reasons behind their non-response.
  • Posted on Accepted
    Sheila,

    In the future, you can build skip patterns into your survey so that respondents aren't asked to rate satisfaction with services they may not have experienced. Start off with a progression of measurements: awareness, experience with, satisfaction with experience.

    Also, a "not applicable" response option should be added for each item rated. Your satisfaction means then are calculated only on those who selected a rating, excluding the "NAs."

    In this case, however, I agree that you should look across the respondents and clean the data by eliminating those respondents who have skipped a lot of questions or given illogical answers (e.g., saying at the beginning that they were a user of a service, but then checking off "NA" for questions related to that service).

    When the data is clean, look at the specific questions. If you don't have enough respondents for a statistically significant analysis, you can look at what you do have and report it as directional only. It certainly can be telling, for example, if far fewer respondents gave ratings for a certain feature. This could mean that they don't know about it, or that they know about it but haven't used it. If this was a feature your firm has spent some resources on promoting, or if it's a recent addition, the message probably isn't getting through.

    Good luck
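
    The two steps above -- dropping respondents who skipped most items, then computing each item's mean only over the ratings actually given, with the per-item base reported alongside -- can be sketched as follows. All names and data here are hypothetical, for illustration only.

    ```python
    # Hypothetical responses: one list of item ratings per respondent,
    # with None marking an "N/A" or skipped item (1-5 scale).
    responses = {
        "r1": [5, 4, None, 3],
        "r2": [None, None, None, 2],  # skipped almost everything
        "r3": [4, 5, 3, None],
        "r4": [5, None, 4, 4],
    }

    MAX_MISSING = 0.5  # drop respondents who skipped more than half the items

    def missing_share(ratings):
        """Fraction of items this respondent left unanswered."""
        return sum(r is None for r in ratings) / len(ratings)

    # Step 1: clean out respondents who skipped too many questions.
    clean = {rid: r for rid, r in responses.items()
             if missing_share(r) <= MAX_MISSING}

    # Step 2: per-item means excluding the "N/A"s, with the per-item
    # base (n) so the report can flag items with a low response rate.
    n_items = len(next(iter(clean.values())))
    for i in range(n_items):
        answered = [r[i] for r in clean.values() if r[i] is not None]
        mean = sum(answered) / len(answered)
        print(f"item {i + 1}: mean {mean:.2f} (n={len(answered)} of {len(clean)})")
    ```

    Reporting the base alongside each mean makes the 44%-non-response problem visible in the results themselves, rather than hiding it behind a single average.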
  • Posted on Accepted
    In addition to the good advice you've already received, you should make a note of the lower response rate for the relevant questions (assuming, as pointed out above, that sample size will support statistical analysis) when you are preparing the report so that you don't mislead your audience.

    In the future, you might consider asking respondents to not only rate their satisfaction with various attributes, but also to rate how important that attribute is to them. That may also take away a lot of the guesswork in trying to figure out why they're not answering.

    Finally, if you are giving respondents an incentive, you can think about making questions mandatory in order for them to complete the survey and receive the incentive. You should still include an "N/A" option, though.
  • Posted by CarolBlaha on Member
    Good post above-- I'd also add a question I always include that I think says a lot: "Would you recommend this to another?" Why or why not?
  • Posted on Accepted
    When I fill out a survey and leave something blank, it's most likely because the rating options are so close together that I can't decide. When someone asks about your feelings with choices like "very strong, strong, so-so, don't care," that's not clear enough; maybe offer just three choices. For example: I do not use this service, I use this service occasionally, or I use this service a lot and it's great.

    Most people will not take the time for surveys if they have to read a lot of info or make lots of choices. We're lazy, what can I say.

    As they say in journalism, write like you're writing for a grade schooler; if it's too challenging, you'll miss the majority of your audience. Use the MISS method: Make it simple, stupid.

    It's always worked for me :)
