Going the Extra Mile – Exploring Survey Design: Case Study 4

In my last article, I took a break from my series of case studies on Exploring Survey Design so I could illuminate some alternatives for obtaining a rank ordering of a list of items (see http://www.surveygizmo.com/survey-blog/alternative-ranking-questions/). Now I am back with another case study on survey design.

A common use of surveys is to investigate new markets. I hear from people all the time who have new product ideas. They usually want to learn about the market’s reaction to their product before investing a lot of money, or they may need market data to solicit support from outside investors.

A New Type of Golf Shoe

A client came to Survey Design and Analysis to obtain data to test the acceptance of their new product and to use that data to solicit potential investors.

The product was a golf shoe with markings on it to help golfers line up their golf shot. Our client had already completed an in-depth study on golf swings and developed the markings based on recommended stances from successful professional golfers.

There were two pairs of shoes, one based on a stance recommended by David Leadbetter and Sam Snead and one based on a stance recommended by Ben Hogan. Hogan believed that the main reason most balls were not hit straight (a big deal in golf) was the golfer’s stance. In fact, he attributed 80% of improperly hit golf balls to an incorrect stance. This led to our client’s invention.

The client conducted some research with friends and family and the results showed that the shoes worked to improve scores. Now they wanted to get some comprehensive data before moving forward with a larger investment. We were engaged to design and execute the survey.

An In-depth Approach to Design

As I have said repeatedly on this blog, design is the most critical part of the survey process. If you aren’t asking the right questions in the right way, then nothing else about the survey matters much. In the case of investigating new product ideas, the design is especially difficult to get right because, by definition, you have no information to go on.

In this case, crafting the right questions for the survey meant understanding how golfers would view the shoes: what they would think of the idea, what they would like and dislike, and what suggestions they might have for making the new shoes better.

How often have you done a survey and only discovered afterward what you should have asked?

It’s almost as if we need to first do the survey in order to know what to ask and how to ask it. That’s it. That is the answer! Well, almost.

We designed a mini-survey of golfers using personal interviews (some over the telephone and some in person) in an open-ended format that allowed for two-way communication. The golfers could ask us questions while we explored their thoughts. This provided what we needed to design the “real” survey, the one that we would put online and send to a wider audience, i.e., hundreds of golfers.

The Mini-Survey

What group could better help us understand how golfers think, how they use golfing language, and which considerations matter most to them than golfers themselves? We recruited 18 golfers from among friends and family to participate in a “pre-survey.”

The objectives for the mini-survey were the following:

  • Probe a few golfers on golf shoe preferences to test questions and sequencing and to discover any unexpected issues
  • Test initial reaction to the performance-enhancing golf shoes
  • Develop hypotheses on buying behaviors
  • Probe for best distribution/influence channels for golfers

Here is a summary of the approach:

Interviews with 18 golfers

  • Thirteen were men, five were women
  • Ages ranged from 20-74
  • Golfers were from five different states
  • Golf scores ranged from the 70s to the 120s
  • Everyone saw a picture of the Leadbetter-Snead shoes
  • Some saw pictures of the Hogan shoes
  • Seven were interviewed in person; eleven by phone

Two golf pros were interviewed in person

  • This was an addition to the scope to discover any additional issues we might need to consider

During the interviews, ten participants agreed to test the concept by wearing the shoes while golfing and to sit for one more interview afterward. Having golfers actually test out the shoes was a unique opportunity that provided even more insight into how most players would perceive this new product idea.

Final Design

All interviews from both groups (those who wore the shoes and those who only saw pictures and descriptions of the shoes) lasted 30-90 minutes. Although the interviews provided a great deal of valuable information, it was still a small group: only 18 golfers. A larger sample was needed in order to provide solid information for further inference. The information collected from the interviews formed the basis for creating an online survey that was sent to a large number of golfers.
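To give a rough sense of why 18 interviews are too few for inference while several hundred online responses are far more useful, here is a minimal sketch, not part of the original study, using the standard 95% margin of error for an estimated proportion. The sample sizes are the 18 pre-survey interviews and the 836 online responses eventually received (reported below); the 95% confidence level and the normal approximation are my own illustrative choices.

```python
from math import sqrt

# Illustration only (not from the original study): the 95% margin of error for
# a proportion near 50% shrinks roughly with 1/sqrt(n), which is why a few
# hundred online responses support inference far better than 18 interviews.
def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated from n responses."""
    return z * sqrt(p * (1 - p) / n)

for n in (18, 836):  # 18 pre-survey interviews vs. 836 online responses
    print(f"n = {n:>3}: +/- {margin_of_error(n) * 100:.1f} percentage points")
# n =  18: +/- 23.1 percentage points
# n = 836: +/- 3.4 percentage points
```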

The process for developing the online survey proceeded as it would for any survey design: questions were formulated based on the survey objectives and the data from the pre-survey; a survey was developed and then put online for review. Our clients and even some of those who had been interviewed participated in the review of the online survey. Further edits were made and the survey was tested.

As I’ve emphasized over and over again, you only have one chance with the survey design. Once it has been sent out, you can’t bring it back for changes after you hear the reactions to it. The approach we had taken gave us assurance that the survey resonated with golfers, included the right questions, and didn’t miss any important aspects.

Clearly our approach required a lot of effort and time to get the design right. Was it worth all the effort?

The survey was sent to a rented list of “golf enthusiasts.” Rented lists like the one we used are notorious for low response rates, usually around 1-3%. In addition, when you use rented lists you are charged for emails sent, not for responses received. However, in this case, over 93% of those viewing the survey completed it. In all, 836 golfers responded to the survey, more than four times the expected response! The design was a huge success, which made the whole project successful.
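As a back-of-the-envelope illustration of those numbers (not a calculation from the original project), the sketch below assumes a hypothetical list of 10,000 email addresses, since the actual list size is not stated, and compares the 1-3% benchmark for rented lists against the 836 responses actually received.

```python
# Back-of-the-envelope check of the response figures, for illustration only.
# The actual rented-list size is not stated in the article; 10,000 emails is
# a purely hypothetical assumption.
list_size = 10_000                           # hypothetical list size (assumption)
benchmark_low, benchmark_high = 0.01, 0.03   # typical 1-3% response for rented lists

expected_low = list_size * benchmark_low     # 100 responses
expected_high = list_size * benchmark_high   # 300 responses
expected_mid = (expected_low + expected_high) / 2

actual = 836                                 # responses reported in the case study

print(f"Expected at 1-3%: {expected_low:.0f}-{expected_high:.0f} responses")
print(f"Actual: {actual} (about {actual / expected_mid:.1f}x the midpoint estimate)")
```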

We often get unsolicited comments on the survey itself, but this time we received more than we ever had before. Here is a sampling:

“A very attractive and easy to read survey.”
“Good survey…”
“Good survey. To the point. No duplication. Time line was right on.”
“I enjoyed doing this survey…”
“I really enjoyed taking this survey.”
“Nice and short survey [Note: the survey had 29 questions.] Nice that there is space for comments.”
“Nicely set up survey.”
“Thank you for letting me take your survey. It was a great use of my time.”
“Thank you. Interesting survey.”
“Very interesting survey”
“Very nice survey”

This example is a special case of a two-phased approach we have found to be very effective in certain situations. In my next blog article I will present this approach in more general terms and provide more details on its application.
