Exploring Survey Design: Case Study 1, Part I

As I mentioned in my previous post, over the next couple of months I’ll be putting together a couple of case studies that walk through the process of survey design and implementation. The first case study in this series covers my interaction with a company (actual name withheld for their privacy) and their need for some pricing research.

Pricing Survey Case Study

A leading price optimization and management solutions company, TriBlend Solutions, needs information to guide its marketing efforts. They plan to survey customers and prospects in order to better understand their market and their target audience.


TriBlend Solutions (TS) partnered with the Professional Pricing Society (PPS) to conduct the survey. TS had considered doing the survey themselves: they had a list of questions, and PPS had an account for running surveys. (This would change!)

The main players for this project were Lori, her co-worker Eric, and their manager Andy, all with TS. TS was looking for some help with the design and structure of their survey. Lori found Survey Design and Analysis through a web search, and that’s how she came into contact with me. Lori and her team had most of their questions written and suspected it wouldn’t take much time for someone to help them create the final survey.

Lori’s original request came on June 24. They planned to deploy the survey in the fall.

On June 27 I responded to Lori with a request for more information on the following:

  • Your survey objectives and reasons for starting this undertaking
  • The number of names (email addresses) on your survey list
  • Whether you’ve done previous studies that were similar
  • The proposed size of the survey (number of questions)

[The design process actually starts in the proposal phase!]

The information she provided was used to create a proposal. After several delays through July, we held a conference call on August 2 to discuss the details of the proposal.

On August 8 our proposal to design their questionnaire and make recommendations for executing the survey was accepted. A few days later the work was expanded to include the complete execution of the survey. Survey design always includes considerations for execution, and Lori realized that complete execution would be a natural extension of the design process.

Survey Design Process

A kickoff meeting (conference call) was scheduled for August 14. Prior to the meeting the following survey objectives were enumerated:

  • Learn what they can do better
  • Look at challenges, prepare for future
  • Assess the momentum in industry
  • Understand current pricing capabilities

Kickoff Meeting (conference call), August 14 – Ed with Lori, Andy and Eric

This first meeting lasted over two hours on the phone. It went long because it was productive. Everyone felt that we made good progress. There was that “ah-ha” moment when Lori and Eric realized that designing a good survey meant thinking through EVERY question completely. Comment from Lori:

“I know it was a long meeting but we’re so happy to be working with you. We’re really confident that in the end we’re going to have a fabulous survey.”

There were 35 questions on the original list, and we talked through each one: What are we trying to get from this question? Will it be ambiguous to the respondent? Is it clear? Here are a couple of examples from our discussions:

  1. Has your company’s pricing process been discussed or addressed by your company’s board, investors or financial analysts?
    Yes – answer 5 then go to 6
    No – go to 6
    What does “discussed or addressed” mean? Are they the same thing or different? And what does “pricing process” mean? Is the meaning the same for every company that will respond?
  2. What is their primary focus or directive?
    “Primary focus or directive”? Are we trying to get at “business goals”? The wording is too vague and will probably mean something different for each respondent.
  3. What functional group is driving the pricing improvement initiative?
    “Driving” or “leading”? Which are we looking for?
  4. Do you plan on implementing price management software?
    Is “implementing” the right word, or is “deploying” more appropriate for software?

This is a small subset. Every question (all 35) was discussed and evaluated from the viewpoint of the person who would be answering it. As I’ve mentioned on this blog before, it isn’t so critical that everyone understand every question. What is critical is that everyone has the same understanding of every question.

Questions need to be so clear that there can be only one way to view them.

Discussion about the wording of each question led to work on the structure of the survey. How can we order the questions so that they flow naturally and the information builds in a way that makes things easier for the respondent? Are background or qualifying questions needed to help the respondent see what is coming?

At the end of this first conference call, we realized that we needed an introduction and a few qualifying questions, not only to help the respondent but also to let us know who (what type of person) was answering the survey.

Our next steps were for TriBlend to put together definitions or explanations for the following terms (this list started out differently and changed several times over the course of the design process):

  • Pricing team
  • Pricing technology
  • Pricing process
  • Pricing tool
  • Pricing function
  • Pricing effectiveness
  • Price management software

We also determined that we needed to understand the “pricing activities” that typical companies go through. This would give us a basic understanding of whom we were dealing with and the perspective they would bring to answering the questions. TriBlend took on this task.

I created the next draft of the survey, which I’ll call version 4 (TriBlend had gone through a couple of revisions at the outset).

Version 4 included:

The following introduction:

“Thank you for participating in our pricing study. The information you provide will be used to better understand the state of the current technology, challenges and trends in our industry. These questions should take 10-15 minutes of your time. If you are interested in receiving a summary of the results of this study, be sure to indicate so on the last page of this questionnaire.”

A list of definitions to which the respondent could refer throughout the survey:

In what follows, several terms will be used. Please refer to these definitions as needed.

  • I. Pricing – Any and all activities involved with understanding and/or setting prices for a company’s products
  • II. Pricing Process – Any sequence of pricing-related activities
  • III. Pricing Tool – Any software, commercial or otherwise, used to support a pricing activity
  • IV. Price Management Software – A commercial pricing tool specifically designed to improve pricing effectiveness
  • V. Pricing Effectiveness – The degree to which your pricing processes and/or tools produce the desired results in terms of revenue, win rates, margins and predictability
And a set of introductory or qualifying questions:

  1. How familiar are you with the “pricing” at your company?
  2. What is your role with regard to pricing at your company? (Select all that apply)
  3. Overall, how would you rate the effectiveness of the pricing processes at your company?
  4. How would you rate the effectiveness of the pricing tools at your company?
  5. What method do you use to set and negotiate prices?
  6. Comments On Introductory Questions

[Note: the first five of these questions were multiple choice, but the choices have been left off for brevity’s sake.]

Version 4 was sent to Lori, Andy and Eric on August 18 for their review. I asked them to focus on the following points:

  • Addition and deletion of questions
  • Placement of questions in sections
  • Placement of sections
  • Section titles
  • Length of the survey and flow of the questions
  • Question applicability by type of respondent (managers versus workers)
  • Executive-only questions (i.e., do some questions apply only to executives?)
  • Understandability and definitions

As part of the next review we were able to focus on which questions could be answered by which type of customer or prospect. This led to adding branching and question skipping (and a chance for SurveyGizmo features to shine). We also expanded the review to include more people identified by Lori and Eric.
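
In SurveyGizmo, branching like this is configured point-and-click in the survey editor rather than in code, but for readers who like to see the mechanics, here is a minimal sketch in Python of how skip logic can be modeled: each answer maps to the ID of the next question to show. The question IDs, wording and branch rules below are a hypothetical simplification for illustration, not the actual survey’s logic.

```python
# Minimal sketch of survey skip logic, for illustration only.
# Question IDs and branch rules are hypothetical, not from the actual survey.

QUESTIONS = {
    1: {"text": "Has your company's pricing process been discussed or "
                "addressed by your board, investors or financial analysts?",
        # Map each answer to the next question; "Yes" unlocks a follow-up.
        "next": {"Yes": 2, "No": 3}},
    2: {"text": "What is their primary focus or directive?",
        "next": {None: 3}},          # unconditional: always go to question 3
    3: {"text": "Do you plan on deploying price management software?",
        "next": {None: None}},       # end of this branch
}

def next_question(current_id, answer):
    """Return the ID of the next question given the respondent's answer."""
    branches = QUESTIONS[current_id]["next"]
    # Fall back to the unconditional branch when the answer isn't mapped.
    return branches.get(answer, branches.get(None))

# Example: a "No" on question 1 skips the follow-up and lands on question 3.
assert next_question(1, "No") == 3
assert next_question(1, "Yes") == 2
```

The design point is the same one we applied on paper: every answer choice has to lead somewhere deliberate, so walking the branch table forces you to think through each question’s role in the flow.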

The survey was shaping up but it was a long way from being finalized and ready to launch, as we will see in my next blog entry. The survey still needed detailed refinement, deployment online, more reviews of the online version and testing before it would be ready to launch. As I continue this case study we will watch the survey go to version 12, see the final product and see how all the design work paid off.
