Reviewing Feedback from Customer Satisfaction Surveys

When we deliver the results of our customer satisfaction surveys we usually spend a whole day with the client’s senior team, going through the feedback and helping them to prioritise their next steps.

When I first joined InfoQuest back in 1999 the modus operandi was to spend a couple of hours with whoever had commissioned the survey (and perhaps one or two of their colleagues) going through the report, pointing out the blindingly obvious, drinking their coffee and walking off into the sunset with their cheque.  I really wasn’t happy with this process, although our clients didn’t seem to expect any more than that.  Then, one day, a client called Mike rang, asking me to help with his German team.  We’d been doing his company’s European customer satisfaction surveys every 18 months for a few years and the results weren’t going in the right direction.  Of course I said yes.  When I asked Mike how I could help, he asked me to go with him to Germany.

A couple of weeks later I caught up with him on the aeroplane and asked him what he wanted me to do.  He said “anything you want”.  I then asked Mike how long I’d got with the team.  (There were going to be about ten people at the meeting, all senior folk responsible for Germany, Austria, Switzerland, Eastern Europe and Russia, and I was expecting him to say half an hour, or maybe forty minutes.)  Mike told me that I’d got them from 08:30 until one o’clock, after which he needed a couple of hours with them himself.  Oops.  I had to prepare pretty quickly, and that’s how the InfoQuest post-survey workshop was born.

I drew up an outline sketch of the Brainstorm Scorer, which I’d developed in a former life, and asked one of the PAs to make a pretty version of it.  The group was split into small teams, each of which was asked to go through the latest report and come back to the main session with a minimum of ten ideas, based on the customer feedback, that would increase profitable sales.  Once all the ideas had been presented they were scored using the Brainstorm Scorer and then prioritised.  Both Mike and I found the workshop fascinating.

This was so much more effective than me going through the report, making assumptions, guessing at the answers and never getting to the depths of the issues.  I have to say thank you to Tom Peters for teaching me two lessons.  One was Toyota’s five “Whys” for getting to the root of an issue (keep asking “why” until you find out what the real issue is).  The other, again from Toyota, was the principle of building a programme of continuous improvement (Kaizen) out of a mountain of a thousand little ideas.

After the German workshop Mike asked me to run the same session around the rest of his European operations.  For him the reports took second place: what he found most useful was the prioritised action plan each of his teams put together.

Since then the workshops have been continually fine-tuned.  We now take a full day, the ideal number of participants is between 12 and 16, and each team is asked for twenty ideas rather than ten (the record for a single workshop currently stands at 165).  There is a whole chapter at the back of the sample customer satisfaction survey report that explains in detail how it all works.

Contact InfoQuest to start organising your survey
