
Book Review: The Ultimate Question

The business world is filled with an overwhelming number of questions and uncertainties. As statisticians analyze the uncertainties, the number of questions they ask seems to grow exponentially.

Business consultant and author Fred Reichheld thinks he has found the question that all companies need to ask in order to determine just how loyal their customers are – and he has humbly called it the ultimate question.

Reichheld talks about this ultimate question and what it should mean to you and your business in his 200-page book entitled The Ultimate Question: Driving Good Profits and True Growth. The book, first published in 2006 by Harvard Business School Press, primarily focuses on three key areas: the “ultimate question,” a scoring method called “Net Promoter,” and the importance of “good profits.”

The “ultimate question” is the simple and common question “How likely are you to recommend Company X to a friend or colleague?” Net Promoter is a scoring method that subtracts the proportion of detractors from the proportion of promoters. Good profits are simply profits that come from people who actually want to use your products and services (as opposed to those who might be locked into contracts or dissatisfied for one reason or another).
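
If it helps to see the scoring spelled out, here is a minimal sketch of the calculation in Python. It assumes the conventional 0-to-10 response scale, with 9s and 10s counted as promoters, 0 through 6 as detractors, and 7s and 8s as passives who are ignored; the sample ratings are made up for illustration.

```python
def net_promoter_score(ratings):
    """Compute a Net Promoter score from 0-10 'would you recommend' ratings.

    Promoters (9-10) minus detractors (0-6), expressed as percentages of all
    respondents, so the result always falls between -100 and 100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Hypothetical survey results: 5 promoters, 2 passives, 3 detractors
ratings = [10, 9, 9, 10, 9, 8, 7, 3, 6, 5]
print(net_promoter_score(ratings))  # (5 - 3) out of 10 respondents -> 20.0
```

Because both shares are percentages of the same pool of respondents, the score is bounded by -100 (everyone is a detractor) and 100 (everyone is a promoter).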

Like many things in customer service, the premise behind the book and the Net Promoter concept is laughably simple: if you deliver an experience that makes people genuinely want to recommend your company to their friends, family, or colleagues, you’re going to grow. Just like many business books, The Ultimate Question takes this relatively simple concept and adds strategically placed healthy servings of jargon, buzzwords, and acronyms to help justify the three hours and $20 that the book will cost. After the first 50 pages, the book starts to drag on and get redundant, but there are plenty of examples and tidbits to make it worth reading until the end.

With that said, I’d still recommend reading the book because it clearly articulates some very important aspects of business and customer service. Reichheld’s core points make sense and the examples he provides are interesting. After reading the book, any competent customer service manager or executive can easily conduct a Net Promoter survey and make use of the results. He clearly explains what Net Promoter is, why it should matter to your business, and how to make it work. Even though I don’t agree with Reichheld’s view that the “would you recommend” question is the only question that needs to be asked (I think you need more information than that), I still think it’s a great question to ask and that Net Promoter has its merits.

Net Promoter isn’t exactly new to the business world and that may very well be one of its biggest strengths. A whole host of companies in a variety of industries make use of Net Promoter and many of them are fairly transparent about their scores. It’s interesting to see what your Net Promoter score is and then compare that to some of the big companies in your industry. The average Net Promoter score is around 10 and it’s possible to have a score anywhere between -100 and 100.

I’ve conducted Net Promoter surveys for several companies and have always found the results to be useful when they are coupled with other questions. Net Promoter doesn’t tell you everything, but there is really very little to lose in asking your customers how likely they are to recommend your company to a friend or colleague. You might be in for a rude awakening, but you’ll almost certainly come out of the process knowing more than you did before. Once you have the results from your first Net Promoter survey, you’ll be faced with the true ultimate question, the question of how to improve.

Bottom line: Despite being slightly redundant, The Ultimate Question clearly articulates why customer loyalty matters and how to measure it. You may not agree with all of Reichheld’s points, but a majority of them make sense and are applicable to almost any business.

Pros: The book fully explains Net Promoter and why it matters. It provides a plethora of advice and action items that managers and executives can use to start tracking customer loyalty.

Cons: Some of Reichheld’s methods are more academic than they are practical, and the second half of the book gets annoyingly redundant.

Interested? You can purchase the book on Amazon.com for about $20. You can also see some of my other posts about Net Promoter here.

What’s the ideal response rate?

I am not a statistics expert. I know the basics: I can write reasonably balanced questions, I know which questions to ask to get the information I’m interested in, I know what is and is not a random sample, and I can use Excel to an extent where I can organize and make sense of the data. I am not a math guy, but I do find the numbers behind customer service interesting.

Most of the time, the numbers I get from surveys just confirm common sense and/or my original assumptions, but I also do learn things about the company and the customers in question just about every time. They’re worth doing, especially considering they cost relatively little and don’t take that much time.

However, a question that always plagues me is how high a response rate is enough. I’ve seen response rates for various surveys range from 3 or 4% (for long, annoying surveys) to more than 70% (for short, direct surveys). I’ve used incentives to increase response rates, and I’ve also run incentive-less surveys.

The most recent survey I ran for a client saw response rates increase by about 7 percentage points when a fairly reasonable incentive was added. The first time around, I worked with the client to run a short and simple survey that ended up yielding about a 13% response rate. Then we added an incentive, ran the exact same survey (sent to a slightly different group), and saw the response rate go up to about 20% (which I still thought was kind of low). The randomly selected person who won the giveaway was delighted, and I am confident that he is now a customer for life (despite the fact that he rated the company positively anyway).
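
As a side note on the arithmetic, the jump from 13% to 20% is an increase of 7 percentage points, which, for groups of similar size, works out to roughly 50% more completed surveys in relative terms. A quick sketch with round, made-up numbers in the same ballpark as the surveys above:

```python
def response_rate(responses, invitations):
    """Response rate as a percentage of survey invitations sent."""
    return 100.0 * responses / invitations

# Illustrative numbers only, roughly matching the surveys described above
baseline = response_rate(130, 1000)        # no incentive   -> 13.0%
with_incentive = response_rate(200, 1000)  # with incentive -> 20.0%

point_gain = with_incentive - baseline                          # 7.0 percentage points
relative_lift = 100.0 * (with_incentive - baseline) / baseline  # ~53.8% more responses
print(point_gain, relative_lift)
```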

Companies I have worked with have had survey response rates that are all over the place. Some companies just seem to have more responsive customers than others. Some surveys tend to attract more attention than others. A lot of it seems kind of random until you’ve done it a few times at the particular company.

With that in mind, is anyone aware of a good response rate that yields accurate enough results and is still manageable to obtain? I like to aim for about 30% in the surveys I do, but that is often a little bit too optimistic. What would you say is a good response rate and how would you go about achieving that?

Book Review: Delivering and Measuring Customer Service

Another book I recently finished reading was Delivering and Measuring Customer Service by Richard D. Hanks. The book focuses on two key aspects of customer service: actually delivering it and then getting real-time feedback that you can use to improve upon it. The book is relatively sparse on details about the delivering aspect and focuses much more intently on the importance of and the best practices for measuring customer service.

Author Richard Hanks told me he decided to write the book because he was frustrated with a lack of hands-on, practical books that addressed the topic of how to measure customer service. There were plenty of long, relatively boring “academic” type books on the subject, but he noticed a serious lack of “here’s how you do it” books. Thinking Delivering and Measuring Customer Service could help fill that gap, Richard set about writing down and summarizing what he learned from his work at Marriott Hotels, PepsiCo, and most recently, his survey company Mindshare. His perspective is a unique one that makes for an informative book that is also an interesting read.

Delivering and Measuring Customer Service talks a lot about the importance of real-time feedback and, subsequently, the importance of mastering “the boring, everyday.” As Richard explained to me, if you run a hotel that’s located in an exquisite location, provides great customer service, and has wonderful food, you’d think your customers are going to be pretty happy. They should be, but if you don’t master the “boring, everyday” things like having clean bathrooms or ensuring the light bulbs in the room work after each guest, customers are going to be frustrated. If the bathroom in the room is dirty, the customer isn’t going to leave happy, no matter how good the rest of the experience is.

The actual book, which is about 200 pages of pretty easy reading, is divided into seven primary sections: General Overview, Cultural Catalysts of Service, Gathering Customer Experience Feedback, Analyzing the Results, Using Customer Feedback to Improve, Customer Service Recovery and Follow-up, and finally, Tips and Tricks. Each section contains a few sub-sections that delve into specific areas. They’re generally well presented, well organized, and informative.

Perhaps most importantly for this type of book, Delivering and Measuring Customer Service gives plenty of good tips that managers can act on right away. I read the book with a highlighter in hand and found myself highlighting something that I thought was interesting or insightful once every few pages. Like most of customer service, a lot of the advice is common sense, but a vast majority of customer service managers will be able to get something useful from this book, particularly with the book’s focus on measuring customer service. Very few customer service books spend so much time on the importance of and how to measure customer service.

According to Richard, great customer service, and at the very least mastery of the “boring” stuff, stems from repetition: being consistent and dependable over and over again. To be a great customer service organization, you need to be able to provide great service all the time. Customers then start to expect great customer service and a standard is created. The ability to keep up with that standard is what sets the mediocre companies apart from the exceptional ones.

Bottom line: Delivering and Measuring Customer Service is a great book for those interested in the subject its name implies. It’s an easy and entertaining read that is full of useful advice, guides, and information that customer service or business managers can take back to their teams and start acting on right away.

Pros: Easy to read with clever cartoons scattered throughout the book, and more than enough useful insight and advice to justify the price and time.

Cons: The book tends to only touch on (as opposed to explain in detail) many areas and also happens to jump around. The lack of detail is both expected and acceptable given the book’s broad subject area, and the jumping around isn’t noticeable or important to those reading the book for its content, as opposed to its literary merit (which is how most business books should be read).

Interested? You can buy the book on Amazon.com for about $20.

Give the customer what they want.

A lot of companies design their customer service experiences around what they think the customer wants. However, companies rarely make an effort to find out what the customer actually wants. While the two (what companies think and what customers actually want) are often quite similar, there are always going to be differences and disparities.

Surveying customers is a really powerful tool. And you can extend the definition of surveys beyond a formal “check the appropriate box” type survey. Even casual conversations with customers about what they want and expect from the customer service experience are better than just assuming what they want and expect. The goal is to get feedback and ideas, as well as to understand what your customers are thinking.

For example, some companies will kill themselves trying to get response times under 10 minutes or hold times under 1 minute. These are goals that look nice in marketing material and on the resumes of customer service executives, but they may not always mean as much to your customers. It’s very possible that, after surveying, you find out your customers would much rather wait 20 minutes for a response or 5 minutes on hold and get a better initial response or a call that is less rushed.

Examples like that are typical of company goals and customer goals not aligning because the company just isn’t in tune with what the customer wants. Working to reduce response times and hold times is obviously an effort, but in many cases, that effort could probably be better placed elsewhere.

The reverse can always apply and the preference can always change. If you experience a service outage and customers feel your company didn’t communicate quickly enough, they might change their preference and start to prefer shorter response times instead of better initial responses. Constant surveying and constant conversations with customers will reveal what the preferences are.

And the most important part is to survey as much as you can (without annoying your customers, of course). Ask them about as much of the customer service experience as possible. You can never get too much feedback from your customers.

Are you sure you want to provide negative feedback?

Apparently, I’m rude. I’m inconsiderate, thoughtless, and downright offensive. I have no regard for the feelings of others and might even go out of my way to make people feel bad. At least, that’s the way GoDaddy made me feel when I filled out a survey they sent me today.

A confirmation box that popped up when I filled out a GoDaddy post-call survey.

Putting the melodrama aside, I was actually quite surprised when GoDaddy asked me if I was sure I wanted to submit negative feedback. Their exact words were: “You are about to submit negative feedback for this survey. Do you wish to continue?” At first, I didn’t believe it. I had to read it again. I had never seen that before. Sure enough, a company was actually asking me to think twice about my choice to provide negative feedback.

I’m not sure GoDaddy realizes the point of surveying their customers. The point is not to get the best numbers. Instead, the point is (or rather, should be) to get the most honest answers and feedback about their customer service. The best way to avoid getting negative ratings is to provide better service. GoDaddy actively discouraging their customers from rating the company negatively during a survey is counter-productive. GoDaddy is manipulating their own results, which, as far as I’m aware, are only used internally. They’re only fooling themselves. This isn’t the right way to conduct or look at surveys.

If a customer has a less than phenomenal customer service experience and rates the experience accordingly in a survey, you should let him or her provide that feedback. In fact, you should value that type of feedback (see this post about keeping your enemies closer); as a customer-centric organization, it is your responsibility to thank the customer for their feedback and see what you can do to make the experience better. You may choose to take it a step further and follow up with the particular customer personally or you can choose to simply consider collective and individual customer feedback when making changes. That’s how surveying works and what it’s designed to do.

Surveying is not a game of company versus customer. If customers are providing positive feedback, it’s probably for a reason. Similarly, if they’re providing negative feedback, it’s for a reason. Surveys are a way of gathering customer feedback and opinions. Doing anything to try and skew those honest opinions (like double checking with customers before accepting negative feedback) is just making it harder to collect that honest feedback and in the long run, harder to provide service that is truly exceptional.

Christoph Guttentag from Duke – Part 4 of 4

This is the fourth and final part of my interview with Christoph Guttentag, Dean of Undergraduate Admissions at Duke University.

In this part of the interview, Christoph explains how different applicants communicate with Duke and when the best time to contact an admissions officer is, addresses the hotly debated topic of admissions officers looking at MySpace and Facebook profiles, and describes how Duke gathers feedback regarding its admissions processes. He also provides his opinion about sending thank you notes and courtesy in general, and finally, offers some tips to those thinking about applying to Duke University.

I want to thank Christoph for taking the time to speak with me and answer my questions. Hopefully you have enjoyed reading the interview as much as I enjoyed conducting it.

Other parts of the interview include part 1, part 2, and part 3.


Surveying Your Service Providers

A lot of companies spend a lot of money on surveying their customers. These companies are doing the right thing (surveying your customers is great), but chances are, there is a group that they’re forgetting to survey: their service providers.

Employees, especially frontline service providers, know a lot about your customers. They are on the phones every day and as a result, have a tremendous amount of experience working with your customers and understanding their experiences. The employees know what customers are calling about, what they think causes the issues, and what customers complain about. If you’re surveying your customers, you probably know a lot of this already, but employees always have a unique, and interesting, perspective.

Surveying your service providers also has the added benefit of making employees feel like their opinions are valued (which is hopefully true). When you, as a management team or as a company, take the time to ask about their opinions, experiences, and insights, employees feel like they have a voice and that their suggestions and comments will hopefully be implemented, or at least considered.

So what do you ask your employees? I always recommend asking them a mix of specific and broad questions. The specific questions are ones where you want their opinion about how to deal with a particular problem or challenge you’re facing. The broad questions let them provide you with feedback and ideas about other possible issues you might not be aware of (or might not realize the importance of). You have to word the questions so they don’t threaten anyone or make it seem like you’re trying to catch someone in any type of trap.

Sometimes it’s helpful to have the surveys conducted anonymously (if you choose to do a written / online survey) or by a third party (anonymous or not, if you choose to do them as interviews). It all depends on what type of information you’re looking for and how your company is set up. It can be a bit awkward for employees to give honest feedback to managers or supervisors (or to a third party if they know the feedback won’t be anonymous).

Some companies do this more regularly. They run the program as more of a job satisfaction and quality evaluation. If you systemize the process and do it consistently across departments and teams, it won’t be as intimidating. All too often, “reviews” or “evaluations” carry a negative connotation that implies something is wrong. You should work hard to avoid or eliminate that association. The goal of surveys, evaluations, interviews, etc. (of customers or of employees) is to get feedback – feedback that you can use to improve.

Quick Post: Build the feedback process right in.

Skype has an interesting way of gathering feedback. After every call, a little survey pops up. They ask you to rate the quality of your call (they use a 1-5 star system) and then they show a list of things that could have gone wrong (echoes, etc.) and ask you to check the boxes for anything that applied.

The survey is super simple and has gotten even simpler over time (it used to redirect you to a web site – now it seems to be built into the program). It’s very self-explanatory, and since it pops up after every call, you have the opportunity to rate your experience frequently.

The survey is optional, but I bet that Skype has really high response rates. Again, the simplicity is probably what leads to the high response rates. I’ve already talked about what a big hit one-question surveys are, and this just serves as an extension of that. Keep your surveys simple, keep the questions relevant, and watch the results pile in.
