5 types of misleading data that hurt customer service

How long does it take to create a merchandise display?

This was an important question for a retailer with thousands of stores. Each week, stores received instructions to create new merchandise displays, along with an estimate of how long each display would take to build.

Staffing decisions were centralized. The corporate office estimated how much labor each store needed for the week and created a staffing plan. The estimated time to create new displays was included in the plan.

The estimates were almost always wrong.

In reality, the displays took much longer to build than planned. Managers weren't allowed to deviate from the staffing plan, so they pulled employees away from other tasks like helping customers.

The retailer was plagued by many examples like this, where misleading data led to poor decisions. It eventually went bankrupt.

You can avoid a similar fate by identifying misleading data. Here are five types to watch out for.


#1 Anecdotes

Stories can be powerful ways to communicate ideas, but they can also be misleading.

One CEO expressed confidence that his business was customer-focused because he had recently received several compliments from friends. These stories were reassuring, and the CEO rebuffed attempts to find more data about service quality.

Had the CEO dug deeper, he would have discovered a growing number of unhappy clients. There were critical gaps in hiring, training, and customer service that needed to be addressed.

Without data to support this, the CEO refused to invest in necessary improvements. The company was soon blindsided by a wave of lost business that it should have seen coming.

Lesson: Always look for data to prove or refute anecdotal evidence before making critical decisions.

#2 Generalizations

Data is often generalized. This means it is accurate in many cases, but might not be accurate in others.

Years ago, many customer service teams shifted from one computer monitor to two. The rationale was that "studies show" two monitors improve productivity.

This was true in many cases, but not always.

Those studies were largely commissioned by companies that made monitors. The authors had a vested interest in downplaying another discovery: one monitor is sometimes better than two.

This meant that some customer service teams hurt productivity when they added a second monitor.

Lesson: Make sure data is applicable to your situation before using it.

#3 Contextless

Data gains new meaning when you add some context around it.

One CEO balked at a plan to raise wages for customer service employees from $12 to $14 per hour. What the CEO saw was a 17 percent increase at a time when the company was already paying 50 percent more than the minimum wage ($8 per hour).

Graph showing a proposed wage increase for customer service employees from $12 per hour to $14 per hour would be a 17% jump.

The problem with this analysis is that it didn't consider the broader job market. Prospective employees might be weighing job offers from other companies.

See what happens when you put the proposed increase in context with the overall job market.

Graph showing the market rates for similar jobs. The current rate of $12 is at the very low end of the pay scale. The proposed $14 rate would be slightly below average.

The company was paying well below the market average for comparable jobs. Raising the starting wage to $14 per hour would still be below market, but it would give the company access to more talent.

There was one more piece of context that was helpful.

The CEO's chief concern was converting customer service inquiries into sales. A quick calculation showed that a 6 percent improvement in the conversion rate would pay for the $2 per hour wage increase.
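That back-of-the-envelope calculation can be sketched with hypothetical numbers. The article doesn't disclose the company's actual inquiry volume, baseline conversion rate, or margin per sale, so the figures below are assumptions chosen only to illustrate the arithmetic:

```python
# Hypothetical break-even sketch: what conversion-rate lift pays for a raise?
# All inputs are assumptions for illustration; the article does not disclose
# the company's actual figures.

wage_increase = 2.00        # $ per hour (from $12 to $14)
inquiries_per_hour = 10     # assumed inquiries handled per agent-hour
baseline_conversion = 0.10  # assumed: 10% of inquiries become sales
margin_per_sale = 33.00     # assumed gross margin per converted sale, $

baseline_sales_per_hour = inquiries_per_hour * baseline_conversion
extra_sales_needed = wage_increase / margin_per_sale          # sales per hour
relative_lift = extra_sales_needed / baseline_sales_per_hour  # vs. baseline

print(f"Extra sales needed per agent-hour: {extra_sales_needed:.3f}")
print(f"Required relative conversion lift: {relative_lift:.1%}")
```

With these assumed inputs, a roughly 6 percent relative lift in conversions covers the $2 per hour raise; the same formula works with a company's real numbers.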

Armed with more context, the CEO agreed to raise wages. It quickly became easier to hire good employees. Thanks to better hires, the sales conversion rate increased 36 percent.

Lesson: Add context to give data more meaning.

#4 Manipulated

Data is often manipulated by unscrupulous employees who have a direct incentive to make it look better.

Some cashiers emphasize a survey to happy customers, but fail to mention it when a customer is grouchy. This manipulation is one of at least nine ways unscrupulous employees can boost survey scores without providing better service.

The end result is voice of customer data that makes customer service look far better than it really is. Meanwhile, complaints go unnoticed or unreported.

Why do employees do this?

Because they either get an incentive for good survey scores or they get in trouble if too many customers complain. Sometimes, it's both.

Lesson: Avoid giving employees incentives that might cause them to manipulate data.

#5 Perfect

Some executives won't use data to make a decision until they're convinced the data is perfect. Unfortunately, perfect data does not exist.

First contact resolution (FCR) is a great example.

The idea is to solve customer issues on the first contact so the customer doesn't have to repeatedly contact a company. Repeated contacts frustrate customers and cost the company extra money.

The challenge is that FCR is notoriously difficult to measure.

  • What counts as a repeat contact?

  • Can those contacts be easily tracked?

  • How can you tell a repeat contact from a contact about a new issue?

  • How do you determine whether an issue is fully resolved?

  • What if a customer uses multiple channels to contact a company?

The trouble with FCR, then, is that you can't craft a perfect measure. You can still work toward improving it by taking a few specific actions.
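One imperfect but workable approach is to approximate FCR by treating any contact from the same customer within a fixed window as a repeat. This is a sketch under stated assumptions: the 7-day window and the data shape are invented, and it deliberately sidesteps the hard questions above (new issue vs. repeat, multiple channels, whether the issue was truly resolved):

```python
# Rough FCR proxy: count any contact from the same customer within `window`
# of a prior contact as a repeat. The window and sample data are assumptions.

from datetime import datetime, timedelta

contacts = [  # (customer_id, timestamp) -- sample data for illustration
    ("C1", datetime(2024, 3, 1)),
    ("C1", datetime(2024, 3, 4)),   # within 7 days -> counted as a repeat
    ("C2", datetime(2024, 3, 2)),
    ("C2", datetime(2024, 3, 20)),  # 18 days later -> treated as a new issue
]

def estimate_fcr(contacts, window=timedelta(days=7)):
    """Share of contacts with no same-customer follow-up inside `window`."""
    by_customer = {}
    for cust, ts in sorted(contacts, key=lambda c: c[1]):
        by_customer.setdefault(cust, []).append(ts)
    resolved_first, repeats = 0, 0
    for times in by_customer.values():
        last = None
        for ts in times:
            if last is not None and ts - last <= window:
                repeats += 1
            else:
                resolved_first += 1
            last = ts
    return resolved_first / (resolved_first + repeats)

print(f"Estimated FCR: {estimate_fcr(contacts):.0%}")
```

A crude proxy like this won't be perfect, but tracking it consistently over time still reveals whether FCR is trending up or down.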

Lesson: Don't wait for data to be perfect because you'll end up doing nothing.

Conclusion

Executives are easily deceived when they don’t question the data they use to make decisions. When in doubt, ask a few more questions.

For example, think about the time it takes to make retail store displays. A few questions could have fixed the company's data problem:

  • Are we accurately estimating how long it takes to build the displays?

  • What's the impact of an inaccurate estimate?

  • How can we improve the accuracy of our estimates?
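The first question lends itself to a simple check: compare the corporate estimates against the build times stores actually report. A sketch with invented numbers (the article doesn't give the retailer's real figures):

```python
# Hypothetical check of display-build estimates against actual times.
# All numbers are invented for illustration; the goal is to quantify
# estimation error rather than assume the staffing plan is right.

estimated_minutes = [45, 60, 30, 90]  # corporate estimates per display
actual_minutes = [70, 95, 50, 140]    # times reported by stores

ratios = [a / e for e, a in zip(estimated_minutes, actual_minutes)]
avg_ratio = sum(ratios) / len(ratios)

print(f"On average, displays take {avg_ratio:.1f}x the estimated time")
```

A ratio consistently above 1.0 would have shown the staffing plan was systematically short of labor, long before employees had to be pulled away from customers.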

You can imagine a different outcome if someone had asked those questions.