
Why Monitor A Problem If You Don’t Fix It?

 

Every time I see a LifeLock commercial, I think about our mystery shopping and customer feedback programs. There is one line that stands out:

“I’m not a security guard. I’m a security monitor. I only notify people if there’s a robbery…”

And their tagline: Why monitor a problem if you don’t fix it?

Sometimes companies will start a mystery shopping or customer feedback program with the best of intentions – they are excited about designing a program that will monitor and measure the customer experience, either from an operational or subjective perspective.

And then the first results come in, and key staff read every word, share with their employees, and wait expectantly for the next one. Then the program runs for a while, and…

Now what?

There have been times when a client’s program seems to take a turn – all of a sudden the overall performance scores are lower, or a particular location seems to have consistent complaints of slow service, incorrect orders, or some other issue. If it’s not improving, it’s time to figure out why.

There have been times when clients will admit they realize there’s an issue but haven’t directly addressed it, for a variety of reasons. Perhaps they’re understaffed, or not sure how to handle the issue, or they’re simply not doing much with the data they’re getting. In that case, they are a lot like the security monitor noted above – they are alerted to an issue, but are not actively doing anything about it.

Below are some examples of ways customer experience data may not be used effectively, and some ways to overcome these challenges.

 

“Oh, we use the data. Every time a low score comes in, the staff get in big trouble.” When I hear this, I want to cry a little. This is the absolute WORST way to deal with lower-than-anticipated performance on a mystery shop, especially if you focus only on the weaker evaluations. A consistent pattern of low performance certainly signals something to dive into more deeply, and by consistently using analytical reporting portals you will be able to identify those areas for improvement and action. But by singling out only the poor scores, you are setting staff up for failure. You are also setting the tone that any customer experience measurement is “the enemy,” which leaves you with a staff that has no interest in hearing the feedback or improving.

Instead, take a different approach: rather than calling out the staff for a poor score, celebrate the good ones. Recognize the staff behind the best shops or surveys each month. For the weaker evaluations, compile enough data to pinpoint the issue(s) and create an action plan to improve.

 

“We are supposed to have meetings on a monthly basis to discuss the data, but business has gotten really busy lately, so…” Sometimes it takes a village, but often one point person can be solely responsible for aggregating the data from all customer measurement programs and providing regular reporting to key staff.

It is important to have regular meetings to discuss company-wide issues as time allows, but that doesn’t mean nothing should be done in the meantime. Assign a point person who is responsible not only for distributing individual evaluations or feedback surveys, but also for reviewing the back-end analytics and providing key metric reports, so that managers have a place to work from to make improvements.

 

“I know District Manager A is on top of the program for his/her stores. I asked District Manager B about the program, and he/she said they may have seen some shops come in but hasn’t really looked closely.” When staff are not on board with a program, they tend not to take it seriously. The fact is, whether you like it or not, the program will go on, so you may as well make use of it. If you are in a position to oversee District Managers, for example, talk with them on a regular basis and give them guidance on how to best use the data. Remind key staff that it’s less about the individual results and more about the aggregated data across all programs. Show them how improvements in customer experience translate directly into happier customers, increased sales, and better overall performance for their stores.

 

“I saw the surveys coming in last night and noticed that several customers were requesting contact. Sounds like it was a bad night.” Yikes. Thanks to technology, managers can be alerted to issues in almost real time, and taking quick action can sometimes keep an issue from snowballing into something bigger. I recall a customer feedback program in which text alerts were sent whenever a customer requested contact from a manager. One evening, the alerts came in quick succession. On closer inspection, the majority were for one location, and reading through the surveys revealed that the restaurant’s drive-thru wait was long enough to cause customers to leave mid-line, while inside the restaurant, the dining area was not being maintained and customers were reporting significantly slow service.

In this instance, in a perfect world, a manager could do a quick check-in with the store as the feedback comes in to see what quick fixes can be put into place. Then, as soon as possible following the shift, talk with the store manager in further detail to learn more about the issue and create an action plan to ensure it doesn’t happen again – or, if it does, to resolve it as quickly as possible.

 

Data is valuable, and not using it can be detrimental. Hindsight is 20/20; don’t be the one to look back and think, “If only we had paid attention to the data coming in….” Take advantage of your monitoring programs and act when needed – your customers will thank you.

 

 


Mystery Shopping & Cross Tab Reports

 

 

Hopefully your mystery shopping program offers a robust suite of analytical reports to really dive into the data collected. After all, this is a great method of compiling objective data about your operational procedures, and is always a wonderful complement to a customer feedback program.

One report that seems to be underutilized, but can be extremely helpful, is the cross tab report. Essentially, it compares two data points to see if there is a correlation.

What are some examples?

Day of week vs overall score. Does your overall score fluctuate depending on the day of the week? Or, go one step further and look at time of day – is there a particular shift where performance seems to struggle?
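As a minimal illustration of what a day-of-week cross tab is doing under the hood, here is a short Python sketch. The shop records and field names are hypothetical – your reporting portal would pull these from real evaluations – but the grouping-and-averaging logic is the same.

```python
from collections import defaultdict

# Hypothetical raw mystery shop records: (day of week, overall score out of 100)
shops = [
    ("Mon", 92), ("Mon", 88), ("Tue", 95),
    ("Fri", 71), ("Fri", 68), ("Sat", 74), ("Sat", 90),
]

# Cross tab: group scores by day, then average each group
by_day = defaultdict(list)
for day, score in shops:
    by_day[day].append(score)

cross_tab = {day: sum(scores) / len(scores) for day, scores in by_day.items()}

for day in ("Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"):
    if day in cross_tab:
        print(f"{day}: {cross_tab[day]:.1f}")
```

With these made-up numbers, Friday’s average (69.5) stands out well below Monday’s (90.0) – exactly the kind of pattern the cross tab report surfaces for you automatically.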

 

New and returning customers vs overall satisfaction. Many mystery shopping programs will ask shoppers if they are a new or returning customer. Use that data to look more closely at the “subjective” questions you may incorporate into your program, such as rating the overall experience or, based on the experience, whether the mystery shopper would be likely to return and/or recommend the business to others. While mystery shopping is typically an objective snapshot, adding a subjective question can give you more bang for your buck.

 

These are just two examples, but they give you a sense of how you can look at data differently. Below are some tips for using the cross tab report:

 

  • Make sure the two questions “make sense” to compare. For example, you don’t need to compare the cleanliness of the dining area with cashiers attempting to upsell. It’s an extreme example, but it’s easy to see how one has absolutely no impact on the other. For best results, make sure one question can plausibly influence the other.

 

  • Don’t just focus on company-wide data – slice and dice by district or region, or even drill down to a specific location. This can be helpful if you have concerns about a location or group of stores – you can run a cross tab to identify potential issues quickly.

 

  • Make sure your mystery shopping survey is designed well. You can always tweak your survey as your needs change, but starting with a solid survey, taking into consideration all of your goals for using the program most effectively, will give you a wide range of options for analyzing the data over time.

 

Cross tab reports are one of many that are available to analyze your mystery shopping data. Check back to see what else is available – we’ll be sharing more report tips & tricks in future blogs.


Make the Most of Your NPS Data

 


 

I recently shared information on how to calculate your NPS (Net Promoter Score). This is an excellent snapshot tool, and there are other ways to use it to get even deeper insight.

Collect as much data as you can – this means incorporating the NPS question on as many customer touchpoints as possible. That might mean on customer feedback surveys, your mystery shopping program, incorporated into your POS system (I’ve seen some retailers have this question pop up during the transaction process), in email marketing, etc. The more data you have, the clearer picture you have.

Segment NPS data – this could be trickier, unless you set up your programs to track the customer journey effectively. One client we work with does a great job of this. They capture feedback from customers across all points of the customer journey, from placing the order to delivery to billing and invoicing. As such, they have NPS data from each touchpoint. This allows them to see changes across the journey and identify pain points that may be influencing overall satisfaction.

If you don’t do this through the program design, you may be able to separate it out through the raw data. If this is the case, you can easily calculate NPS manually using an Excel formula.
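If a spreadsheet formula isn’t handy, the same manual calculation is only a few lines in Python. The sample responses below are made up; the arithmetic is the standard NPS definition – percent promoters (scores of 9–10) minus percent detractors (scores of 0–6), with passives (7–8) counted in the total but in neither group.

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Made-up raw responses: 5 promoters, 3 passives, 2 detractors
responses = [10, 9, 8, 7, 6, 10, 3, 9, 8, 10]
print(nps(responses))  # 30.0
```

Note that the result can range from -100 (all detractors) to +100 (all promoters), which is why a single passive-heavy sample can yield an NPS near zero even when few customers are actively unhappy.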

Other ways to segment NPS data that may be useful:

Look closer at the detractors. This is a no-brainer, really – why are they dissatisfied? Look for trends that identify your pain points so you can work on improvements.

New vs returning customers. They are each important for different reasons, of course. If your new-customer NPS is low, you need to find out why – and fairly quickly – as there may be issues preventing them from becoming returning customers.

For returning customers, look for NPS trends over time. Do the numbers remain stable or fluctuate? If they fluctuate, you may want to dig deeper to see if it’s dependent on time of day/day of week, seasonal, etc.

Drill down to location or district levels. Is there a group of locations (or perhaps one specific location) tanking your NPS? It may not be a company-wide issue, but a more localized one.
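Segmenting by location is the same calculation applied per group. A minimal sketch, assuming each raw response is tagged with a location (the store names and scores here are hypothetical):

```python
from collections import defaultdict

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical survey responses tagged with the location they came from
responses = [
    ("Store A", 10), ("Store A", 9), ("Store A", 8), ("Store A", 7),
    ("Store B", 3), ("Store B", 6), ("Store B", 10), ("Store B", 2),
]

# Group scores by location, then compute NPS per group
by_location = defaultdict(list)
for location, score in responses:
    by_location[location].append(score)

for location, scores in sorted(by_location.items()):
    print(f"{location}: NPS {nps(scores):+.0f}")
```

In this toy data, Store A lands at +50 while Store B sits at -50, so the company-wide number would mask a problem that is entirely localized to one store – which is exactly the point of drilling down.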

NPS is just one data point, but it can be used many ways to get a better picture of what’s happening and gauge customer satisfaction. You have the data – why not make the most of it?
