Mystery Shopping & Cross Tab Reports

Hopefully your mystery shopping program offers a robust suite of analytical reports to really dive into the data collected. After all, this is a great method of compiling objective data about your operational procedures, and is always a wonderful complement to a customer feedback program.

One report that seems to be underutilized, but can be extremely helpful, is the cross tab report. Essentially, it compares two data points to see whether there is a correlation between them.
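If your reporting tool doesn't offer a cross tab out of the box, the same view is easy to build from a raw data export. Here's a minimal sketch in Python using pandas, with made-up column names (`day_of_week`, `overall_score`) standing in for whatever your export actually contains:

```python
import pandas as pd

# Hypothetical mystery shop results: one row per visit.
shops = pd.DataFrame({
    "day_of_week": ["Mon", "Mon", "Tue", "Sat", "Sat", "Sat"],
    "overall_score": [92, 88, 95, 70, 65, 90],
})

# Band the scores so the cross tab stays readable.
shops["score_band"] = pd.cut(shops["overall_score"],
                             bins=[0, 79, 89, 100],
                             labels=["<80", "80-89", "90+"])

# Rows = day of week, columns = score band, cells = visit counts.
xtab = pd.crosstab(shops["day_of_week"], shops["score_band"])
print(xtab)
```

Each cell counts the visits that fall into that day/score-band combination; a row that skews toward the low band is your cue to dig deeper into that day or shift.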

What are some examples?

Day of week vs overall score. Does your overall score fluctuate depending on the day of the week? Or, go one step further and look at time of day – is there a particular shift where performance seems to struggle?

 

New and returning customers vs overall satisfaction. Many mystery shopping programs ask shoppers whether they are a new or returning customer. Use that data to look more closely at the “subjective” questions you may incorporate into your program, such as rating the overall experience, or asking whether, based on the experience, the shopper would be likely to return or recommend you to others. While mystery shopping is typically an objective snapshot, adding a subjective question can give you more bang for your buck.

 

These are just two examples, but they should give you a sense of how you can look at the data differently. Below are some tips for using the cross tab report:

 

  • Make sure the two questions “make sense” to compare. For example, there’s no need to compare the cleanliness of the dining area with whether cashiers attempted to upsell. It’s an extreme example, but it’s easy to see that one has no impact on the other. For best results, pair questions where one could plausibly influence the other.

 

  • Don’t just focus on company-wide data – slice and dice by district or region, or even drill down to a specific location. This is helpful if you have concerns about a location or group of stores – a quick cross tab can surface potential issues.

 

  • Make sure your mystery shopping survey is well designed. You can always tweak it as your needs change, but starting with a solid survey – one that takes all of your goals for the program into account – will give you a wide range of options for analyzing the data over time.

 

Cross tab reports are just one of the many reports available for analyzing your mystery shopping data. Check back to see what else is available – we’ll be sharing more report tips & tricks in future blogs.


Make the Most of Your NPS Data

 

Keep Customers Happy

 

I recently shared information on how to calculate your NPS. It’s an excellent snapshot tool, and there are other ways to use it to get even deeper insight.

Collect as much data as you can – this means incorporating the NPS question at as many customer touchpoints as possible. That might mean on customer feedback surveys, in your mystery shopping program, in your POS system (I’ve seen some retailers have this question pop up during the transaction), in email marketing, etc. The more data you have, the clearer the picture.

Segment NPS data – this could be trickier, unless you set up your programs to track the customer journey effectively. One client we work with does a great job of this. They capture feedback from customers across all points of the customer journey, from placing the order to delivery to billing and invoicing. As such, they have NPS data from each touchpoint. This allows them to see changes across the journey and identify pain points that may be influencing overall satisfaction.

If your program design doesn’t do this, you may be able to separate it out from the raw data. If so, you can easily calculate NPS manually with an Excel formula.
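For reference, the definition is the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6). In Excel that typically looks something like `=(COUNTIF(A:A,">=9")-COUNTIF(A:A,"<=6"))/COUNT(A:A)` over your column of raw scores; the same calculation as a small Python sketch, with made-up scores for illustration:

```python
# NPS from raw 0-10 "likelihood to recommend" responses:
# promoters score 9-10, detractors score 0-6, the rest are passives.
def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    # NPS = % promoters - % detractors, on a -100..+100 scale.
    return round(100 * (promoters - detractors) / len(scores), 1)

print(nps([10, 9, 9, 8, 7, 6, 3]))  # 3 promoters, 2 detractors out of 7 -> 14.3
```

Passives (7–8) count in the denominator but neither add nor subtract, which is why the result can move even when almost everyone is “fairly satisfied.”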

Other ways to segment NPS data that may be useful:

Look closer at the detractors. This one is a no-brainer – why are they dissatisfied? Look for trends that point to the pain points you should work on.

New vs returning customers. Each is important for different reasons, of course. If your new-customer NPS is low, you need to find out why – and fairly quickly – as there may be issues preventing those customers from ever becoming returning ones.

For returning customers, look for NPS trends over time. Do the numbers remain stable or fluctuate? If they fluctuate, you may want to dig deeper to see whether it depends on time of day, day of week, season, etc.

Drill down to location or district levels. Is there a group of locations (or perhaps one specific location) that is tanking your NPS? It may not be a company-wide issue but a more localized one.
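All of the drill-downs above amount to the same operation: group the raw responses by some attribute, then recompute NPS per group. A quick sketch of that pattern – the location names and scores below are invented for illustration:

```python
from collections import defaultdict

# Hypothetical raw export: one (location, score) pair per response.
responses = [
    ("Downtown", 10), ("Downtown", 9), ("Downtown", 8),
    ("Airport", 3), ("Airport", 6), ("Airport", 9),
]

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores), 1)

# Group scores by location, then score each group separately.
by_location = defaultdict(list)
for location, score in responses:
    by_location[location].append(score)

segment_nps = {loc: nps(scores) for loc, scores in by_location.items()}
print(segment_nps)  # one NPS per location
```

Swap the grouping key for district, new-vs-returning status, or month, and the same few lines answer each of the segmentation questions above.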

NPS is just one data point, but it can be used many ways to get a better picture of what’s happening and gauge customer satisfaction. You have the data – why not make the most of it?

#ButtGate Made Worse by Short Fuse

Last week, a story went viral about a customer who left a less-than-stellar Google review for a restaurant in Tennessee. The owner, who pays attention to reviews (that’s a good thing), got a bit… upset with the customer and shot back a “review” of their own. In case you missed it, here’s a replay of the exchange:

 

Customer’s Google review:

[Screenshot of the customer’s review]

Okay, so this review is a bit dramatic and could have been written differently and gotten the same message across, but there are some points to be made. After all, unclothed children are probably not a good idea in the dining room.

Instead of taking a breath and stepping back, the owner was reactive and posted this on the company’s Facebook page (it is now deleted):

[Screenshot of the owner’s Facebook post]

You can only imagine the reaction it got from Facebook users.

After calming down a bit, the owners then posted this explanation, which has also been deleted. At least it’s a better explanation – mothers know that when it comes to criticizing children, parents do tend to get a bit sensitive.

[Screenshot of the owners’ follow-up explanation]

There are always three sides to every story – his, hers, and the truth. I’m sure both sides are correct in their perception of what happened, but both reacted in a rather unnecessary way. However, it’s the business owner who will feel the impact of this – will people feel comfortable leaving less than stellar reviews going forward, or will they simply not return? Will people hesitate to visit, not necessarily because of the customer’s review (though that doesn’t help) but more because of the knee jerk response from the owners?

Honestly, this is a viral story at this moment – I’m sure it will blow over and no one will remember it in a few months. But for now, the reactionary response doesn’t seem worth it from a business perspective.