By Matt Beckwith | Published May 01, 2018 | Last Updated December 03, 2018
Throughout my contact center career, I have seen trends come and go. Some trends last forever, while others stick around far longer than they should. These are the things left over from call center days gone by that just need to go, the relics of those who came before us. Yes, I know, that sounds dramatic, but I think it's time to make dramatic changes. No, I'm not talking about how some contact center leaders still beat team members over the head with average handle time goals. Yes, that is archaic, but we're not going to talk about that today. Instead, I am writing about AHT's crazy cousin: Call Quality. You know this crazy cousin: the one who shows up at awkward times and never leaves. By the way, if you're picturing Randy Quaid's character, Cousin Eddie, from National Lampoon's Christmas Vacation, you're exactly right! That guy!
As contact center leaders, we all care about the quality of service our team members provide. And it makes sense that our forefathers and foremothers came up with methods to measure quality. Measuring things, after all, is what contact center geeks live for, right? But with traditional quality monitoring, just like agent-level Average Handle Time goals, we've gone from measuring it to misusing it. It served a purpose for a while, but in many organizations, it has outlived its usefulness.
Of course, I still believe in quality monitoring. The idea that you should "inspect what you expect" still holds true. Since voice is still the predominant channel in many contact centers, I'm going to focus on the phone. Here are a few reasons why it's time to reconsider traditional call quality monitoring.
Call quality forms can drive the wrong behaviors
Companies have gone crazy coming up with options to include on forms. Even I have been guilty of that (if you don't believe me, see Hexahedron of Quality). In the early days of my contact center career, we wanted to ensure consistent service to customers. To achieve consistency, we had our team members read lots of scripts--procedural scripts and scripts to sound caring or empathetic. To ensure consistency, we created call quality forms, which started with a few checkboxes, and then those checkboxes gave birth to even more checkboxes, and before we knew it, the original checkboxes became great-great-grand checkboxes. We created checkboxes for everything. I remember attending conferences with other contact center leaders, and inevitably the question would come up: "How many checkboxes does your form have?" We started with a checkbox for "Used the customer's name," and then that checkbox multiplied into three, and we checked a box each time the agent said the name. On short calls, it was obviously stupid, but we dutifully checked those boxes when our team member said, "Mr. Smith, thank you for calling, Mr. Smith. Goodbye, Mr. Smith." And agents thought, "Phew… checked those boxes!" Eventually, I discovered that checking boxes didn't drive the behaviors we wanted. We wanted our team members to personalize the experience for the customer. Instead, they rushed through Mr. Smith's name three times in a row, or they kept a caller who was in a hurry on the line longer than necessary so they could get through their script about the benefits of signing up for automatic payments.
Traditional call quality forms are too rigid
Because it takes a lot of time and effort to create a quality form, forms tend to stay unchanged for a long time, sort of by design. It seems to be a universal rule that the more time spent building a tool, or the more complex the tool, the less of a hurry there is to improve it. That means when the business changes, or the products, services, or customers change, you end up with an outdated checkbox system.
Your call quality forms may be demoralizing to many (if not all) of your team members
Contact center team members are often the most measured and scrutinized employees in a company. There's usually a metric and a goal for every single thing they do throughout their workday. On top of all the metric goals we expect our team members to hit, we also expect them to "own the customer experience," telling them that "to the customer, you are the company!" We say those things, and then we hold a form over their heads with a million checkboxes and say that checking those boxes is what's important. As leaders, we all care about delivering the best employee experience we can. It's hard to do that when you're forcing people to say things your way. Of course, I understand that in some industries there is a need for specific disclosures or certain words. But even in those regulated environments, that verbiage likely accounts for only a fraction of the entire call.
There is a better way
You don't need the latest and greatest technology to have an effective quality monitoring program. Call Quality doesn't need to break the bank. This advice can work for small contact centers, for those that don't have a budget for quality monitoring and reporting technology, and even for large contact centers. If you have a call model or, even worse, a checkbox call quality form, start over today, from scratch. Don't start with your existing document. Review your new hire training, talk to your team members, and identify the exact behaviors you want and the outcomes those behaviors create. There is nothing inherently wrong with having a call model, but it should be just that: a model. You should be able to explain why each element of the model is important to the business. Does it matter that the customer's name is used three times during every call, or is it more important that the team member use the customer's name appropriately and find other ways to personalize each call?
Ditch whatever form you're using today. Instead, use a blank sheet of paper. Sit next to your team member and listen to them. It is time to stop doing quality monitors from your office! For starters, sitting in your office and listening to a recording or a live call doesn't show your team member that you trust them.
I can already hear your objections. "If I'm sitting next to my team member, they won't make any mistakes." Darn, sounds like a good problem to have! Besides, that's a total myth. The purpose of quality monitoring is to help ensure that customers are getting the service the company and the manager expect. This includes demonstrating behaviors that show caring and empathy, as well as procedural compliance. You can do this either by encouraging good behavior or by correcting poor behavior. But let's face it: we all know it requires a combination of both. If you are sitting right there with your team member, they will still make errors, and you'll be there to help them when they do. They will also do things right, and you'll be in the perfect spot to give them feedback and praise them for doing it well.
And here's my favorite way to do quality monitors: of course, they are side-by-side with the team member, but my leaders and I do them without wearing a headset. That's right; you can do quality monitoring without listening to the customer! Remember, you are not monitoring the quality of the customer, so you don't need to hear that part of the call. If your team member needs help with something on the call that requires you to listen to the customer, they will ask you for help. In other words, they will need to interpret what the customer needs, and that is a skill you should always want to help further develop - you listening to the customer doesn't help them with that at all. You don't need to hear the customer to know if your team member was speaking over the customer, or if they followed the correct steps when ordering a product.
Now, back to the blank sheet of paper. Write notes about the call. Think of it from the customer's perspective and your team member's perspective, not just your point of view.
Go over your notes with your team member. Have a conversation with them about what they did well and what they can do differently next time.
When you're done with each day's (or each team member's) quality monitoring, add your results to a spreadsheet. Yes, a Microsoft Excel or Google Sheets spreadsheet. Just add a row for each thing that was done incorrectly. So, if your team member Jennifer does three things wrong on a call, there would be three rows in the spreadsheet with her name in the first column of each. Once you've done this with a few team members, you'll start to see trends and can begin categorizing the results. That's the key: the patterns should emerge from the calls you reviewed, not from a call quality form.
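To make the row-per-mistake idea concrete, here is a minimal sketch of how those rows turn into trend data. The team member names, mistake categories, and notes below are all hypothetical examples, not prescribed categories; the point is simply that each row records one mistake, and a tally across rows surfaces the patterns:

```python
from collections import Counter

# Each row is one mistake observed during a monitor:
# (team member, category you assigned afterward, note from your sheet of paper)
# All names and categories here are made-up examples.
rows = [
    ("Jennifer", "rushed greeting", "spoke over the caller's opening"),
    ("Jennifer", "missed verification", "skipped the account check"),
    ("Jennifer", "rushed greeting", "no pause after the caller's name"),
    ("Marcus", "missed verification", "did not confirm callback number"),
    ("Marcus", "rushed greeting", "jumped straight into the script"),
]

# Tally mistakes per category; the patterns emerge from the calls you
# reviewed, not from a predefined form.
trend = Counter(category for _, category, _ in rows)

for category, count in trend.most_common():
    print(f"{category}: {count}")
```

The same tally works whether the rows live in a spreadsheet export or are typed in by hand; what matters is that categories are assigned after reviewing the calls, so the top of the list tells you where to focus coaching first.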
In my experience, the data you can get from this will be far more valuable than the information you're getting with a traditional checkbox form. My leaders report weekly and monthly trends they have identified in their monitors, without team member names, so that everyone on the team knows what is going well and what needs to be improved. They regularly review results with their manager to develop action plans around improving the quality of the entire team. Over time, when they focus on the top mistakes, the team gets better. We also share this data with our training team so they can make improvements to the new hire training program.
The key to a good quality monitoring program is that it's designed as a coaching tool to help team members. If you're using it as a scorecard, it's more like AHT's crazy cousin than a valuable tool to improve performance.
No more torture
One final tip. Another relic of contact center days gone by is the practice of forcing team members to listen to their own recorded calls. For some, that is pure torture! Please stop pushing them to listen to their own calls. Some people (and you can count me in that group) hate hearing their own recorded voice. There are scientific reasons for this: we hear our own voice partly through bone conduction, so a recording sounds noticeably different from the voice we're used to, and that difference can be unsettling for many.
My challenge to you
If you're skeptical about doing this type of quality monitoring, my recommendation is just to try it. If your operation is big enough, try this with one team for one month, and then ask the team members and supervisors what they thought of it. Did it produce better data around where to focus improvement efforts?
Tell me how it goes! I'd love to hear your feedback in the comments.