
Has anyone reading this ever had a frustrating conversation with a call center representative?
For me, the most annoying part is that often the person I’m talking to doesn’t seem to have the power to solve my problem. “That’s against company policy” is one of the worst combinations of words in the English language.
But if you’ve ever been on the other side of the fence, you’ll know that most businesses don’t want their customers to get mad – and they certainly don’t want them to cancel their service or leave a bad review.
In fact, call centers even have supervisors overseeing the representatives, and part of their job is to listen in on calls, offer advice to the reps, and even step in occasionally to assist.
If you’re familiar with ACD (Automatic Call Distribution) voice applications, these last three functions are referred to as monitor, whisper and barge-in.
The problem is that the supervisor has no real way of knowing which calls need attention. Maybe a rep will ask for help if things get out of hand, but even then it’s often too little, too late.
But what if the contact center application was able to automatically tell the supervisor which calls need attention – before things go off the rails?
Then the supervisor could start monitoring the call early, and easily offer advice to the rep or even step in to assist – thereby dramatically improving customer satisfaction and reducing refunds and cancellations.
A call center that had that capability would be immensely valuable to a business.
This is the future – thanks to machine learning.
Machine learning, again?
Long-time readers may recall that I shared a presentation about machine learning at the Metaswitch InTouch conferences in 2018 – and I’m very excited about the future applications of ML. You can watch a snippet of that talk below (or watch the whole thing if you have 40 minutes spare).
Most of the current applications of machine learning in telecoms are related to toll-fraud and robocall detection, but I’ve had some fascinating conversations recently about sentiment analysis and how that could be used in a contact center application.
If you’re new to the topic, machine learning is all about using data to make predictions. At a very high level, it works like this:
- You gather a large quantity of data, which is (often manually) categorized and labelled so that you know what each item means. This is referred to as training data.
- A computer uses machine learning to analyze that data set and look for patterns in the raw data that predict the result / categorization – the output of this step is a trained model.
- That model is then unleashed on real-world data, and it looks for the patterns it learned before in each new piece of data to predict the outcome / categorization.
- Ideally there’s a feedback mechanism where the computer can discover if it’s making good predictions or not, and improve performance over time.
That’s a little vague – so let me give a concrete example.
- When I first signed up for Netflix it asked me what kind of movies and TV shows I liked, and I fed it some limited training data about my preferences.
- Netflix used that data to make recommendations of what I might like to watch.
- Netflix gets ongoing feedback because it knows when I choose to watch one of its recommendations.
- Not only that: it knows that I never got to the end of Ken Burns’ The Civil War, yet I’ve watched 13 seasons of NCIS – and all of that information feeds back into its algorithm.
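
If you prefer to see that loop in code, here’s a toy sketch of the same train / predict / feedback cycle using a simple text classifier in Python with scikit-learn. The labelled phrases, the labels and the model choice are all made up purely for illustration – a real system would need vastly more data.

```python
# A toy version of the train / predict / feedback loop described above.
# Everything here (the phrases, the labels, the model choice) is illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1. Training data: examples that a human has already categorized and labelled.
phrases = [
    "I love this product",
    "thanks, that fixed my problem",
    "I hate your product",
    "this is the third time I have called about this",
]
labels = ["positive", "positive", "negative", "negative"]

# 2. The computer analyzes the data set, looking for patterns that predict the label.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(phrases, labels)

# 3. The trained model is unleashed on data it has never seen.
print(model.predict(["I want to cancel my service"]))

# 4. Feedback: when a human corrects a prediction, the corrected example goes
#    back into the training set and the model is retrained.
phrases.append("I want to cancel my service")
labels.append("negative")
model.fit(phrases, labels)
```

The point isn’t the specific model – it’s the loop: label data, train, predict, and feed corrections back in.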

And here’s the theory of what could happen in our contact center example.
- An AI (artificial intelligence – a computer using machine learning) uses training data to learn how to detect human emotions based on the sound of voices.
- The AI combines that acoustic signal with a live speech-to-text transcription of the call and text-based sentiment analysis of the words spoken (e.g. the phrase “I hate your product” might indicate a negative sentiment).
- Through this combination of words spoken and the tones of voice used, the AI can learn to identify the patterns that indicate a tense / difficult phone call.
- This information can then be fed into the ACD supervisor dashboard. I’m imagining red, orange or green lights next to each call giving a live view of the sentiment (there’s a rough sketch of this in code after this list).
- In effect, the contact center recommends which calls need the supervisor’s attention – allowing them to assist on the calls that most need it.
- This makes the supervisor much more effective, and should significantly improve the performance of the call center.
- There could also be a feedback mechanism where the supervisor agrees/disagrees with the AI’s assessment, providing feedback so that the AI can become more effective in the future.
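
To make that a little more concrete, here is a hypothetical sketch of how the two signals might be combined into the red / orange / green indicator. The score ranges, the weighting and the thresholds are all assumptions for illustration – a real product would tune (or learn) these from data rather than hard-coding them.

```python
# A hypothetical sketch of combining the two signals into the red / orange / green
# indicator on the supervisor dashboard. The score ranges, weights and thresholds
# below are assumptions for illustration, not any real product's API.

def call_status(voice_emotion_score: float, text_sentiment_score: float) -> str:
    """Map two live sentiment signals onto a traffic-light status.

    voice_emotion_score:  0.0 (calm) to 1.0 (angry), from an acoustic emotion model.
    text_sentiment_score: 0.0 (positive) to 1.0 (negative), from text sentiment
                          analysis of the live transcription.
    """
    # Weight the acoustic signal a little higher: how something is said often
    # signals tension before the words themselves do. (Illustrative weighting.)
    combined = 0.6 * voice_emotion_score + 0.4 * text_sentiment_score

    if combined >= 0.7:
        return "red"     # this call probably needs the supervisor's attention now
    if combined >= 0.4:
        return "orange"  # worth keeping an eye on
    return "green"       # the call appears to be going fine


# Example: a raised voice plus "I hate your product" in the transcript.
print(call_status(voice_emotion_score=0.8, text_sentiment_score=0.9))  # -> red
```

In practice the two scores would each come from their own trained models, and the supervisor’s agree/disagree feedback could be used to retune the combination over time.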
This is a great application of machine learning. It checks all the boxes:
- It’s a tough problem.
- It’s a valuable problem.
- It doesn’t have to be perfect right out of the gate, but there’s an opportunity for feedback and continuous improvement.
Where do I sign up?
Well… it’s possible the product I described may be slightly ahead of what’s currently available.
There are certainly tools out there that perform sentiment analysis on text, and there are tools that perform sentiment analysis on voice recordings – but I’m not aware of any major contact center offering that combines the two in real time as I’ve described. Yet.
I’ve had several conversations with a leading AI consulting company, and this could be a great opportunity to bring the benefits of AI and ML to telecoms in a meaningful and valuable way.
If this can be done, it could provide a significant edge to the vendor offering the technology, and a service provider that brought it to their enterprise customers would be able to make a compelling business case for the value of their solution.
What Next?
- If you are a vendor with a Contact Center product and are interested in exploring this idea further, contact me and we can talk.
- If you are a service provider who has enterprise users of Contact Center software, I’d also love to hear from you. The more we understand the use case, the data available and the desired outcomes, the better the resulting AI.
- If you just think this is cool, please re-share on social media. 🙂
I know this topic is a little outside my regular fare – hopefully it has been fun to explore some new ideas, even if they aren’t immediately useful.