The 2023 State of Speech Analytics
Research firm MarketsandMarkets valued the worldwide speech analytics market at $1.8 billion in 2021 and projects it to reach $4.5 billion by 2026, a compound annual growth rate of 20.5 percent. Meticulous Research expects the same market to reach $14.1 billion by 2029, a compound annual growth rate of 20.1 percent. Other firms project the market will reach between $5 billion and $8 billion over the next few years.
The firms clearly differ on the dollar amounts, but they all agree the market will grow at roughly 20 percent a year, or a bit more, for the foreseeable future.
Year in Review
Tracing the progress of speech analytics requires not just a year in review but a look at recent history. Industry researchers also agree that the major factor driving growth of the speech analytics market is the surging demand among companies to monitor and improve contact center agent performance, a need that only grew when agents started working from home at the height of the COVID-19 pandemic.
“There are several reasons for the double-digit growth of the industry,” says Daniel Ziv, Verint’s vice president of speech and text analytics global product strategy. “Hopefully, COVID is behind us. But the impact is still there.”
Ziv points out that COVID drove a significant uptick in speech analytics because companies still wanted to analyze calls and provide guidance when agents weren’t in the office.
With COVID, there was not only a shift to remote work but also a large increase in business conducted over the phone, Ziv says. “These interactions need to be collected.”
Though many of the simpler interactions, such as checking account balances, shifted to digital channels, the more complex ones moved to the phone, Ziv adds. “The volume of calls actually increased rather than decreased. Those interactions are more complex, more emotionally charged, more mission-critical. They need careful monitoring. We need insights into what’s hidden in these interactions.”
That has fueled a rapid influx of artificial intelligence-based technologies into the speech analytics domain as companies looked to collect and act on real-time actionable customer insights and predict customer intent and behavior.
COVID also spawned a rising use of cloud-based analytics solutions as companies needed to keep track of employees and their interactions across multiple locations and channels.
Ziv adds that the growth of speech analytics during the past year has also been driven by the need for companies to differentiate on customer experience.
“If those experiences are going to be over the phone, I need to make sure that they are effective,” Ziv says.
Ziv and other experts anticipate the technology expanding, with channels added to the mix for more complete, unified analytics. The trend has already started, with the market seeing a greater merging of speech, text, desktop, predictive, customer, and performance analytics into unified suites, creating the industry category called interaction analytics.
Verint was one of the pioneers in that area, having introduced a next-generation interaction feature that unifies analytics for voice and text interactions to inform workflow and contact center actions, Ziv says.
With data analysts in short supply, companies are asking for more analytics categories and reports, Ziv says, noting that reports that unify speech and text analysis enable non-technical teams to see and address common issues.
With recession fears, there’s also been a push for companies to be more efficient. “People are demanding that their conversational analytics platforms or speech analytics platforms deliver greater savings and revenue,” says Rick Britt, vice president of artificial intelligence at CallMiner.
Speech analytics had to expand as companies started looking for more than just a platform that delivers information to help with agent coaching. Companies sought out platforms that could deliver information they could use to understand why prospects or customers purchase, why they churn, etc., Britt says. “There’s a big move toward bigger brushstrokes.”
A Look Ahead
Those brushstrokes also take in emotion and sentiment analysis to aid in CX management; the expansion of e-commerce platforms and the need to track interactions on those platforms; the growing emphasis on customer satisfaction; the rising demand to manage inbound, outbound, and blended calls, emails, web inquiries, chats, and other channels; the rising demand for customer self-service interactions; and the growing need for fraud prevention and detection.
But it doesn’t end there. “The biggest trend I expect to see is rapid growth of real-time analytics,” Ziv says. “I also think that we will see more leveraging of AI embedded in speech technology solutions.”
AI has also been making speech analytics solutions much easier to use, Britt maintains. “Machines are doing a lot of the processing now.”
Analysts and consultants also expect speech analytics to make deeper inroads into several key industries. It’s already a mainstay in financial services.
“With humanlike dialogues, the integration of voice technology with mobile banking is anticipated to increase client happiness,” Market Research Future said in a recent report on the speech analytics industry. The firm sees an increased reliance on actions like password verification and security questions. “Because of this, banks are putting an emphasis on innovation to create mobile banking services using speech analytics, which will ultimately support market expansion.”
According to the firm, 2022 also saw increased use of speech analytics in healthcare. And the newest use cases in medical settings have nothing at all to do with the contact center or other customer interactions.
One such example was the October announcement by Aural Analytics of a partnership with HitCheck to use speech analytics to detect concussions among players in contact sports like football and hockey. HitCheck’s cognitive assessment application tracks individuals’ unique performance over time and screens for sudden changes after a potential brain injury has occurred. Aural Analytics’ clinical-grade speech analytics are now embedded into HitCheck’s app to improve the assessments that coaches, athletes, medical personnel, and others rely on to gauge an athlete’s ability to play.
“There is a distinct link between brain injury and changes in speech and language that can be quickly and easily measured using Aural Analytics Speech Vitals technology,” said Judy Smythe, CEO of Aural Analytics, in a statement. “Human speech relies on the coordination of complex neural activity, including cognitive and motor functions. A concussion or brain injury can cause damage to the underlying system, which becomes apparent in one’s speech.”
Aural Analytics, whose Speech Vitals product has been registered by the U.S. Food and Drug Administration as a computerized cognitive assessment aid, has many similar medical uses, as do products from several other vendors.
Projects that have only started in the past year or two have involved using speech analytics to diagnose and track the progression of a number of cognitive and degenerative conditions, including dementia, Alzheimer’s disease, Parkinson’s disease, and ALS, as well as other disorders, including COVID-19, stroke, COPD, migraines, and schizophrenia. Speech analytics is even being used to aid in clinical trials of some experimental medications.
Smythe sees speech analytics becoming “an essential element of best-in-class digital biomarkers, which in time will further the fields of research and clinical care across neurodegenerative conditions.”
Another driver of speech analytics is the continued adoption of cloud-based systems, which Ziv says don’t come with the high costs of hardware, software, and upgrades.
“Speech analytics is a very mature market; Verint has thousands of deployments,” Ziv says. “A lot of these existing customers just want to move to the cloud and get the latest innovations, which fuels growth.”
Speech analytics is also contributing to company growth in other ways, such as analyzing sales calls; companies have discovered that outcomes can often be traced to nuances in rep interactions with customers and prospects. Through extensive call analysis, speech analytics vendor VoiceBase found that salespeople who speak for less than 50 percent of the call and who discuss price just three or four times after establishing value during their pitches were more likely to close deals.
That will only increase adoption. Speech analytics can be expensive, so it’s a lot easier for companies to justify the price if they can show contributions to revenue generation.
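To make that kind of call analysis concrete, the sketch below shows how a rep’s talk-time share and the number of price mentions could be computed from a diarized transcript. This is a generic illustration, not VoiceBase’s actual method; the speaker labels, sample utterances, and the “price” keyword are invented for the example.

```python
# Generic sketch of sales-call analysis (not VoiceBase's method).
# Assumes a diarized transcript as (speaker, text) pairs; in practice these
# would come from speech-to-text and speaker diarization, not hard-coded text.

transcript = [
    ("rep",      "Thanks for your time today. What challenges are you facing?"),
    ("prospect", "We spend too long pulling call data together by hand every week."),
    ("rep",      "Our platform automates that reporting, which is where the value is."),
    ("prospect", "That would save us hours. What does the price look like?"),
    ("rep",      "Pricing starts at the team tier; the price scales with seat count."),
]

# Talk-time share approximated by word counts per speaker.
rep_words = sum(len(text.split()) for speaker, text in transcript if speaker == "rep")
total_words = sum(len(text.split()) for _, text in transcript)
talk_share = rep_words / total_words

# Count how often price comes up anywhere in the conversation.
price_mentions = sum(text.lower().count("price") for _, text in transcript)

print(f"Rep talk share: {talk_share:.0%}, price mentions: {price_mentions}")
```

A production system would run this kind of measurement across thousands of recorded calls and correlate the resulting metrics with deal outcomes, which is the sort of analysis the VoiceBase finding describes.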
Taking speech analytics further outside the contact center, other use cases are starting to include marketing, human resources, legal, and IT departments. In general, companies can deploy speech analytics anywhere that employees interact with customers, partners, and prospects.
That’s why CallMiner is adding summarization to its speech analytics product, providing companies a quick synopsis of the major points of any conversation.
A 1,500-word conversation might be summarized in as little as 30 words, Britt explains. “You can import this directly into your systems. People will be able to drive action from this.”
If, for example, a customer calls to cancel service, the user can quickly see if the reason for the cancellation was similar to other calls about cancellations and take appropriate action.
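CallMiner has not published the internals of its summarization feature, but as a rough sketch of how conversation summarization works in general, the example below uses the open-source Hugging Face Transformers library and its default summarization model as a stand-in; the cancellation-call transcript is invented.

```python
# Generic sketch of conversation summarization -- not CallMiner's implementation.
# Requires the open-source "transformers" library; the default summarization
# model is downloaded on first use.
from transformers import pipeline

summarizer = pipeline("summarization")

# Invented transcript of a cancellation call, for illustration only.
transcript = (
    "Agent: Thank you for calling. How can I help you today? "
    "Customer: I'd like to cancel my service. The monthly price went up again, "
    "and I found a cheaper plan with another provider. "
    "Agent: I'm sorry to hear that. I can apply a loyalty discount that brings "
    "the price back to what you were paying before. "
    "Customer: That works. I'll stay if the discount is applied today."
)

# Ask for a short synopsis; max_length and min_length are counted in model
# tokens, which here corresponds to a few dozen words at most.
summary = summarizer(transcript, max_length=45, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```

The resulting one- or two-sentence synopsis could then be stored alongside the call record and fed into downstream systems, along the lines Britt describes.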
Going forward, look for speech analytics vendors to focus more of their attention on artificial intelligence, enabling more relevant product recommendations based on the content and context of interactions, along with zero-touch capabilities that quickly train the technology to find what companies want.
These kinds of features are in high demand among speech analytics users, Britt says, arguing that analytics providers that don’t offer them will fall behind.
Phillip Britt is a freelance writer based in the Chicago area. He can be reached at spenterprises1@comcast.net.