Beyond Verbal Debuts Moodies Emotions Analytics App
Beyond Verbal has debuted Moodies, an app that decodes and measures human emotions extracted from voices in real time. Using just 20 seconds of speech, Moodies lets users analyze their own voices or those of people around them.
"Our Emotions Analytics technology is different than any other speech recognition app in the sense that it listens to vocal modulations, not words, over a period of time," says Yuval Mor, CEO of Beyond Verbal, in an email. "It is not trying to [use] phonemes or even words on a continuous basis. This, among other differences, also makes it language-agnostic and culturally universal."
Mor explains that the technology behind Moodies is also packaged as a cloud-based API, enabling devices and applications to understand moods, attitudes, and decision-making characteristics as users speak.
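To make the integration concrete, here is a minimal sketch of how a device or application might submit a short voice clip to such a cloud API. The endpoint URL, authentication scheme, and response fields are all illustrative assumptions; the article does not document Beyond Verbal's actual interface.

```python
# Hypothetical sketch of calling a cloud emotions-analytics API like the
# one described above. The URL, credential, and response structure are
# placeholders, not Beyond Verbal's published API.
import requests

API_URL = "https://api.example.com/v1/emotions"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                          # placeholder credential

def analyze_clip(path: str) -> dict:
    """Upload a short voice recording and return the decoded mood data."""
    with open(path, "rb") as audio:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"audio": audio},  # ~10-20 seconds of speech, per the article
        )
    response.raise_for_status()
    # Hypothetical payload, e.g. {"mood": "...", "attitude": "..."}
    return response.json()

print(analyze_clip("sample_voice_clip.wav"))
```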
"Our patented technology listens to a period of about 10-15 seconds," he explains. "Its algorithms [are] constantly searching for the vocal patterns that our 18 years [of] research have previously identified and are matched to different primary and complex moods."
Mor believes Moodies can go beyond the consumer space and venture into the business world. Since the company launched its Friendly User Trial (FUT) program a few months ago, it has been approached by a number of potential partners from various fields, including call centers, remote health and wellness, human resources and recruiting, coaching, and other application developers.
"We believe that emotions understanding can dramatically alter the way we interact with machines and with each other," Mor says. "Allowing machines to understand, for every practical purpose, the most important medium in human communications allows, in a call center context, for example, for agents to better understand customer grievances and also monitor their own composure, conduct, and effectiveness."
While emotion detection using voice has faced a barrage of criticism in the past, Mor believes previous solutions have failed because they started with the wrong approaches.
"Most current and past emotion detection solutions have been trying to dig the answers in all the wrong places&—analyzing spoken words, not vocal intonations, to try and figure out our emotions," Mor says. "These approaches are severely impacted by our own human cognition. Words are poor as emotional yardsticks (just think of 'have a nice day'), and are inundated by massive rule processing, language, and cultural relevancy."