AT&T's Watson Answers the Call
may verbalize destinations in hundreds of different ways. The QNX client-side analysis grants car makers greater flexibility, enabling them to adapt the AT&T Watson results for a variety of in-car applications, regional aspects, or personal tastes."
The Heavy Hitters
While the AT&T Watson-based APIs provide access for developers who are not trained in speech technology or who work on other platforms, the company has also worked with developers who use the full-fledged Watson engine.
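For developers in the first camp, access typically means a hosted recognition API rather than the engine itself. The sketch below shows what a call to such a cloud speech-to-text service might look like; the endpoint URL, header names, and JSON response shape are illustrative assumptions, not the documented AT&T Watson interface.

```python
import json

# Placeholder endpoint; a real integration would use the provider's
# documented URL and authentication flow.
API_URL = "https://api.example.com/speech/v1/recognize"

def build_request(token, audio_bytes, content_type="audio/wav"):
    """Assemble headers and body for a one-shot transcription request."""
    headers = {
        "Authorization": "Bearer " + token,   # assumed bearer-token auth
        "Content-Type": content_type,
        "Accept": "application/json",
    }
    return {"url": API_URL, "headers": headers, "body": audio_bytes}

def parse_transcript(response_text):
    """Pull the best hypothesis out of an assumed JSON response."""
    data = json.loads(response_text)
    return data.get("hypothesis", "")

# Example usage with dummy audio and a canned response
request = build_request("demo-token", b"\x00\x01")
print(request["headers"]["Authorization"])
print(parse_transcript('{"hypothesis": "call mom"}'))
```

The point of the abstraction is the one Gilbert describes: the developer handles HTTP and JSON, not acoustic models.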
"They're not just using Watson as an API; they want to customize and personalize, have proactive analytics, and deploy sophisticated technologies like speaker verification," Gilbert says. "These developers understand the technology and want to create something unique in the market. Our enterprise customers are not looking for the technology; they're looking for an end-to-end service where speech and natural language are part of that service."
A third way AT&T works with customers is by licensing technology through a joint strategic agreement. "There's the idea of combining one plus one and [coming up with] more than two, and we can go after a market together," Gilbert says. "We can't do it ourselves, they can't do it themselves, and we join forces to create either a speech, virtual assistant, biometric, or translation application for different verticals."
Interacting with Interactions
Interactions is a company in the natural language processing business that typically helps large organizations, such as Humana, Hyatt, and Marriott, build intelligent virtual assistant applications for customer support.
Enterprises can use intelligent virtual assistants in a variety of applications in both customer care and sales settings, serving to route or self-serve customers, explains Mike Iacobucci, CEO of Interactions. "In a routing environment, Interactions has deployed applications fronting a caller with 'How may I help you?' and either routing [her] to existing high-performing self-service applications or...to new Interactions Virtual Assistant self-service applications," Iacobucci says. "In a self-service environment, Interactions can handle incredibly complex self-service processes."
For example, Interactions completes Medicare enrollment applications for several healthcare providers, such as Humana. In other self-service instances, Interactions can complete processes such as creating hotel reservations or filling out insurance claims.
Iacobucci says the company has designed its operations around a human element that uses the same protocols as a speech recognition engine. This allows it to understand extremely complex human dialogue where speech recognition has historically struggled, including open-ended sentences, alphanumeric data, out-of-grammar responses, and scenarios with background noise or accents.
The company's technology is focused on using the combination of human-assisted understanding and an automatic speech recognition (ASR) engine to create conversational, or humanlike, systems. "The application really doesn't know where the understanding is coming from," Iacobucci says. "Our technology determines where the understanding should come from and has a great level of understanding of languages and accents far beyond what's capable from speech recognition alone."
Iacobucci says the company designs its back-end recognition technology on the assumption that a level of ASR will work in tandem with Human Assisted Understanding (HAU). ASR, he explains, takes on the parts of the conversation that can be easily automated, while HAU handles tasks that are more difficult and not suited to ASR.
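The division of labor Iacobucci describes can be sketched as a confidence-based router: run the automatic recognizer first, and fall back to human-assisted understanding when the recognizer is unsure. The threshold value and function names below are illustrative assumptions, not Interactions' actual implementation.

```python
# Assumed cutoff for trusting ASR alone; a production system would tune this.
CONFIDENCE_THRESHOLD = 0.80

def recognize(utterance, asr, hau):
    """Return (transcript, source), using ASR when confident, HAU otherwise."""
    text, confidence = asr(utterance)
    if confidence >= CONFIDENCE_THRESHOLD:
        return text, "asr"
    # Difficult audio (open-ended speech, noise, accents) goes to a human.
    return hau(utterance), "hau"

# Toy stand-ins for the real engines
def fake_asr(utterance):
    return ("one two three", 0.95) if utterance == "clear" else ("???", 0.40)

def fake_hau(utterance):
    return "policy number A12-34"

print(recognize("clear", fake_asr, fake_hau))
print(recognize("noisy", fake_asr, fake_hau))
```

As the article notes, the calling application sees only the transcript; it never needs to know which path produced the understanding.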
In April 2013, Interactions signed a licensing agreement allowing it to use Watson in its speech-enabled virtual assistant applications for enterprises in the customer care market. The company uses Watson as a recognition resource to build highly conversational and humanlike virtual assistant applications that are based on its technology suite. "For speech