Overheard/Underheard
A team of computer scientists at Hong Kong Baptist University has put a new spin on common speech-based biometric authentication methods by developing technology that can identify individuals through their lip movements.
The effort, led by university professor Cheung Yiu-ming, involves matching the lip movements associated with saying specific passwords to other visual lip characteristics, including shape and texture. It is called, appropriately, the lip motion password.
The researchers claim that lip movements are unique to each individual and cannot be duplicated. The system they are developing can pick up those subtle differences and deny or grant access to systems or information based on them.
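To make the idea concrete, here is a minimal sketch of how such a verification step could work, assuming the matcher compares sequences of lip-landmark coordinates using dynamic time warping (DTW) to tolerate differences in speaking speed. The function names, feature representation, and threshold are illustrative assumptions, not details of the HKBU team's actual method.

```python
# Hypothetical lip-motion verifier: each utterance is a sequence of
# lip-landmark frames (frames x features), and a DTW distance measures
# how similar an access attempt is to the enrolled password motion.
import numpy as np

def dtw_distance(seq_a: np.ndarray, seq_b: np.ndarray) -> float:
    """Dynamic time warping distance between two landmark sequences."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Frame-to-frame distance plus the cheapest way to get here.
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return float(cost[n, m])

def verify(enrolled: np.ndarray, attempt: np.ndarray,
           threshold: float = 5.0) -> bool:
    """Grant access only if the attempt's lip motion is close enough
    to the enrolled motion. The threshold value is an assumption."""
    return dtw_distance(enrolled, attempt) < threshold
```

A real system would add lip shape and texture features on top of the motion trajectory, as the article describes, and would learn a per-user decision boundary rather than use a fixed threshold.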
“The same password spoken by two persons is different, and a learning system can distinguish them,” Cheung said in a statement.
According to Cheung, lip motion is better than voice-based authentication, which can be hampered by outside factors like background noise. Further, lip passwords can be used by people with hearing or speech impairments, he adds, and aren’t affected by language barriers.
The project may also integrate the lip motion technology with facial recognition authentication systems for an added layer of security.
Research into the lip motion password has been going on for a while. In fact, a patent on the technology was issued in 2015. Testing continues to this day, and researchers expect to have the first commercially viable products ready for rollout later this year. Initial plans call for it to be deployed to the financial sector first, with other industries to follow. One use case under consideration is building and door access.
As with any technology, the lip motion password is not expected to be completely foolproof, but early indications are that it’s better than most of what’s available today, and that’s definitely a comforting thought.