Computer scientists at Oxford University have teamed up with Google's DeepMind to develop artificial intelligence that might give the hearing impaired a helping hand, with their so-called Watch, Attend and Spell (WAS) software outperforming a lip-reading expert in early testing.
The figures on lip-reading accuracy do vary, but one thing's for certain: it is far from a perfect way of interpreting speech. In an earlier paper, Oxford computer scientists reported that on average, hearing-impaired lip-readers can achieve 52.3 percent accuracy. Meanwhile, Georgia Tech researchers say that only 30 percent of all speech is visible on the lips.
Whatever the case, software that can automate the task and/or boost its accuracy could have a big impact on the lives of the hearing impaired. It is with this in mind that the Oxford team collaborated with DeepMind, the artificial intelligence company acquired by Google in 2014, to develop a system that can deliver better results.