While AI systems can match many human capabilities, they take 10 times longer to learn. Now, by copying the way the brain works, Google DeepMind has built a machine that is closing the gap.
Intelligent machines have humans in their sights. Deep-learning machines already have superhuman skills when it comes to tasks such as face recognition, video-game playing, and even the ancient Chinese game of Go. So it's easy to think that humans are already outgunned.
But not so fast. Intelligent machines still lag behind humans in one crucial area of performance: the speed at which they learn. When it comes to mastering classic video games, for example, the best deep-learning machines take some 200 hours of play to reach the same skill levels that humans achieve in just two hours.
So computer scientists would dearly love to have some way to speed up the rate at which machines learn.