Recently, veterinarians have developed a protocol for estimating the pain a sheep is in from its facial expressions, but humans apply it inconsistently, and manual ratings are time-consuming. Computer scientists at the University of Cambridge in the United Kingdom have stepped in to automate the task. They started by listing several "facial action units" (AUs) associated with different levels of pain, drawing on the Sheep Pain Facial Expression Scale. They manually labeled these AUs—nostril deformation, rotation of each ear, and narrowing of each eye—in 480 photos of sheep. Then they trained a machine-learning algorithm by feeding it 90% of the photos and their labels, and tested the algorithm on the remaining 10%. The program's average accuracy at identifying the AUs was 67%, about as accurate as the average human, the researchers will report today at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C. Ears were the most telling cue.
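The workflow the researchers describe — label feature vectors for each photo, hold out 10% for testing, train on the remaining 90%, then score held-out accuracy — can be sketched in miniature. This is an illustrative stand-in only: the synthetic "photos," the five-number feature vectors, and the nearest-centroid classifier below are assumptions for the sketch, not the study's actual data or algorithm.

```python
# Stdlib-only sketch of a 90/10 train/test workflow like the one in the
# study. The data and the nearest-centroid "model" are stand-ins.
import random

random.seed(0)

def make_photo():
    """Synthetic 5-number stand-in for one photo's facial action units
    (nostril deformation, two ear rotations, two eye narrowings)."""
    pain = random.random() < 0.5
    base = 1.0 if pain else -1.0          # separable classes, for the demo
    features = [base + random.gauss(0, 0.8) for _ in range(5)]
    return features, int(pain)

data = [make_photo() for _ in range(480)]  # 480 photos, as in the study
random.shuffle(data)

split = int(0.9 * len(data))               # 90% train / 10% test
train, test = data[:split], data[split:]

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

# "Training": compute one centroid per label from the training photos.
centroids = {
    label: centroid([f for f, y in train if y == label]) for label in (0, 1)
}

def predict(features):
    # Assign the label whose centroid is nearest (squared distance).
    def dist(label):
        return sum((a - b) ** 2 for a, b in zip(features, centroids[label]))
    return min(centroids, key=dist)

correct = sum(predict(f) == y for f, y in test)
accuracy = correct / len(test)
print(f"held-out accuracy: {accuracy:.0%}")
```

On the well-separated synthetic data this toy classifier scores far above the study's 67%; the point is only the shape of the pipeline, not the number.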