Veterinarians recently developed a protocol for estimating how much pain a sheep is in from its facial expressions, but humans apply it inconsistently, and manual ratings are time-consuming. Computer scientists at the University of Cambridge in the United Kingdom have stepped in to automate the task. Drawing on the Sheep Pain Facial Expression Scale, they listed several "facial action units" (AUs) associated with different levels of pain and manually labeled them (nostril deformation, rotation of each ear, and narrowing of each eye) in 480 photos of sheep. They then trained a machine-learning algorithm on 90% of the photos and their labels, and tested it on the remaining 10%. The program identified the AUs with an average accuracy of 67%, about as well as the average human, the researchers will report today at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C. Ears were the most telling cue.
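The 90/10 train-and-test protocol described above can be sketched in a few lines. This is not the researchers' code: the feature encoding (five AU measurements per photo), the binary pain label, and the nearest-centroid classifier are all illustrative assumptions standing in for the real photo data and learning algorithm.

```python
# Hedged sketch (NOT the Cambridge team's code): illustrates a 90/10
# train/test split on 480 labeled examples, mirroring the protocol in
# the article. Features and labels are synthetic stand-ins.
import random

random.seed(0)

N_PHOTOS = 480   # dataset size from the article
N_AUS = 5        # assumed encoding: nostrils, two ears, two eyes


def make_photo():
    """Synthetic 'photo': random AU measurements plus a pain label
    loosely tied to the ear features (ears were the most telling cue)."""
    feats = [random.random() for _ in range(N_AUS)]
    label = 1 if feats[1] + feats[2] > 1.0 else 0
    return feats, label


data = [make_photo() for _ in range(N_PHOTOS)]
random.shuffle(data)

split = int(0.9 * N_PHOTOS)          # 432 train / 48 test
train, test = data[:split], data[split:]


def centroid(rows):
    """Mean feature vector of a set of examples."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(N_AUS)]


# Minimal stand-in classifier: nearest class centroid.
pos = centroid([f for f, y in train if y == 1])
neg = centroid([f for f, y in train if y == 0])


def dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))


def predict(feats):
    return 1 if dist(feats, pos) < dist(feats, neg) else 0


correct = sum(predict(f) == y for f, y in test)
accuracy = correct / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

On real data the classifier would be trained on labeled AU crops from the photos, but the split-train-evaluate loop is the same shape as shown here.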