America Growing at Odds with Itself: Something's Not Being Said
Outraged Farmers Blame Ag Monopolies as Catastrophic Collapse Looms
Exposing the Cover-Up That Could Collapse Big Medicine: Parasites
Israel's Former Space Security Chief Says Aliens Exist, and President Trump Knows About It
Methylene chloride (CH2Cl2) and acetone (C3H6O) create a powerful paint remover...
Engineer Builds His Own X-Ray After Hospital Charges Him $69K
Researchers create 2D nanomaterials with up to nine metals for extreme conditions
The Evolution of Electric Motors: From Bulky to Lightweight, Efficient Powerhouses
3D-Printing 'Glue Gun' Can Repair Bone Fractures During Surgery Filling-in the Gaps Around..
Kevlar-like EV battery material dissolves after use to recycle itself
Laser connects plane and satellite in breakthrough air-to-space link
Lucid Motors' World-Leading Electric Powertrain Breakdown with Emad Dlala and Eric Bach
Murder, UFOs & Antigravity Tech -- What's Really Happening at Huntsville, Alabama's Space Po
Recently, veterinarians have developed a protocol for estimating the pain a sheep is in from its facial expressions, but humans apply it inconsistently, and manual ratings are time-consuming. Computer scientists at the University of Cambridge in the United Kingdom have stepped in to automate the task. They started by listing several "facial action units" (AUs) associated with different levels of pain, drawing on the Sheep Pain Facial Expression Scale. They manually labeled these AUs—nostril deformation, rotation of each ear, and narrowing of each eye—in 480 photos of sheep. Then they trained a machine-learning algorithm by feeding it 90% of the photos and their labels, and tested the algorithm on the remaining 10%. The program's average accuracy at identifying the AUs was 67%, about as accurate as the average human, the researchers will report today at the IEEE International Conference on Automatic Face and Gesture Recognition in Washington, D.C. Ears were the most telling cue.
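The evaluation protocol described above (480 labeled photos, a 90/10 train/test split, accuracy measured on the held-out 10%) can be sketched in a few lines of Python. The synthetic action-unit features, the two-class pain labels, and the nearest-centroid classifier below are all stand-in assumptions for illustration; the study's actual features and model are not specified in the excerpt.

```python
import random

random.seed(0)

# Synthetic stand-in data: 480 "photos", each a vector of 5 facial
# action-unit (AU) scores (nostril deformation, two ears, two eyes).
# Real features would come from manually labeled sheep images.
N_PHOTOS, N_AUS = 480, 5

def make_example(pain_level):
    # Assumption: higher pain shifts AU activations upward, plus noise.
    return [pain_level + random.gauss(0, 0.6) for _ in range(N_AUS)]

labels = [random.randint(0, 1) for _ in range(N_PHOTOS)]  # 0 = no pain, 1 = pain
features = [make_example(y) for y in labels]

# 90/10 train/test split, as in the study's protocol.
split = int(0.9 * N_PHOTOS)
train_X, train_y = features[:split], labels[:split]
test_X, test_y = features[split:], labels[split:]

# Nearest-centroid classifier (a placeholder for the study's model).
def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

c0 = centroid([x for x, y in zip(train_X, train_y) if y == 0])
c1 = centroid([x for x, y in zip(train_X, train_y) if y == 1])

def dist(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b))

def predict(x):
    return 0 if dist(x, c0) < dist(x, c1) else 1

# Accuracy on the held-out 10% (the study reported 67% on real data).
accuracy = sum(predict(x) == y for x, y in zip(test_X, test_y)) / len(test_y)
print(f"test accuracy: {accuracy:.2f}")
```

On the easy synthetic data the toy classifier scores well above chance; the point is the shape of the pipeline (label, split, train, score), not the number.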