The pain scale from 0 to 10 only works if you can speak. What if you have Alzheimer's, cerebral palsy, or are intubated in intensive care? In that case, the doctor has to guess based on your face. The problem is, "guessing" isn't a science.
An Australian team has built PainChek, an app that uses artificial intelligence to do what humans do poorly: recognize microscopic expressions of pain. Nine muscle movements invisible to the naked eye, catalogued by the Facial Action Coding System, are analyzed in three seconds. The algorithm returns a score from 0 to 42, with 90% accuracy. It is already used in hundreds of nursing homes in Australia, the United Kingdom, and Canada. In the United States, it awaits FDA approval.
My “inappropriate” question is: if a machine can read pain better than we can, what does that tell us about our ability to see the suffering of others?
When the pain scale stops working
Approximately 70% of patients in intensive care experience pain that is not recognized or adequately treated. In nursing homes, between 60% and 80% of older adults with dementia suffer regularly, but healthcare workers struggle to interpret the signals. The Numeric Rating Scale, the VAS (Visual Analogue Scale), and other traditional scales have a structural limitation: they presuppose that the patient can communicate.
For those who cannot speak, there are observational tools like the PAINAD or the Abbey Pain Scale. But they require time, training, and above all, human interpretation. And it's not easy. Often, agitated older adults are sedated with psychotropic drugs, and the pain remains undiagnosed.
How the AI Pain Scale Works
PainChek uses facial analysis technology based on the Facial Action Coding System, the same framework researchers have used to study emotions since 1978. The algorithm was trained on thousands of images of faces experiencing pain and looks for nine specific muscle movements: raising the upper lip, contracting the eyebrows, tensing the cheeks, and so on. These are micro-expressions that last fractions of a second and that a human observer struggles to catch, especially in patients whose expressions are altered by neurological conditions.
It's like a digital thermometer, but for pain. You open the app, point your smartphone 30 centimeters from the person's face, and record a three-second video. The neural network analyzes micro-contractions and generates a score. Then the operator completes a checklist of other behavioral signals: moaning, guarding a body part, sleep disturbances. The result is uploaded to a cloud archive that tracks (and displays) the evolution of the pain over time.
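To make that hybrid workflow concrete, here is a minimal Python sketch of how AI-detected facial action units and a caregiver's checklist could be combined into a single 0-42 number. The item names, the size of the checklist, and the severity cut-offs below are illustrative assumptions, not PainChek's actual specification.

```python
# Minimal sketch of a hybrid pain-scoring workflow in the spirit of the article:
# an AI model flags facial action units from a short video, a caregiver ticks off
# other behavioral signs, and the two are summed into a single 0-42 score.
# Everything beyond "nine facial movements, 42 total items" is an assumption.

from dataclasses import dataclass, field
from typing import Dict

# The nine facial action units the article says the model looks for
# (micro-expressions scored as present/absent). Item names are assumed.
FACIAL_ITEMS = [
    "upper_lip_raise", "brow_lowering", "cheek_raise", "eye_closure",
    "nose_wrinkle", "lip_corner_pull", "lip_stretch", "jaw_drop", "chin_raise",
]

# Caregiver-completed checklist of other behavioral signals (moaning, guarding
# a body part, sleep disturbances, ...). 33 items is an assumption chosen only
# so the total comes to 42 binary items.
CHECKLIST_ITEMS = [f"behavioral_item_{i}" for i in range(33)]


@dataclass
class PainAssessment:
    facial: Dict[str, bool] = field(default_factory=dict)     # from the AI model
    checklist: Dict[str, bool] = field(default_factory=dict)  # from the caregiver

    def total_score(self) -> int:
        """Sum of all items marked present, on a 0-42 scale."""
        return sum(self.facial.values()) + sum(self.checklist.values())

    def severity(self) -> str:
        """Map the score to a coarse category (cut-offs are assumptions)."""
        score = self.total_score()
        if score <= 6:
            return "no / mild pain"
        if score <= 11:
            return "moderate pain"
        return "severe pain"


# Example: the model flags three facial action units, the caregiver ticks two
# checklist items; the combined record would then be uploaded for trend tracking.
assessment = PainAssessment(
    facial={name: name in {"brow_lowering", "cheek_raise", "eye_closure"}
            for name in FACIAL_ITEMS},
    checklist={name: name in {"behavioral_item_0", "behavioral_item_5"}
               for name in CHECKLIST_ITEMS},
)
print(assessment.total_score(), assessment.severity())
```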
Kreshnik Hoti, senior researcher at PainChek, explains:
“We initially thought that AI should automate everything, but now we see that hybrid use (AI plus human input) is our main strength.”
The system doesn't replace clinical judgment; it supports it. And above all, it reduces the margin of error in situations where human interpretation is most fragile.
Findings in British care homes
Orchard Care Homes introduced PainChek in four facilities starting in January 2021. Within a few weeks, prescriptions for psychotropic drugs dropped and the corridors emptied of screams. Internal data shows a 25% reduction in the use of antipsychotics company-wide. In Scotland, falls decreased by 42%. That's not all: seniors who skipped meals due to undiagnosed dental pain started eating again. Those isolated by their suffering resumed socializing.
The AI technology was approved by Australia's Therapeutic Goods Administration in 2017, then authorised in the UK, Canada, and New Zealand. According to company data, it has recorded over 10 million assessments with 90% accuracy. In the United States, as mentioned, it is awaiting FDA approval.
The operational benefit is immediate: a complete assessment with the Abbey Pain Scale takes 20 minutes, while PainChek takes less than five. This frees up time for clinical staff and allows for more frequent pain monitoring, transforming it into a routine vital sign like blood pressure.
AI Pain Scale: Remaining Questions
Automatic facial analysis, let's face it, has a problematic history with algorithmic biases, especially related to skin color. PainChek claims to have trained the system on diverse datasets, but independent 2024 studies on cerebral palsy show that accuracy still varies by population. An expression of nausea or fear can be mistaken for pain. And there's always the risk that clinicians will rely too heavily on the algorithm, eroding their own observational skills.
Baird, who now lives with chronic pain, has a clear stance: "I had a hard time convincing people I had pain. PainChek would have made a huge difference." If artificial intelligence can give a numerical voice to those who suffer in silence, then perhaps it's worth adding an extra line to the medical record. Even if that line is written by a machine.
The pain scale won't disappear. But it's changing shape. And perhaps, after seventy years of "0 to 10," it's about time.