Remote heart rate sensors can be biased against darker skin. A UCLA team offers a solution

By combining two technologies, camera and radar, researchers boosted accuracy across a diverse range of skin tones

Key takeaways:

  • Engineering for equity. New engineering approaches like the one pioneered by UCLA researchers are needed to overcome the shortcomings of current health-related remote sensing technologies.

  • Further fixes. The researchers say the new device is an initial step toward health diagnostics that are fair and accurate across a diverse set of attributes, including skin tone, body mass and gender.

As telemedicine has grown more popular, so have devices that allow people to measure their vital signs from home and transmit the results by computer to their doctors. Yet in many cases, obtaining accurate remote readings for people of color has proved a persistent challenge. 

Take remote heart rate measurements, for example, which rely on a camera sensing subtle changes in the color of a patient’s face caused by fluctuations in the flow of blood beneath their skin. These devices, part of an emerging class of remote technologies, consistently have trouble reading color changes in people with darker skin tones, said Achuta Kadambi, an assistant professor of electrical and computer engineering at the UCLA Samueli School of Engineering.

Kadambi and his team have now developed a remote diagnostic technique that overcomes this inherent bias against darker skin while also making heart rate readings more accurate for patients across the full range of skin tones. Their secret? Combining the light-based measurements of a camera with radio-based measurements from radar.

The researchers presented their findings, recently published in the journal ACM Transactions on Graphics, at the SIGGRAPH 2022 conference in Vancouver, British Columbia. The conference, held both virtually and in person, is organized annually by members of the Association for Computing Machinery.

The advance could lead to new classes of high-performing medical devices and remote technologies that are more accurate and equitable, the researchers said, allowing doctors and health care systems to remotely monitor patients with confidence, both in clinical settings and from patients’ homes.

“In the larger picture, this work shows that practical and innovative engineering solutions can address persistent biases in medical devices,” said Kadambi, who is also a member of the California NanoSystems Institute at UCLA. “But that first requires an acknowledgement that such bias means the current best technology may not be the best for everyone. Through thoughtful design, we can find equitable solutions that perform as well or better.”

The UCLA team’s fusion of two techniques shows a promising path toward achieving those goals, said Kadambi, who is also an assistant professor of computer science and principal investigator on the research. As head of the Visual Machines Group at UCLA, he has written about different types of biases in medical devices and how to fix them.

In developing their new technology, the researchers first showed that the remote sensing device itself was the source of the bias. In their paper, they demonstrate that higher levels of melanin, the skin's natural pigment, interfere with what is known as the photoplethysmography, or PPG, signal used in current camera-based remote heart rate measurements.

PPG signaling is also used to measure heart rate through devices like pulse oximeters, which clamp onto a patient’s finger, as well as some wearable commercial products and smartwatch-powered apps. These devices emit light onto the skin and sense changes in the amount of light reflected back by circulating blood just below the surface. That reflected light produces the PPG signal, a measure of a patient’s heart rate.
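
To make the mechanism concrete, here is a minimal, illustrative sketch of camera-based PPG, not the UCLA team's actual pipeline: average the green channel over a face region in each video frame, then read the heart rate off the dominant frequency of that signal (the function name and band limits below are assumptions for illustration). Higher melanin levels absorb more of the light, shrinking the amplitude of this signal, which is the source of the bias the researchers identified.

```python
import numpy as np

def estimate_heart_rate_ppg(frames, fps):
    """Estimate heart rate (bpm) from a stack of RGB face-region frames.

    frames: array of shape (T, H, W, 3); fps: video frame rate.
    Blood-volume changes modulate skin color most strongly in the green
    channel, so we track mean green intensity per frame and pick the
    dominant frequency in a plausible pulse band (0.7-4 Hz, ~42-240 bpm).
    """
    signal = frames[:, :, :, 1].mean(axis=(1, 2))      # mean green value per frame
    signal = signal - signal.mean()                    # drop the DC offset
    spectrum = np.abs(np.fft.rfft(signal))             # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # frequency axis in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)             # plausible heart rates only
    peak_hz = freqs[band][np.argmax(spectrum[band])]   # strongest pulse frequency
    return peak_hz * 60.0                              # Hz -> beats per minute

# Synthetic check: 10 seconds of 30 fps "video" with a faint 1.2 Hz (72 bpm) pulse.
fps = 30
t = np.arange(10 * fps) / fps
frames = np.full((len(t), 8, 8, 3), 120.0)
frames[:, :, :, 1] += 0.5 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
print(estimate_heart_rate_ppg(frames, fps))  # prints ~72.0
```

Real remote-PPG pipelines add face tracking, illumination normalization and band-pass filtering, but the core idea, recovering a pulse frequency from tiny color fluctuations, is the same.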

Previous efforts to address skin tone biases in such technologies have generally looked to correct them through additional programming or by expanding baseline standards to include a more diverse range of skin tones. But neither approach targets the real issue, Kadambi said, which is the physics of the device itself.

The UCLA researchers instead turned to another technology that can estimate heart rate: radar. Operating at 77 gigahertz, radar can sense the subtle displacement of the chest caused by each heartbeat. While this method avoids the problem of skin tone bias, on its own it is less reliable than PPG signaling. The team found success by combining the two sensing modes, camera and radar, and refining them through machine learning to work in concert.
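
The paper pairs the radar stream with the camera through a learned fusion network; as rough intuition only for why combining modalities helps, the sketch below uses a simple signal-to-noise-weighted average of the two per-window heart rate estimates (the function name and weighting rule are illustrative assumptions, not the authors' method):

```python
def fuse_estimates(hr_camera, hr_radar, snr_camera, snr_radar):
    """Toy confidence-weighted fusion of camera and radar heart rate readings.

    Stands in for the learned fusion in the paper: weight each modality's
    estimate (bpm) by its signal-to-noise ratio, so when melanin absorption
    weakens the camera PPG signal (low snr_camera), the radar chest-motion
    estimate dominates, and vice versa when the radar return is noisy.
    """
    w_cam = max(snr_camera, 1e-6)  # guard against zero weights
    w_rad = max(snr_radar, 1e-6)
    return (w_cam * hr_camera + w_rad * hr_radar) / (w_cam + w_rad)

# A weak camera signal (e.g., strong melanin absorption or dim lighting)
# defers to the radar reading:
print(fuse_estimates(hr_camera=68.0, hr_radar=74.0, snr_camera=0.2, snr_radar=1.5))
# -> ~73.3 bpm, dominated by the more reliable modality
```

The point of the design is that neither sensor has to be trusted unconditionally: whichever modality is degraded for a given patient or environment contributes less to the final reading.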

In tests with 91 people, the researchers demonstrated that their camera-radar system outperforms camera-based remote PPG in both measurement accuracy and fairness across a wide variety of skin tones.

“Multimodal remote health care has the potential to make devices fairer not just across skin tones but across a diverse set of attributes, such as body mass index, gender and various health conditions,” said Alexander Vilesov, a UCLA electrical and computer engineering graduate student and a co-lead author of the paper. “Most of these aspects have not been thoroughly explored, and part of our future research seeks to understand such biases.”

The researchers suggested that such fairness-based improvements could be made to other types of technologies, such as thermal, acoustic, near-infrared and light polarization sensors.

“The COVID-19 pandemic revealed that new technologies are needed to allow doctors and care teams to remotely monitor their patients,” said study co-author Dr. Laleh Jalilian, a clinical assistant professor of anesthesiology and perioperative medicine at UCLA Health. “A key focus from the beginning of our collaboration was developing medical technology that performs fairly and with high accuracy across patients of diverse skin tones, as this will give doctors confidence that they can make high-quality medical decisions.” 

UCLA electrical and computer engineering graduate students Pradyumna Chari and Adnan Armouti are also co-lead authors of the paper. Other paper authors, all members of the Visual Machines Group, are UCLA electrical and computer engineering graduate students Anirudh Bindiganavale Harish, Kimaya Kulkarni and Ananya Deoghare.

The research was supported by a National Science Foundation CAREER award, an Army Young Investigator Award, a Cisco Research Award and the UCLA Health Innovation Fund.
