Apps help make medical diagnoses, but they’re still a work in progress


The same devices used to take selfies are being repurposed and marketed for quick access to information needed to monitor patient health. Pressing a finger against the phone’s camera lens can measure your heart rate. A microphone, kept next to the bed, can check for sleep apnea.

In the best of this new world, data is transmitted remotely to a medical professional for the convenience and comfort of the patient – all without the need for expensive hardware.

But the use of smartphones as diagnostic tools is still a work in progress. Although doctors and their patients have had some success in the real world, experts said their overall potential remains unrealized and uncertain.

Smartphones come packed with sensors capable of monitoring a patient’s vital signs. They can help evaluate people for concussions, watch for atrial fibrillation, and perform mental health checks, to name just a few uses of emerging apps.

Enthusiastic companies and researchers are taking advantage of phones’ built-in cameras and light sensors; microphones; accelerometers, which detect body movements; gyroscopes; and even speakers. The apps then use artificial intelligence software to analyze the collected sights and sounds to create easy communication between patients and doctors. In 2021, more than 350,000 digital health products were available on app stores, according to a Grand View Research report.

“It is very difficult to put devices in the patient’s home or in the hospital, but everyone is just walking around with a cellphone that has a network connection,” said Andrew Justin, a physician and CEO of the sensor network company Artisight. Most Americans own smartphones, including more than 60 percent of people age 65 and older, according to the Pew Research Center. The pandemic has also made people more comfortable with virtual care.

The makers of some of these products have sought clearance from the Food and Drug Administration to market them as medical devices. Others are exempt from that regulatory process, placed in the same clinical classification as a Band-Aid. But the way the agency handles medical devices based on artificial intelligence and machine learning is still being revised to reflect the adaptive nature of the software.

Ensuring accuracy and clinical validation is critical to securing acceptance by healthcare providers. Many tools still need fine-tuning, said Eugene Yang, a clinical professor of medicine at the University of Washington.

Judging these new technologies is difficult because they rely on algorithms generated by machine learning and artificial intelligence to gather data, rather than the physical tools typically used in hospitals. So researchers can’t “compare apples to apples” with medical industry standards, Yang said. He added that failure to build in such assurances could undermine the technology’s goals of cutting costs and expanding access, because a clinician must still verify the results.

Big tech companies like Google have invested heavily in the field, catering to doctors and home caregivers as well as consumers. Currently, Google Fit users can check their heart rate by placing a finger on the rear camera lens or track their breathing rate using the front camera.

Google’s research uses machine learning and computer vision, a field within artificial intelligence that relies on information from visual inputs such as videos or images. Instead of using a blood pressure cuff, for example, an algorithm can interpret subtle visual changes to the body that act as proxies and biosignals for blood pressure, said Shwetak Patel, director of health technologies at Google and a professor of electrical and computer engineering at the University of Washington.

Google is also studying the effectiveness of its smartphone’s built-in microphone in detecting heartbeats and murmurs, and using the camera to preserve eyesight by screening for diabetic eye disease, according to company information published in 2022.

The tech giant recently bought Sound Life Sciences, a Seattle startup with an FDA-cleared sonar technology app. It uses a smart device’s speaker to bounce inaudible pulses off a patient’s body to identify movement and monitor breathing. Another company, which is based in Israel, also uses a smartphone camera to calculate vital signs. Its software studies the region around the eyes and analyzes light reflected from the blood vessels back to the lens, said company spokeswoman Mona Pobelian-Yona.

Apps reach specialties like optometry and mental health:

  • Using the microphone, Canary Speech applies the same underlying technology as Amazon’s Alexa to analyze patients’ voices for mental health conditions. The software can integrate with telemedicine appointments and allow clinicians to screen for anxiety and depression using a library of audio biomarkers and predictive analytics, said Henry O’Connell, the company’s chief executive.
  • Australia-based ResApp Health obtained FDA clearance in 2022 for an iPhone app that screens for moderate to severe obstructive sleep apnea by listening to breathing and snoring. SleepCheckRx, which requires a prescription, is minimally invasive compared with the sleep studies currently used to diagnose sleep apnea.
  • Brightlamp’s Reflex app is a clinical decision support tool for helping manage concussions and visual rehabilitation, among other things. Using an iPad or iPhone camera, the mobile app measures how a person’s pupils react to changes in light. Through machine learning analysis, the images give practitioners data points for evaluating patients. Brightlamp sells directly to healthcare providers and is used in more than 230 clinics. Doctors pay a standard annual fee of $400 per account, which is not covered by insurance. The Department of Defense has a clinical trial underway using Reflex.

Brightlamp CEO Curtis Sloss said that in some cases, as with the Reflex app, the data is processed directly on the phone rather than in the cloud. By processing everything on the device, the app sidesteps the privacy issues that arise when data is streamed elsewhere, which requires patient consent.

But algorithms need to be trained and tested by collecting reams of data, and that is an ongoing process.

Researchers, for example, have found that some computer vision applications, including some for monitoring heart rate and blood pressure, can be less accurate for people with darker skin. Studies are ongoing to find better solutions.

“We’re not there yet,” Yang said. “That’s the whole point.”

This article was produced by Kaiser Health News, a program of the Kaiser Family Foundation, a nonprofit organization that provides information on health issues to the nation.
