Selfies to self-diagnosis: Algorithm ‘amps up’ smartphones to diagnose disease


Accessible, connected, and computationally powerful, smartphones aren’t just for “selfies” anymore. They have emerged as powerful evaluation tools capable of diagnosing medical conditions in point-of-care settings. Smartphones also are a viable solution for health care in the developing world because they allow untrained users to collect and transmit data to medical professionals.

Image: Images of a diagnostic assay are captured using a smartphone camera; regions of interest are extracted and converted to HSV (hue, saturation, value) space. Credit: Florida Atlantic University.

Although smartphone camera technology today offers a wide range of medical applications such as microscopy and cytometric analysis, in practice, cell phone image tests have limitations that severely restrict their utility. Addressing these limitations has typically required external hardware attached to the phone to obtain quantitative results, imposing a design tradeoff between accessibility and accuracy.

Researchers from Florida Atlantic University’s College of Engineering and Computer Science have developed a novel cell phone imaging algorithm that enables analysis of assays typically evaluated via spectroscopy, a technique that normally relies on sophisticated laboratory instruments.

Through the analysis of more than 10,000 images, the researchers have demonstrated that the saturation method they developed consistently outperformed existing algorithms under a wide range of operating field conditions. Their findings, published in Analyst, a journal of the Royal Society of Chemistry, are a step forward in developing point-of-care diagnostics by reducing the need for additional equipment, improving the limit of detection, and increasing the precision of quantitative results.

“Smartphone cameras are optimized for image appearance rather than for quantitative image-based measurements, and they can’t be bypassed or reversed easily. Furthermore, most lab-based biological and biochemical assays still lack a robust and repeatable cell phone analogue,” said Waseem Asghar, Ph.D., lead author and an assistant professor in FAU’s Department of Computer and Electrical Engineering and Computer Science. “We have been able to develop a cell phone-based image preprocessing method that produces a mean pixel intensity with smaller variances, lower limits-of-detection, and a higher dynamic range than existing methods.”

For the study, Asghar and co-authors Benjamin Coleman and Chad Coarsey, graduate students in the Asghar Laboratory in FAU’s College of Engineering and Computer Science, performed image capture using three smartphones: the Android Moto G with a 5-megapixel (MP) camera, the iPhone 6 with a 12 MP camera, and the Samsung Galaxy S7 Edge with a 12 MP camera.

They tested image capture under various conditions, measured algorithm performance, assessed sensitivity to camera distance, tilt and motion, and examined histogram properties and concentration response. They also examined the limit of detection as well as the properties of saturation, ambient lighting levels and the relationship with red-green-blue (RGB) color space. Cell phone images are natively stored as arrays of RGB pixel intensities, commonly referred to as color channels.
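As a rough illustration of the kind of preprocessing the article describes, the sketch below (not the authors’ published code; the file name, region-of-interest coordinates and OpenCV/NumPy stack are assumptions) crops an assay spot from a photo, converts it to HSV space, and reports the mean of the saturation channel:

import cv2
import numpy as np

def mean_saturation(image_path, roi):
    """roi = (x, y, w, h) of the assay spot; the path and ROI here are hypothetical."""
    img = cv2.imread(image_path)                  # OpenCV loads images as BGR
    x, y, w, h = roi
    patch = img[y:y + h, x:x + w]                 # crop the region of interest
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)  # convert to hue/saturation/value
    sat = hsv[:, :, 1].astype(np.float64)         # saturation channel (0-255 in OpenCV)
    return sat.mean()                             # mean pixel intensity of saturation

print(mean_saturation("assay_photo.jpg", (120, 80, 60, 60)))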

Using several thousand images, the researchers compared saturation analysis with existing RGB methods and found that it both analytically and empirically improved performance in the presence of additive and multiplicative ambient light noise. They also showed that saturation analysis can be interpreted as an optimized version of existing RGB ratio tests. They verified that the ideal image capture conditions include constant white light, a clean white background, minimal distance to the sample and zero angular displacement of the camera.
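One way to see why saturation can be read as an optimized RGB ratio, and why it resists multiplicative ambient-light changes, is that the HSV saturation of a pixel is 1 minus the ratio of its smallest to largest color channel, so a uniform scaling of all three channels cancels out. The toy example below is illustrative only, using a made-up pixel value:

def saturation(r, g, b):
    # HSV saturation of one pixel: 1 - min/max of the color channels
    mx, mn = max(r, g, b), min(r, g, b)
    return 0.0 if mx == 0 else 1.0 - mn / mx

pixel = (180, 60, 40)                        # made-up assay-spot color
dimmer = tuple(0.5 * c for c in pixel)       # same spot under half the illumination
print(saturation(*pixel), saturation(*dimmer))   # prints the same value twice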

Asghar, Coleman and Coarsey also applied the test to an ELISA (enzyme-linked immunosorbent assay), a plate-based assay technique designed for detecting and quantifying substances such as peptides, proteins, antibodies and hormones. They found that for HIV detection, saturation analysis enabled an equipment-free evaluation with a limit of detection significantly lower than what is currently achievable with RGB methods.
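The article does not spell out how the limit of detection was computed, but a common convention is the analyte concentration whose signal exceeds the mean blank reading by three standard deviations, read off a calibration curve. The sketch below uses that convention with entirely hypothetical numbers:

import numpy as np

# Hypothetical mean-saturation readings of blank (zero-analyte) samples
blank = np.array([0.110, 0.115, 0.108, 0.112, 0.109])
slope = 0.045                                            # hypothetical calibration-curve slope (signal per unit concentration)

lod_signal = blank.mean() + 3 * blank.std(ddof=1)        # 3-sigma threshold above the blank
lod_concentration = (lod_signal - blank.mean()) / slope  # convert the threshold back to a concentration
print(f"estimated LOD: {lod_concentration:.3f} concentration units")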

The FAU-developed methodology represents an improvement in repeatability, practicality, and rejection of image-capture noise. In addition, saturation analysis is not affected by many of the major limiting factors for image-based tests, such as ambient lighting variations and shading. The researchers anticipate that the favorable properties of saturation analysis will encourage and enable cell phone image-based point-of-care tests with less equipment overhead and lower limits of detection.

“The research taking place in the Asghar Laboratory at Florida Atlantic University has important implications for diagnostic medicine and the delivery of health care in developed as well as developing countries,” said Stella Batalama, Ph.D., dean of FAU’s College of Engineering and Computer Science. “Professor Asghar and his team are driven to continue to develop cutting-edge technology that has the ability to remotely detect and diagnose diseases rapidly, accurately and inexpensively. This latest algorithm they have developed is one of the many advances they are making in this field.”
