Snap a selfie, check for signs of pancreatic cancer

31 August 2017
News
Pancreatic cancer has one of the worst prognoses — with a five-year survival rate of 9 percent — in part because there are no telltale symptoms or non-invasive screening tools to catch a tumor before it spreads. An easy early warning system could significantly improve that survival rate. University of Washington researchers are developing an app that could allow people to easily screen for pancreatic cancer (and other diseases) — by snapping a smartphone selfie.


The app, called BiliScreen, uses a smartphone camera, computer vision algorithms and machine learning tools to detect increased bilirubin levels in a person’s sclera, the white part of the eye. The app is described in a paper that will be presented Sept. 13 at Ubicomp 2017, the Association for Computing Machinery’s International Joint Conference on Pervasive and Ubiquitous Computing.

New screening program

One of the earliest symptoms of pancreatic cancer, as well as other diseases, is jaundice. This yellow discoloration of the skin and eyes is caused by a buildup of bilirubin in the blood. The ability to detect signs of jaundice when bilirubin levels are minimally elevated — but before they’re visible to the naked eye — could enable an entirely new screening program for at-risk individuals.

“The problem with pancreatic cancer is that by the time you’re symptomatic, it’s frequently too late,” said lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering. “The hope is that if people can do this simple test once a month — in the privacy of their own homes — some might catch the disease early enough to undergo treatment that could save their lives.”

The results from an initial study are encouraging. In this study of 70 people, the BiliScreen app — used in conjunction with a 3-D printed box that controls the eye’s exposure to light — correctly identified cases of concern 89.7 percent of the time, compared to the blood test currently used to measure bilirubin levels. That blood test — which is typically not administered to adults unless there is reason for concern — requires access to a health care professional and is inconvenient for frequent screening.

Building upon earlier work

By the time people notice the yellowish discoloration in the sclera, bilirubin levels are already well past cause for concern. The UW team wondered if computer vision and machine learning tools could detect those color changes in the eye before humans can see them.

BiliScreen builds on earlier work from the UW’s Ubiquitous Computing Lab, which previously developed BiliCam. This smartphone app screens for newborn jaundice by taking a picture of a baby’s skin. A recent study in the journal Pediatrics showed BiliCam provided accurate estimates of bilirubin levels in 530 infants. In collaboration with UW Medicine doctors, the UbiComp lab specializes in using cameras, microphones and other components of common consumer devices — such as smartphones and tablets — to screen for disease.

BiliScreen is designed to be an easy-to-use, non-invasive tool that could help determine whether someone ought to consult a doctor for further testing. Beyond diagnosis, it could also potentially ease the burden on patients with pancreatic cancer who require frequent bilirubin monitoring.

Eyes are gateway to the body

“The eyes are a really interesting gateway into the body — tears can tell you how much glucose you have, sclera can tell you how much bilirubin is in your blood,” said senior author Shwetak Patel, the Washington Research Foundation Entrepreneurship Endowed Professor in Computer Science & Engineering and Electrical Engineering.  “Our question was: Could we capture some of these changes that might lead to earlier detection with a selfie?”

BiliScreen uses a smartphone’s built-in camera and flash to collect pictures of a person’s eye as they snap a selfie. The team developed a computer vision system that automatically isolates the white parts of the eye. The app then calculates color information from the sclera — based on the wavelengths of light that are being reflected and absorbed — and correlates it with bilirubin levels using machine learning algorithms.
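The paper itself is not quoted here and the team's actual models are more sophisticated, but the general shape of such a pipeline — mask off the sclera pixels, average their color, then feed those color features into a learned regression — can be sketched in plain Python. Everything below (the toy image, the mask, and the linear weights) is hypothetical and purely illustrative:

```python
def extract_sclera_features(eye_rgb, mask):
    """Average RGB color over the sclera pixels selected by the mask.

    eye_rgb is a 2-D grid of (r, g, b) tuples; mask is a matching
    2-D grid of booleans marking which pixels belong to the sclera.
    """
    sclera = [px for row, mrow in zip(eye_rgb, mask)
              for px, keep in zip(row, mrow) if keep]
    n = len(sclera)
    return tuple(sum(px[c] for px in sclera) / n for c in range(3))

def estimate_bilirubin(features, weights, bias):
    """Linear model standing in for BiliScreen's learned regression."""
    return sum(f * w for f, w in zip(features, weights)) + bias

# Toy 2x4 "eye": left half is a slightly yellow sclera (high red/green
# relative to blue), right half is iris/skin excluded by the mask.
eye = [[(0.9, 0.85, 0.6), (0.9, 0.85, 0.6),
        (0.3, 0.2, 0.1), (0.3, 0.2, 0.1)]] * 2
mask = [[True, True, False, False]] * 2

features = extract_sclera_features(eye, mask)   # mean sclera color
# Hypothetical weights: a yellow cast (strong R/G, weak B) raises the score.
level = estimate_bilirubin(features, (2.0, 2.0, -3.0), 0.5)
```

In a real system the mask would come from a segmentation model, the features would be far richer than a single mean color, and the regression weights would be fit on paired smartphone photos and blood-test bilirubin readings.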

Technology has promise

“This relatively small initial study shows the technology has promise,” said co-author Dr. Jim Taylor, a professor in the UW Medicine Department of Pediatrics whose father died of pancreatic cancer at age 70. The next steps for the research team include testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements — including removing the need for accessories like the box and glasses.