Feb. 20 (UPI) -- Google has developed an artificial intelligence algorithm that can assess someone's risk for heart disease by looking at their retinas.
The tech giant says its method is as reliable as a doctor conducting a blood test, according to a study published Monday in the journal Nature Biomedical Engineering.
Google's software, developed with its health-tech subsidiary Verily, analyzes retinal scans to infer data, including an individual's age, blood pressure and whether they smoke, and uses those factors to assess risk. The retinal characteristics it draws on include a variety of "features, patterns, colors, values and shapes," Google said.
"The caveat to this is that it's early, [and] we trained this on a small data set," Lily Peng, lead researcher on the project, told USA Today. "We think that the accuracy of this prediction will go up a little bit more as we kind of get more comprehensive data. Discovering that we could do this is a good first step. But we need to validate."
Google and Verily's scientists in Mountain View, Calif., used machine learning to train models on a medical dataset of 284,335 patients, then validated them on two independent datasets of 12,026 and 999 patients.
"We predicted cardiovascular risk factors not previously thought to be present or quantifiable in retinal images, such as age, gender, smoking status, systolic blood pressure and major adverse cardiac events," researchers wrote in the study. "We also show that the trained deep-learning models used anatomical features, such as the optic disc or blood vessels, to generate each prediction."
The rear interior wall of the eye, called the fundus, contains blood vessels that reflect the body's overall health.
With 70 percent accuracy, Google researchers were able to predict who had a cardiovascular event in the past five years and who didn't. This compares well with SCORE, a widely used risk-scoring method that requires a blood test and predicts correctly 72 percent of the time.
Google also generated a "heatmap," a graphical overlay showing which pixels in an image mattered most for a specific risk prediction. For example, when predicting blood pressure, the algorithm paid more attention to blood vessels.
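The study itself generated these attention maps with deep-learning techniques; as a rough illustration of the general idea, here is a minimal occlusion-sensitivity sketch, a hypothetical stand-in rather than Google's actual method. It masks each patch of an image in turn and records how much a model's score drops, so patches the model relies on light up in the resulting heatmap. The `toy_score` function is an invented placeholder for a real model.

```python
import numpy as np

def occlusion_heatmap(image, score_fn, patch=4):
    """Occlusion-based saliency sketch: zero out each patch and
    record how much the model's score drops. Bigger drop = the
    model depended more on that patch."""
    h, w = image.shape
    heat = np.zeros((h // patch, w // patch))
    base = score_fn(image)  # score on the unmodified image
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = 0.0  # occlude one patch
            heat[i // patch, j // patch] = base - score_fn(masked)
    return heat

# Toy "model" (a placeholder, not a real predictor): its score is the
# mean brightness of one fixed region, standing in for a vessel area.
def toy_score(img):
    return img[4:8, 4:8].mean()

img = np.zeros((12, 12))
img[4:8, 4:8] = 1.0  # bright region the toy model cares about
heat = occlusion_heatmap(img, toy_score, patch=4)
# the heatmap cell covering the bright region gets the highest importance
```

A real pipeline would apply the same loop (or a gradient-based attention method, as in the paper) to a trained network and overlay `heat` on the retinal image, which is how one sees that blood-pressure predictions track the vessels.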
"Pattern recognition and making use of images is one of the best areas for AI right now," said Harlan M. Krumholz, a professor of medicine and director of Yale's Center for Outcomes Research and Evaluation. "And this is going to come from photographs and sensors and a whole range of devices that will help us essentially improve the physical examination and I think more precisely hone our understanding of disease and individuals and pair it with treatments."
Google's Peng predicts it will be years before doctors can use the retinal technology, but says "it's not just when it's going to be used, but how it's going to be used."
Aside from assessing heart disease risk, Peng hopes the technology can be applied in other areas, perhaps even cancer research.
"I am very excited about what this means for discovery," Peng said. "We hope researchers in other places will take what we have and build on it."
As part of its medical research, Google is collecting medical records of 10,000 individuals over four years.
"They're taking data that's been captured for one clinical reason and getting more out of it than we currently do," Luke Oakden-Rayner, a medical researcher at the University of Adelaide who specializes in machine learning analysis, told The Verge. "Rather than replacing doctors, it's trying to extend what we can actually do."