A team of Stanford University researchers is on track to give smartphones the power to diagnose skin cancer. In a study published in Nature, the researchers used a Google-developed algorithm to identify and diagnose more than 2,000 skin diseases from 129,450 clinical images.
In the U.S. alone, about 5.4 million new cases of skin cancer are diagnosed every year, racking up bills that total around $8 billion. Although the survival rate is nearly 100 percent when melanoma is detected early, it drops to around 14 percent when the disease is caught late. To make matters more complex, roughly 10 lesions are biopsied for every melanoma found. Deep learning, a kind of artificial intelligence modeled loosely on the brain's neural networks, could make visual diagnosis not only more efficient but also cheaper and faster.
The pre-existing algorithm had already been trained to classify 1.28 million images across 1,000 object categories. The researchers teamed up with dermatologists at Stanford Medicine and with Helen M. Blau, PhD, Stanford professor of microbiology and immunology, to classify the nearly 130,000 images of skin lesions provided by the University of Edinburgh and the International Skin Imaging Collaboration Project. They then tested the algorithm against 21 board-certified dermatologists, ultimately showing that it could identify both common and deadly skin cancers on par with the experts.
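The approach described above, reusing a network already trained on general object photos and retraining only its final classification layer on medical images, is commonly called transfer learning. The sketch below illustrates that idea in miniature with NumPy: a frozen "pretrained" feature extractor stays fixed while a new classification head is trained on labeled examples. The tiny image size, feature dimension, class count, and synthetic data are all assumptions for illustration, not details from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the pretrained network: a frozen feature extractor.
# In the study this is a deep convolutional network trained on 1.28
# million ImageNet photos; here it is just a fixed random projection,
# so everything below is an illustrative sketch, not the real pipeline.
IMG_DIM, FEAT_DIM, NUM_CLASSES = 32 * 32, 64, 3  # toy sizes (assumptions)
W_frozen = rng.normal(size=(IMG_DIM, FEAT_DIM))

def extract_features(images):
    """Frozen 'pretrained' backbone: flattened images -> feature vectors."""
    return np.tanh(images @ W_frozen)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Synthetic labeled "lesion" images whose classes are a learnable
# function of the frozen features (toy stand-in for the ~130,000 images).
X = rng.normal(size=(300, IMG_DIM))
W_true = rng.normal(size=(FEAT_DIM, NUM_CLASSES))
feats = extract_features(X)
y = (feats @ W_true).argmax(axis=1)

# Fine-tune ONLY the new classification head (the backbone stays frozen),
# using full-batch gradient descent on the cross-entropy loss.
W_head = np.zeros((FEAT_DIM, NUM_CLASSES))
onehot = np.eye(NUM_CLASSES)[y]
for _ in range(200):
    probs = softmax(feats @ W_head)
    W_head -= 0.5 * feats.T @ (probs - onehot) / len(X)

acc = (softmax(feats @ W_head).argmax(axis=1) == y).mean()
print(f"training accuracy after fine-tuning the head: {acc:.2f}")
```

Freezing the backbone is what makes the technique practical for medicine: the scarce, expensive labeled medical images are spent only on the small final layer rather than on learning visual features from scratch.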
“Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera,” said Andre Esteva, co-lead author of the study. “What if we could use it to visually screen for skin cancer?” The researchers believe that, after further testing, the algorithm could soon be used on smartphones to diagnose skin cancer and other diseases.