What if we could detect a tumor before it was even visible on a mammogram? Regina Barzilay is working to do just that. The Massachusetts Institute of Technology professor is using machine learning to support earlier diagnosis. She is collaborating with physicians at Massachusetts General Hospital: Alphonse Taghian, chief of breast radiation oncology; Kevin Hughes, co-director of the Avon Comprehensive Breast Evaluation Center; and Constance Lehman, chief of the breast imaging division. Together, the group is analyzing patient data to make detection and diagnosis more precise.
Of the 1.7 million people diagnosed with cancer in the U.S. every year, only three percent participate in clinical trials. As it stands, advancements in treatment depend on that tiny fraction of patients. Barzilay is developing tools to expand this research base. Using natural language processing (NLP), she and her students have organized clinical data from 108,000 patient reports into a database with an accuracy rate of 98 percent. The next step is to incorporate treatment outcomes into their findings.
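To give a flavor of what extracting structured data from free-text reports involves, here is a minimal sketch, not the MIT/MGH system itself. It uses a generic bag-of-words classifier (scikit-learn) on invented example reports and labels; the report texts, label names, and model choice are all assumptions made for illustration.

```python
# Illustrative sketch only: classify free-text pathology report snippets into
# structured categories with a simple bag-of-words model.
# The reports and labels below are invented for demonstration purposes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: report text -> diagnosis label (hypothetical).
reports = [
    "Invasive ductal carcinoma, grade 2, measuring 1.2 cm.",
    "Ductal carcinoma in situ, low grade, no invasion identified.",
    "Benign fibroadenoma, no evidence of malignancy.",
    "Invasive lobular carcinoma with lymphovascular invasion.",
    "Atypical ductal hyperplasia, no carcinoma seen.",
]
labels = ["IDC", "DCIS", "benign", "ILC", "benign"]

# TF-IDF features plus logistic regression: a simple stand-in for the NLP
# models used to turn narrative reports into database fields.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(reports, labels)

new_report = "Invasive ductal carcinoma identified, grade 3."
print(model.predict([new_report])[0])  # predicted label for the new report
```

In practice, a system of this kind would be trained on many thousands of annotated reports and would extract multiple fields (tumor type, grade, size, receptor status) rather than a single label, but the basic pattern of mapping narrative text to database entries is the same.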
Just as Facebook and Instagram use algorithms to show you ads tailored to your personal interests, machines can develop insight from collected data, spotting subtle abnormalities in scans that a physician cannot detect. Working with Lehman and graduate student Nicolas Locascio, Barzilay is using deep learning to automate the interpretation of mammograms. Their hope is to identify a tumor before it is detectable by the human eye and to predict which patients are susceptible to recurrence after their first treatment.
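As a rough illustration of how deep learning is applied to imaging, the sketch below defines a small convolutional network that maps a grayscale image patch to a malignancy score. It is not the actual MIT/MGH model; the input size, layer sizes, and random stand-in data are assumptions made for the example.

```python
# Illustrative sketch only: a small convolutional network that scores a
# grayscale mammogram patch for suspected malignancy.
import torch
import torch.nn as nn

class MammogramCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Two convolution/pooling stages extract local image features.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # A small fully connected head produces a single logit per patch.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# A fake batch of 4 grayscale 128x128 patches stands in for real mammograms.
model = MammogramCNN()
patches = torch.randn(4, 1, 128, 128)
scores = torch.sigmoid(model(patches))
print(scores.squeeze())  # per-patch scores; arbitrary here since the model is untrained
```

A real system would be trained on large sets of labeled mammograms and would typically operate on full-resolution images with far deeper architectures, but the core idea of learning image features that flag suspicious tissue is the same.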
“These innovations will make a really big difference,” Barzilay told MIT News. “It is an entry point. There is so much to do. We are just getting started.”