How AI-enhanced imaging technology helps better detect and treat breast cancer
Researchers say technology makes breast cancer cells ‘glow’ in MRIs

Researchers from the University of Waterloo recently developed an AI-optimized imaging technology that makes breast cancer cells glow light green in MRIs, making them easier to spot.
Amy Tai, a computer science PhD student, is part of the research team. She says the technology, originally used to improve prostate cancer detection, will not only improve breast cancer detection but also help determine the best treatment plans for patients.
Tai joined CBC Kitchener-Waterloo's The Morning Edition with host Craig Norris to share more on their new imaging technology.
This interview has been edited for length and clarity.
CRAIG NORRIS: Explain this process — it's the correlated diffusion imaging, correct?
AMY TAI: That's correct, yes.
NORRIS: How does it detect cancerous tissue?
TAI: It's a new form of MRI. MRI is basically a test that uses strong magnets and radio waves to take detailed pictures of the inside of your body. And it's really popular because it doesn't use radiation like X-rays and CT scans.
Now, to get into the details, synthetic correlated diffusion imaging, or CDIs for short, is specifically optimized using AI to make different tissues more distinguishable from each other.
If we think about density, for example, breast tissue has different layers of density. Think about when you go to a grocery store: some aisles are more packed, some aisles have fewer people.
So in healthy tissue, the aisles are more open and water can roam around more freely. But in cancerous tissue, the cells grow more densely and more irregularly, because in those regions there are a lot more cells and they're growing very fast. If you think about the grocery store, there are a lot more people there, so it's hard for the water to get through.
Our technology leverages that difference to really differentiate what the cancerous tissue looks like versus the healthy tissue.
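To make the grocery-store analogy a bit more concrete, here is a minimal, hypothetical Python sketch of the underlying diffusion-MRI idea: signal falls off with the diffusion weighting (the b-value) roughly as S(b) = S0·exp(−b·ADC), so densely packed tissue that restricts water movement keeps more of its signal at high b-values, and mixing the signals across b-values can make those regions stand out. The numbers, the mono-exponential model and the simple product used here are illustrative assumptions, not the research team's actual AI-optimized CDIs formulation.

```python
import numpy as np

def simulate_dwi_signal(adc, b_values, s0=1.0):
    """Mono-exponential diffusion MRI model: S(b) = S0 * exp(-b * ADC).
    Lower ADC (restricted water movement, e.g. densely packed cells)
    means the signal decays more slowly as b increases."""
    return s0 * np.exp(-np.outer(b_values, adc))

# Hypothetical 1-D "tissue" with a dense, cancer-like region in the middle:
# healthy tissue lets water diffuse freely (high ADC), the dense region restricts it (low ADC).
adc = np.full(100, 2.0e-3)      # mm^2/s, freely diffusing tissue
adc[40:60] = 0.7e-3             # restricted diffusion in the dense region

b_values = np.array([0, 500, 1000, 1500])      # s/mm^2
signals = simulate_dwi_signal(adc, b_values)   # shape: (n_b_values, n_voxels)

# Toy "correlated diffusion"-style map: multiply the signals across b-values so
# voxels that keep their signal at high b (restricted diffusion) are amplified,
# then normalise to [0, 1] for display as an overlay.
cdi_like = np.prod(signals[1:], axis=0)
cdi_like = (cdi_like - cdi_like.min()) / (cdi_like.max() - cdi_like.min())

print("mean highlight in dense region:", cdi_like[40:60].mean().round(2))
print("mean highlight in healthy region:", cdi_like[:40].mean().round(2))
```

In this toy version, the dense region ends up near 1 in the normalized map while the freely diffusing tissue ends up near 0; that is the kind of contrast that would then be shown as a coloured overlay on the scan.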
NORRIS: So, it automatically senses, 'Oh, something's not right here.'
TAI: Exactly, yeah.

NORRIS: What makes it turn green?
TAI: The green is just the colour that we chose; what matters is the intensity. So we could highlight it in red or a different colour instead. But we just chose green because we thought it would be more neutral to the eye as well.
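As a rough illustration of what Tai describes, the display step can be as simple as rendering a model's intensity map as a semi-transparent colour layer on top of the grayscale MRI, with the colour choice being arbitrary. The arrays and the matplotlib overlay below are a hypothetical sketch, not the team's actual visualization code.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data standing in for a grayscale MRI slice and the model's
# intensity map (values in [0, 1], higher = more suspicious tissue).
rng = np.random.default_rng(0)
mri_slice = rng.normal(0.5, 0.1, size=(128, 128)).clip(0, 1)
intensity = np.zeros((128, 128))
intensity[50:80, 60:90] = 0.9   # pretend the model flagged this region

# Build an RGBA overlay that is pure green, with opacity driven by intensity.
# Swapping the colour just means putting the 1.0 in a different RGB channel.
overlay = np.zeros((128, 128, 4))
overlay[..., 1] = 1.0              # green channel
overlay[..., 3] = intensity * 0.6  # alpha: stronger signal -> more visible

plt.imshow(mri_slice, cmap="gray")
plt.imshow(overlay)
plt.axis("off")
plt.title("Green overlay: intensity of suspected cancerous tissue")
plt.show()
```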
NORRIS: Yeah, that's a good point. Your study found that the imaging can lead to more effective treatment as well. Explain that.
TAI: Yeah, of course. So the images themselves contain really key predictive information that helps clinicians determine where the tumour boundary, the margin, is. It's more precise for surgeons, for example, when they're trying to extract the tumour, because they know exactly where the margin of the tumour is.
And that really helps to carefully limit the amount of tissue removed, or to ensure that all the cancerous tissue is removed the first time, so a second operation isn't necessarily needed.
NORRIS: How did you go about testing this?
TAI: We did a retrospective study where we used the pre-treatment images of more than 350 patients at 10 different medical institutions, collected in a study by the American College of Radiology Imaging Network.
NORRIS: Were they MRIs as well?
TAI: They were all MRIs, yeah.
NORRIS: Now, I understand that you started using the imaging technology on prostate cancer patients. How is that going?
TAI: That's going really well. So that was initially developed by my supervisor, Alex, and an amazing, talented previous master's student, Hayden. It's currently under retrospective study and we're testing it, trying to evaluate its performance, stability and functionality in different environments as well.
NORRIS: So give us a sense: do you foresee what other cancers this could be used for?
TAI: Oh yes, absolutely. So we've already illustrated this great potential for prostate, and now we're seeing promising results for breast.
And head and neck cancers might potentially be good candidates for this, because they have characteristics similar to breast and prostate cancers, where density and so forth would be really important in helping to delineate or differentiate between cancerous and healthy tissue.
NORRIS: Is there something that AI can do that humans can't? You know, like doctors looking at it themselves?
TAI: Absolutely. I mean, the AI really helps to scale the differences, right? So it's looking at, for example in this case, 350 patients. It can do it really quickly, and it can really optimize for these specific nuances and characteristics. Whereas for the human eye, or even for human annotators, it would take a lot more time and a lot more effort to do.
NORRIS: What made you want to do this? What started all this for you?
TAI: So when I graduated from undergrad, I really wanted to make a difference in the medical field. And growing up, cancer was a huge part of my life, in terms of my family life.
So I really wanted to do something to help people in those positions, and also to help mitigate the effects of cancer overall: catch it faster, catch it sooner, make the treatment more effective and so forth.
NORRIS: Non-PhD students like me, we always hear about these things. This show has been really fortunate, we have three world-class universities in our listening area, and we always hear about these amazing things, right? We always wonder: down the road, when would this be available to everyone? How does that happen, that this can be broadened out?
TAI: Yeah, absolutely. So a huge part of it, I think, is that first you've got to test it retrospectively, right? Make sure that it's actually feasible in an offline setting. And then that's when we talk about next steps for working with connections to bring it online.
But of course, during this entire process, we've been partnering with pathologists and doctors to make sure that the technology is feasible, that it's something they would want to use and that it could be easily incorporated into their clinical workflow, before we develop even more accurate tests and so forth and then integrate it.
NORRIS: And what has been the reaction, just anecdotally, from pathologists?
TAI: Oh, it's been really positive. So we've been presenting this work at several AI conferences ... We've been seeing really positive feedback from the medical community.
NORRIS: So, what's next for you?
TAI: Next steps, [we] are going to try to broaden it to other types of cancer, and also look at ways to potentially integrate this technology into the clinical workflow and work directly with these pathologists and doctors.