Statistical Modelling and its practical uses in cancer imaging
(Posted: 22 June 2015)
Cancer has something of a sweet tooth: tumours have an enormous need for energy and tend to hog available sugars to fuel their growth. But this relative greed can also help doctors to find them: PET scans that pinpoint hotspots of sugar uptake in the body can highlight the locations and aggressiveness of tumours, explains Dr James Sweeney, a Lecturer in Statistics at UCD School of Mathematical Sciences.
“In Positron Emission Tomography (PET), a glucose substrate laced with a radioactive isotope is injected into the bloodstream, and cancerous tissue hoards the glucose,” he says. “So if the patient is scanned in a PET medical imaging machine over a period of, say, 90 minutes, it is possible to infer, from the resulting series of medical images, the regions of the body where cancerous growths are prevalent, and their vigour.”
Such scans can help avoid extremely invasive procedures to remove benign lumps, or tumours that are growing so slowly that they pose no threat and have little or no impact on quality of life, notes Dr Sweeney. But there are caveats.
“PET imaging itself can be invasive, as it requires samples of arterial blood, which is a painful and invasive procedure for the patient,” he explains. “The patient also needs to lie stationary in a machine for around 90 minutes in order to get the required number of images.”
That’s why Dr Sweeney and clinical collaborators are looking to statistics to take the pain out of PET scans. Using statistical modelling techniques, he wants to bypass the need for an arterial blood sample. “Instead of taking invasive samples of arterial blood to assess glucose concentration, we can harness statistical tools to try to infer this from the medical images themselves, with the tradeoff being increased uncertainty in our estimates of concentration,” he explains.
He is also looking to speed up the scans, thereby reducing the ‘lying still’ time for each patient, and with the aim of cutting down on waiting times for scans.
“We are also trying to reduce the time it takes for each scan down to 15 minutes,” he says. “At present, this results in a noisy series of images, which are of limited value. But by incorporating prior information from past studies of large cohorts of patients and using Bayesian statistical methods, we can dramatically increase the value of these limited scans, rendering them nearly comparable to the full studies. This will hugely increase the throughput of patients and make better use of the machines themselves.”
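The idea of combining a cohort-derived prior with a short, noisy scan can be illustrated with a toy Bayesian calculation. This is only a sketch, not Dr Sweeney's actual model: it assumes a hypothetical tissue uptake rate, a normal prior centred on an assumed cohort average, and normally distributed measurement noise, so the posterior has a simple closed form.

```python
import numpy as np

# Conjugate normal-normal update: a prior on a tissue uptake rate
# (from past cohort studies, values assumed here for illustration)
# is combined with a few noisy frames from a shortened scan.

def posterior_normal(prior_mean, prior_var, obs, obs_var):
    """Posterior mean and variance for a normal mean with known noise variance."""
    n = len(obs)
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + obs.sum() / obs_var)
    return post_mean, post_var

rng = np.random.default_rng(0)
true_rate = 0.12                                   # hypothetical uptake rate
short_scan = true_rate + rng.normal(0.0, 0.05, 5)  # five noisy measurements

# Prior centred at an assumed cohort average of 0.10 (sd 0.02)
post_mean, post_var = posterior_normal(0.10, 0.02**2, short_scan, 0.05**2)
```

The posterior variance is smaller than both the prior variance and the variance of the raw sample mean, which is the sense in which the prior "adds value" to a short, noisy scan.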
And interestingly, the underlying statistical methods that Dr Sweeney uses to improve medical imaging could also help us to fight fraud. “For statistical analysis of medical images you need tools to deal with the huge volume of information provided by machines, so it is essentially a dimension reduction problem,” he explains. “Similarly, in fraud detection, companies have huge volumes of information on client transactions, and attempting to detect patterns and identify the important variables is an analogous task.”
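One standard dimension-reduction tool of the kind described is principal component analysis (PCA). The sketch below, using simulated data rather than any real images or transactions, shows how a dataset with 50 correlated variables can be summarised by a handful of components.

```python
import numpy as np

# Simulated data: each row is one "image" (or one client's transaction
# record) with 50 variables that are really driven by 3 latent factors
# plus a little noise.
rng = np.random.default_rng(1)
n_samples, n_latent, n_vars = 200, 3, 50
latent = rng.normal(size=(n_samples, n_latent))
mixing = rng.normal(size=(n_latent, n_vars))
X = latent @ mixing + 0.1 * rng.normal(size=(n_samples, n_vars))

# PCA via the singular value decomposition of the centred data
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s**2) / np.sum(s**2)  # proportion of variance per component
```

Because the 50 variables share only three underlying sources of variation, the first three principal components capture almost all of the variance, reducing the effective dimension from 50 to 3.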
Dr James Sweeney was interviewed by freelance journalist Dr Claire O'Connell. At the time of interview Dr Sweeney was in UCD School of Mathematical Sciences and is now in UCD School of Business.