The need for evidence-based, outcome-focused medical imaging research for cancer management

Introduction

Imaging plays an essential role in the diagnosis, staging and treatment monitoring of cancer, and the number of radiological procedures performed on cancer patients has grown considerably over recent years. However, the types of imaging procedures used vary widely, and clinical practice rarely makes use of new advances in imaging techniques early in their development, so patients often do not benefit from them. One simulation-based analysis suggested that increased investment in imaging could yield important improvements in cancer management in countries of all incomes, and the return on that investment is predicted to be significantly larger than that achievable by investment in treatment alone1. The rationale for this prediction is that more effective use of medical imaging could facilitate earlier diagnosis and intervention, identify optimal treatments and improve monitoring of therapy, with earlier modification of ineffective but costly treatment regimens. Other key roles for medical imaging include surgical and radiotherapy planning, especially for treating larger masses that are intrinsically inhomogeneous. Given the generally low response rates of targeted therapies in a series of basket trials2, and the considerable number of targeted drugs in use that lack documented overall survival (OS) benefits3, it has become extremely important to use imaging to select appropriate treatments for individual patients. The increased appreciation of genetic and epigenetic intra-tumoral and inter-tumoral heterogeneity implies that targeted treatment decisions based only on a single core biopsy are inherently flawed. Multi-site sampling is usually not clinically viable, and alternatives such as sampling of plasma biomarkers (e.g. circulating tumour cells) are promising but have yet to be validated and incorporated into routine clinical practice4. 
Imaging is the only approach that can adequately sample all primary and metastatic tumours non-invasively, and the further development of methods to detect molecular heterogeneity, such as 89Zr–Pertuzumab positron emission tomography (PET)5, to produce validated clinical tools that can guide treatment decisions is essential. However, despite the growth in the adoption of imaging technologies such as X-ray CT, MRI and PET in routine radiological practice, little use is currently made of the rich molecular, cellular, metabolic, microstructural or physiologic information potentially available from advanced multi-modal imaging techniques. In practice, there remain substantial obstacles to current imaging techniques achieving their full potential, which motivates this discussion of how to address these challenges.

Suboptimal clinical translation in medical imaging

In clinical practice, newer iterations of radiotherapy, such as stereotactic ablative radiotherapy (SABR)6 and proton beam therapy7, as well as novel targeted chemotherapeutics and immunotherapies, are still assessed using conventional imaging metrics such as changes in tumour size or contrast enhancement8, which do not reflect the intra-tumoural responses to treatment in a timely manner. Therefore, there is a clear gap between clinical need and radiological practice that limits cancer management. We currently make very little use of the information that is available from multi-modal imaging (i.e. image acquisition with different modalities, or with a single modality but multiple acquisition techniques) about the heterogeneous molecular and cellular characteristics of tumours. This is partly because of time and financial constraints, but also because multi-modal imaging data contain convoluted information that needs to be integrated and interpreted in individual patients in ways unfamiliar to radiologists or other physicians9. For instance, for a given tumour, a specific gene mutation could be detected by 1H MRS10, the altered energy metabolism characterised by 13C or 2H MRSI11,12 or PET, the cellular membrane pathology assessed by 31P MRSI13 and 23Na MRI14, the immune microenvironment assessed using 89Zr PET15, the inflammatory condition visualised via 18F PET16, the cell size estimated by diffusion MRI17 and, more routinely, regional perfusion evaluated by DCE-MRI18. While developments in artificial intelligence and machine learning promise to integrate and interpret these disparate measurements into a biological model, we would argue that sufficient validation and standardisation of each of these techniques must come as a pre-requisite for precise interpretation.

The past decade has witnessed rapid developments and innovations in medical imaging technologies. In magnetic resonance imaging (MRI), at the imaging system level, the number of higher-field (3.0 T) MRI scanners is rapidly catching up with the number of lower-field devices (1.5 T and below), while 5.0 T and 7.0 T systems are attracting increasing attention19. More recently, public funding in Europe has been made available to build a 14 T whole-body scanner20. Computed tomography (CT) scanners have evolved from the use of broad, non-specific X-ray spectra to more selective energy detection, which introduces a range of new tissue contrast mechanisms. Photon-counting CT has been introduced, producing images with better contrast and spatial resolution21. Simultaneously, scan times and patient radiation doses have been significantly reduced. The recent introduction of large field-of-view, total-body PET/CT scanners has improved image quality, reduced patient doses and shortened scan times, thereby increasing patient throughput and making dynamic PET acquisition clinically viable22. A decade earlier, the introduction of hybrid PET/MR systems had allowed the simultaneous acquisition of metabolic, molecular, anatomic and physiologic information23. Single photon emission computed tomography (SPECT) is gradually replacing much of the work traditionally performed with 2D gamma cameras, often now combined with CT for 3D anatomical co-registration. Concurrent improvements in technology, for example, semiconductor radiation detectors replacing sodium iodide scintillator crystals and improved reconstruction algorithms, have improved spatial resolution and photon sensitivity while reducing scan times and radiation dose24.

Historically, MRI, CT and PET scanners were introduced into clinical practice without strong prior evidence of their diagnostic efficacy or advantages, and their impacts were validated only subsequently by mass usage. The introduction of new scanners and techniques has largely followed the same pattern, lacking high-quality evidence from clinical outcomes research to validate their advantages and impact, and leaving several questions regarding their utility and cost-effectiveness unanswered. For example, which types of MR scanners are truly critical for a given diagnostic scenario? What are the critical clinical benefits of 7.0 T (or 5.0 T) vs. 3.0 T MRI? Do those benefits apply to all clinical scenarios? How do those benefits compare to the increased costs and altered accessibility? Does this comparison vary across countries with different incomes and healthcare systems? Although the clinical introduction of novel radiotracers usually follows strict clinical research standards, only a small subset of agents has been shown to be valuable.

At the sub-system level, novel imaging methods are being developed that seek to accelerate imaging speed, enhance the signal-to-noise ratio, improve the spatial, contrast and temporal resolutions, and extract novel quantitative parameters. However, clinical translation of many of these new methods on commercial systems has occurred relatively infrequently25. There are several reasons for this including: (i) some technical advances have focused on originality and novelty rather than practical translation; (ii) lack of a network and infrastructure to support multi-site clinical validation studies; (iii) onerous regulatory steps that inhibit innovation; (iv) challenging pathways to integrate novel imaging within trials of novel therapeutics; (v) the unglamorous nature of studies of repeatability, reproducibility and cost-effectiveness (and the reluctance of high impact factor journals to publish these evaluations due to lack of novelty and direct clinical relevance); and (vi) a shortage of specific funding sources to support appropriate outcomes research.

In an era of precision medicine, medical imaging research requires a thorough transformation to match current advances in cancer treatment and to maximise the impact of the technology that is now available. Here we consider several aspects of imaging research that require attention in the context of cancer management, aiming to raise awareness of high quality clinical research in the medical imaging community, including clinicians and technical innovators. We need to engage academic publishers, equipment manufacturers, policy makers and imaging scientists in joint efforts to enhance clinical research quality and to allow more efficient clinical translation of novel medical imaging technologies.

Actions to be taken

Study design

Evaluations of any new translational imaging method must set endpoints that answer well defined and clinically relevant questions. Firstly, an unmet clinical need must be identified. Even in the earliest stages of developing a new clinical imaging method, multi-disciplinary input, including opinions from non-radiologist clinicians and statisticians experienced in the design of clinical trials, is important to identify the clinical questions and optimise study design. For example, the impact of a novel imaging method on overall subject survival is currently rarely evaluated. This may be due to the complexity of isolating the effect of specific imaging procedures during the course of cancer treatment from variations in patient adherence and changes in treatment regimen. However, taking overall survival as a primary study endpoint may be essential when evaluating imaging procedures, especially those with considerably increased costs and potential adverse effects. For example, in the IFCT-0302 study26, CT follow-up was found to give an overall survival similar to that of X-ray follow-up in monitoring relapse after complete resection of non-small-cell lung cancer. However, a completely different conclusion could have been drawn had the relapse detection rate been selected as the primary endpoint, since CT was more sensitive at detecting relapse. Despite the recent debate on the concept of basket trials27, this type of design could still be widely adopted in imaging research to achieve faster evaluation of imaging techniques. One special consideration in imaging studies is that randomised controlled trials (RCTs) of the sort used to evaluate drugs are needed less frequently: questions of diagnostic accuracy, repeatability and reproducibility can be answered with cohort studies, and different or existing imaging techniques can be acquired within the same session and compared intra-individually, as exemplified in the MITNEC-A1 study28.
Therefore, RCTs can often be reserved for evaluating the impact of imaging on clinical practice in the final steps of clinical translation. While some high-impact medical journals emphasise the need for RCTs, this requirement could be relaxed for many clinical questions in medical imaging. In addition, in view of the enormous number of existing reports of studies that compared different imaging methods in relatively small cohorts, systematic reviews and meta-analyses would help condense the totality of clinical evidence into consolidated conclusions.
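The statistical advantage of intra-individual comparison can be illustrated with a toy simulation (all numbers below are hypothetical): when two techniques are measured in the same patients, the between-patient biological variance, which typically dominates, cancels out of the comparison, so far fewer participants are needed than in a two-cohort design.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                                   # simulated patients
biology = rng.normal(1.0, 0.30, n)           # between-patient variation (dominant)
noise_a = rng.normal(0.0, 0.05, n)           # measurement noise, technique A
noise_b = rng.normal(0.0, 0.05, n)           # measurement noise, technique B
effect = 0.10                                # true difference between techniques

technique_a = biology + noise_a
technique_b = biology + effect + noise_b

# paired (intra-individual) comparison cancels the shared biological variance
paired_sd = np.std(technique_b - technique_a)
# unpaired comparison (two separate cohorts) retains it
unpaired_sd = np.sqrt(np.var(technique_a) + np.var(technique_b))
```

With these illustrative numbers, the standard deviation of the paired difference is several times smaller than that of the unpaired contrast, which translates directly into smaller required sample sizes for the same statistical power.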

Standardisation of imaging protocols

Multi-centre trials are preferable for providing high-level clinical evidence, and for imaging studies, this requires a degree of standardisation of acquisition and analysis protocols as well as an understanding of variable technical factors across the participating centres. Ensuring reproducibility is a logistical challenge in trial design, particularly for quantitative imaging biomarkers. To an extent, this can be addressed with the use of imaging phantoms, but a subset of trial participants may still be required for cross-vendor, cross-site imaging studies. Moreover, although efforts are made to standardise imaging parameters across centres, inherent variations between different vendor instruments and between different iterations of imaging systems often go unnoticed. This type of variation is particularly pronounced in MRI, where the contrast could nominally be the same on different vendor scanners, but the underlying mechanisms responsible for this contrast could be subtly different29. Other cross-vendor variations include image reconstruction and post-processing algorithms that may alter the texture and spatial characteristics of the images, resulting in bias in visual evaluation and differences in model fitting in radiomics and AI studies30. Addressing this issue requires agreements between major imaging system vendors, but also the awareness of medical policymakers, so that a robust solution balancing clinical rigour and industry interests is achieved.

Quantification of imaging results

While tumour biology is assessed quantitatively in the laboratory, and tumour therapies are often developed using quantitative models, the radiological response is still generally reported qualitatively and subjectively. Quantifiable biomarkers are urgently needed in medical imaging research to provide improved confidence in decision-making; even current criteria, such as RECIST and PERCIST, are mostly semi-quantitative. Response biomarkers could include quantitative parameters, e.g. the volume transfer constant (Ktrans) in dynamic contrast agent-enhanced (DCE)-MRI31; semi-quantitative parameters, e.g. the concentration of 2-hydroxyglutarate in magnetic resonance spectra as an indicator of the presence of IDH mutation10; and model-based metrics such as radiomics parameters, which have started to be evaluated rigorously in some recent clinical studies for their potential as response reporters or predictors in cancer treatment32,33. The provision of quantitative results is becoming increasingly important as novel radiological biomarkers emerge. For example, the use of multiple isotopes in multinuclear MRI measurements34, or the analysis of the X-ray spectrum in CT, may give a more comprehensive profile of the molecular and cellular characteristics of a tumour. Meanwhile, it is equally important to establish baseline metrics for potential biomarkers on a population basis. This will require wider approaches, including multi-centre efforts (perhaps even international ones, to account for other population variables such as gender, age and race), and the resource demand should not be underestimated given the expected clinical impact, as illustrated by established examples such as the baseline ranges developed for serum PSA screening35.
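As an illustration of how such quantitative parameters are derived, the standard Tofts model relates the tissue contrast-agent concentration Ct(t) to Ktrans and the extravascular extracellular volume fraction ve via Ct(t) = Ktrans ∫ Cp(τ) exp(−(Ktrans/ve)(t − τ)) dτ. The sketch below, in which the arterial input function and parameter values are purely hypothetical, shows the forward model; in practice Ktrans is estimated by fitting this model to measured concentration curves.

```python
import numpy as np

def tofts_concentration(t, ktrans, ve, cp):
    """Tissue contrast-agent concentration under the standard Tofts model.

    t      : uniformly spaced time points (min)
    ktrans : volume transfer constant (1/min)
    ve     : extravascular extracellular volume fraction
    cp     : arterial plasma concentration Cp(t) at the same time points
    """
    dt = t[1] - t[0]
    kep = ktrans / ve                        # efflux rate constant (1/min)
    kernel = np.exp(-kep * t)                # impulse response sampled on t
    # discrete convolution approximating the Tofts integral
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

# toy example with a simple bi-exponential arterial input function (hypothetical)
t = np.arange(0, 5, 0.05)                            # 5 minutes, 3 s sampling
cp = 5.0 * (np.exp(-0.5 * t) - np.exp(-4.0 * t))     # hypothetical AIF (mM)
ct = tofts_concentration(t, ktrans=0.25, ve=0.4, cp=cp)
```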

Streamlining clinical translation

The ‘Imaging biomarkers roadmap for cancer studies’ was proposed by experts from Cancer Research UK (CRUK) and the European Organisation for Research and Treatment of Cancer (EORTC)25. The roadmap provided a 14-step guide for the clinical translation of novel imaging biomarkers, from initial proposal through single- and multi-centre validation to evaluation of cost-effectiveness. Essential to biomarker validation is an assessment of technical repeatability, reproducibility and biological variation in statistically appropriate sample sizes that are representative of each population of interest. Successful translation of a surrogate biomarker requires an understanding of the relationship between image contrast and the clinical endpoint, and a comparison of each novel imaging biomarker with established methods in order to assess its capability to influence patient management. For example, improved sensitivity for an abnormal condition is not necessarily in the patient’s best interest, with overdiagnosis a problem in several cancers, notably breast, prostate and renal cancers. Conversely, a novel imaging biomarker or technology may itself influence emerging clinical trends and should therefore be anticipated in future clinical research. For example, a patient with a solitary lung metastasis detected on CT would currently be eligible for a range of focal ablative therapies. The introduction of higher-performance imaging will almost certainly detect smaller metastases and more indeterminate lesions than before. Without regular updates of clinical practice guidelines, patients may be excluded from treatments that could still benefit overall survival. Thus, a close collaboration between clinical researchers, industry representatives, policymakers and pharmaceutical companies is required for the efficient clinical translation of novel imaging technologies.
Organisations that involve all these parties might be well suited to review, critique, and summarise current challenges, for example, the National Cancer Imaging Translational Accelerator (NCITA)36 in the UK.
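Technical repeatability, one of the roadmap's validation steps, is often summarised by the test-retest repeatability coefficient RC = 1.96·√2·wSD, where wSD is the within-subject standard deviation: RC is the smallest measured change that can be regarded as real with roughly 95% confidence. A minimal sketch (the test-retest Ktrans values below are hypothetical):

```python
import numpy as np

def repeatability_coefficient(scan1, scan2):
    """Test-retest repeatability coefficient (RC) from paired measurements.

    wSD is the within-subject standard deviation estimated from the paired
    differences; RC = 1.96 * sqrt(2) * wSD (~2.77 * wSD) is the smallest
    change distinguishable from measurement noise at the 95% level.
    """
    d = np.asarray(scan1, float) - np.asarray(scan2, float)
    wsd = np.sqrt(np.mean(d ** 2) / 2.0)     # within-subject SD
    return 1.96 * np.sqrt(2.0) * wsd

# hypothetical test-retest Ktrans values (1/min) from ten patients
scan1 = [0.21, 0.35, 0.18, 0.40, 0.27, 0.31, 0.22, 0.38, 0.25, 0.30]
scan2 = [0.23, 0.33, 0.20, 0.37, 0.28, 0.30, 0.24, 0.41, 0.23, 0.31]
rc = repeatability_coefficient(scan1, scan2)
```

In a response study, only treatment-induced changes larger than RC should be interpreted as biological change rather than scan-to-scan noise.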

Patient and public awareness

Modern evidence-based medicine emphasises establishing better trial designs, obtaining quantitative results, improving their interpretation, and subsequent clinical translation. However, patient values and preferences are also important to clinical decisions (shared decision-making), so patient understanding of the imaging techniques used in their care, and of how imaging results are interpreted, becomes essential. Proper patient engagement in the planning and execution of medical research can help to focus study design towards topics relevant to patient outcomes and improve consideration of the impact that research has on trial participants. The result is improved research design, delivery and translation of findings into clinical practice37. Although there is increasing recognition of patient involvement in pharmaceutical research38, it is currently less emphasised as a goal in clinical imaging research. While patient engagement previously appeared to be a formidable hurdle, partly due to time constraints and knowledge gaps, the situation may be improved by developments in technology and artificial intelligence that may help patients better engage with the medical questions that concern them39 and better understand ongoing studies40.

Conclusion and outlook

The successful clinical translation of novel medical imaging technologies is determined by whether they can address outstanding clinical demands in a cost-effective way41. There is a growing tendency to pursue technical innovations with unproven benefits, and even potential harms, disseminated through a rapidly expanding number of publications. These trends are increasingly recognised by the imaging research community, and there is a corresponding demand for high-quality evidence driven by better study design42. Although RCTs are widely accepted as the gold standard for achieving the highest level of clinical evidence, their quality control in medical imaging studies should be rigorously monitored43, and we suggest that intra-patient comparisons may serve as an alternative design in medical imaging studies. Given the rapid development of new imaging techniques and the different versions of any single technique offered by multiple vendors, it becomes necessary to standardise imaging protocols before any clinical study evaluating efficacy. Quantification of imaging results is another key factor in facilitating clinical translation of medical imaging technologies, and appropriate baselines, including means and variances of measurements, must be established for each imaging-based metric to differentiate ‘normal’ from ‘abnormal’. Patient engagement should be a crucial part of medical imaging research to effect successful clinical translation. In addition, a patient-centric healthcare system needs to address the affordability and accessibility of novel imaging technologies.
This, of course, requires the implementation of cost-effectiveness analysis in imaging studies, but it also requires a healthcare system that encourages the democratisation of novel imaging technologies through aspects such as registration approval criteria and reimbursement mechanisms (financial incentives), which is beyond the scope of the current article. In view of the points discussed above, the streamlining of clinical translation processes, from original technical innovations through to routine applications, is urgently needed to harmonise the joint efforts of many stakeholders and ultimately improve cancer patient management.
