Has MRD monitoring superseded other prognostic factors in adult ALL?

Abstract
Significant improvements have been made in the treatment of acute lymphoblastic leukemia (ALL) during the past 2 decades, and measurement of submicroscopic (minimal) levels of residual disease (MRD) is increasingly used to monitor treatment efficacy. To improve the comparability of MRD data, there are ongoing efforts to standardize MRD quantification using real-time quantitative PCR of clonal immunoglobulin and T-cell receptor gene rearrangements, real-time quantitative PCR-based detection of fusion gene transcripts or breakpoints, and multiparameter flow cytometric immunophenotyping. Several studies have demonstrated that MRD assessment in childhood and adult ALL correlates significantly with clinical outcome. MRD detection is particularly useful for evaluating treatment response, but also for early recognition of an impending relapse. MRD has therefore gained a prominent position in many ALL treatment studies as a tool for tailoring therapy, with growing evidence that MRD supersedes most conventional stratification criteria, at least in Ph-negative ALL. Most study protocols for adult ALL follow a 2-step approach, with a first, classic pretherapeutic risk stratification followed by a second, MRD-based stratification. Here we discuss whether and how MRD is ready to be used as the main decisive marker, and whether pretherapeutic factors and MRD are really competing or complementary tools for individualizing treatment.