The outcome of childhood acute lymphoblastic leukemia (ALL) in low- and middle-income countries lags behind in many aspects, including diagnosis, risk stratification, and access to treatment and supportive care.
The aim of this study was to report the outcome of childhood ALL at Ain Shams University Children’s Hospitals using risk-based protocols before the implementation of minimal residual disease technology, and to evaluate the use of double delayed intensification (DDI) in standard-risk patients.
Two hundred and twenty patients with ALL diagnosed between January 2005 and December 2014 were included in the study. Patients were treated according to modified CCG 1991 and CCG 1961 protocols for standard-risk and high-risk disease, respectively, and were stratified into three risk groups: standard risk (SR), high-risk standard arm (HR-SA), and high-risk augmented arm (HR-AA).
In the whole cohort, the 10-year event-free survival (EFS) and overall survival (OS) were 78.1% and 84.3%, respectively. Patients with Pre-B immunophenotype (IPT) had a significantly better outcome than those with T-cell IPT (EFS 82.0% versus 58.6%, p < 0.001; OS 86.9% versus 69.0%, p = 0.003, for Pre-B and T-cell, respectively). Within the SR group, patients treated with single delayed intensification (SDI) had EFS and OS rates comparable to those treated with DDI (EFS 82.4% versus 87.5%, p = 0.825; OS 88.2% versus 93.5%, p = 0.638, for the SDI and DDI groups, respectively).
The use of risk-based protocols with simple laboratory techniques resulted in acceptable survival outcomes in a resource-limited setting. Double delayed intensification showed no survival advantage in standard-risk patients.

Copyright © 2021 Elsevier Ltd. All rights reserved.