How does age impact AML treatment response?
AML is a disease of older individuals, with a median age at presentation of 67 to 70 years. Patients considered ‘young’ in AML terms are those under the age of 60 years, whereas patients considered ‘older’ are typically those aged 60 years and up. Data acquired in the 1970s and 1980s demonstrated that even older patients with AML can benefit from treatment. However, older patients tend to have poorer functioning and greater frailty, and these patients tend not to do nearly as well with intensive induction chemotherapy as younger patients. In fact, the early mortality rate from treatment-related causes in elderly patients can be 20% or more.1
What prognostic factors can help to predict long-term outcomes in AML patients?
The traditional prognostic factors in AML have included high white blood cell count, high tumor burden (as reflected by high LDH levels), and conventional cytogenetics. However, mutational profiling is playing an increasingly important role in this assessment, particularly among patients who have a normal karyotype. Newly identified biomarkers and measurable residual disease (MRD) at remission can be powerful indicators of long-term outcomes. It is now recognized that specific cytogenetic signatures and molecular abnormalities are suggestive of favorable, intermediate, or poor risk.
How can clinicians determine whether AML patients are fit or unfit for intensive chemotherapy?
There are a number of patient- and treatment-related factors that can be taken into account when considering fitness for intensive chemotherapy. Age, performance status (PS), and the ability to ambulate or perform activities of daily living can be important markers of the ability to tolerate intensive chemotherapy. Additionally, patients with a significant comorbidity index and/or those who spend a significant amount of time in a bed or chair are expected to have poorer outcomes after treatment. Specifically, patients with abnormal organ function due to underlying liver, kidney, pulmonary, or cardiac comorbidities, and those with high comorbidity indexes, are typically not considered good candidates for intensive therapy. Other factors, such as personal preference, socioeconomic issues, and financial considerations, should also be incorporated into treatment decisions.
The concept of age is being reconsidered. Traditionally, age has been viewed as a set cut-off for intensive versus non-intensive therapy, with patients between the ages of 18 and 59 generally being considered fit, and those age 60 and older generally being considered unfit. However, it is being increasingly recognized that older patients can be significantly diverse in their presentation and functional status. As a result, many clinicians now use a higher age cut-off of 75 years in their practice.
There are tools and scoring systems available to assist clinicians in determining fitness for intensive therapy. The Ferrara criteria,2 which were developed in 2013, include congestive heart failure, cardiomyopathy with an ejection fraction under 50%, abnormal pulmonary function tests in patients with underlying COPD, cirrhosis, dialysis, active infection, current mental illness, and performance status. More recently, modified Ferrara criteria were used in the pivotal phase 3 VIALE-A trial to identify patients who were considered unfit and therefore eligible for enrollment in that trial of venetoclax/azacitidine.3 In VIALE-A, the criteria included age 75 years or older, cardiac comorbidity such as congestive heart failure, pulmonary comorbidity reflected by a reduced DLCO, ECOG performance status, and physician discretion.
The Karnofsky Performance Status (KPS)4 ranks patients from 0% to 100%, with a cut-off of 50%. Patients whose KPS is under 50% are bedridden more than 50% of the time and are therefore not eligible for many intensive or even standard-of-care therapies. Within the fit population, patients in the 50% to 60% range require significant daily assistance with activities, while those who require no assistance or have only mild symptoms fall in the 90% to 100% range.
The ECOG Performance Status (also known as the Zubrod or WHO Performance Status)5 is more decisive. Patients receive a score from 0 to 5 based on a description of functional ability. Patients with a score of 3 to 5 are generally considered unfit, while patients with a score of 0 to 2 are generally considered fit enough for intensive chemotherapy (although a score of 3 may be included in the fitness criteria of some clinical trials).
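As a rough illustration only (not a clinical decision tool), the ECOG-based fitness cut-offs described above can be expressed as a simple mapping; the function name and the trial-specific flag are hypothetical:

```python
def ecog_fitness(score: int, trial_allows_ecog_3: bool = False) -> str:
    """Illustrative mapping of ECOG PS (0-5) to fitness for intensive
    chemotherapy, using only the general cut-offs described in the text.
    Hypothetical sketch; not a validated clinical tool."""
    if not 0 <= score <= 5:
        raise ValueError("ECOG performance status ranges from 0 to 5")
    if score <= 2:
        return "generally fit"
    # Some trials extend their fitness criteria to include ECOG 3
    if score == 3 and trial_allows_ecog_3:
        return "potentially eligible (trial-specific criteria)"
    return "generally unfit"
```

In practice, of course, the score itself is assigned by clinical assessment, and trial eligibility depends on the full protocol rather than PS alone.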
For hematologic malignancies, we often consider patients for stem cell transplantation. There is a comorbidity index that has been designed specifically to assess patients for this type of therapy. The Hematopoietic Cell Transplant-Comorbidity Index (HCT-CI)6 applies weighted scores based on the comorbidities present. For example, arrhythmia and cerebrovascular disease each have a score of 1, while moderate pulmonary and renal disease each have a score of 2, and diseases such as heart valve disease and moderate/severe hepatic disease have the highest score of 3.
Finally, the Charlson Comorbidity Index (CCI) is a scoring system that, like the HCT-CI, assigns a weighted score to a range of comorbidities to predict long-term outcomes.7 In the CCI, the presence of a metastatic solid tumor or HIV/AIDS represents the highest score of 6, while COPD/asthma, congestive heart failure, dementia, and hypertension all have scores of 1. In the CCI, leukemia is weighted with a score of 2.
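Both the HCT-CI and the CCI follow the same weighted-sum pattern, which can be sketched as below. The weights shown are only the HCT-CI examples quoted above; the full published indices contain many more conditions, and the variable names are hypothetical:

```python
# Illustrative weighted-sum scoring in the style of the HCT-CI and CCI.
# Only the example weights mentioned in the text are included.
HCT_CI_WEIGHTS = {
    "arrhythmia": 1,
    "cerebrovascular disease": 1,
    "moderate pulmonary disease": 2,
    "moderate renal disease": 2,
    "heart valve disease": 3,
    "moderate/severe hepatic disease": 3,
}

def comorbidity_score(comorbidities, weights):
    """Sum the weighted scores for the comorbidities present;
    conditions not in the weight table contribute 0."""
    return sum(weights.get(c, 0) for c in comorbidities)
```

For example, a patient with arrhythmia (1) and heart valve disease (3) would score 4 on this partial table; the CCI works the same way with its own weights (e.g., leukemia = 2, metastatic solid tumor = 6).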
Still, the impact of comorbidity and PS on outcomes is not always clear. Some patients present with a low functional state that is brought on by the disease itself, and treatment of the disease can improve their functional status. However, in general, the data indicate that functional status, as assessed by the ECOG score, is much more clinically relevant and more closely linked to treatment outcome than a mild or moderate comorbidity burden.8
Management of AML now includes assessment of cytogenetic and mutational abnormalities that predict poor response to standard cytarabine and anthracycline-based therapy. These prognostic classification systems are based on the intended treatment regimen; for example, the 2017 European LeukemiaNet (ELN) classification of AML is based on standard 7+3 AML induction therapy.9 This model predicts the outcomes of patients receiving this therapy by assigning a favorable, intermediate, or adverse risk status based on the presence of cytogenetic or molecular markers such as BCR-ABL1, mutated NPM1, and del(5q).
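A deliberately partial sketch of this kind of risk assignment, using only the three markers named above, might look like the following; the published ELN 2017 table contains many additional cytogenetic and molecular entries, and the assignments here are illustrative, not a substitute for the full classification:

```python
# Partial, illustrative ELN 2017-style risk assignment using only the
# markers named in the text. Under ELN 2017, BCR-ABL1 and del(5q) fall
# in the adverse group, and mutated NPM1 without FLT3-ITD is favorable.
ADVERSE_MARKERS = {"BCR-ABL1", "del(5q)"}

def eln_risk_sketch(markers: set) -> str:
    if markers & ADVERSE_MARKERS:
        return "adverse"
    if "NPM1" in markers and "FLT3-ITD" not in markers:
        return "favorable"
    # Everything not otherwise classified defaults here in this sketch
    return "intermediate"
```

Real-world classification also weighs karyotype complexity, allelic ratios, and co-occurring mutations, which this sketch omits.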
Composite models have also been developed in an attempt to incorporate biological, functional, and patient-related features into a single prediction model.10 The AML-Composite Model (ACM) developed by Sorror, et al., includes comorbidities, specific laboratory values, age, and cytogenetic and molecular risk.10 Patients may be offered more- or less-intensive induction therapy based on scores developed by this model.
What evidence do we have regarding the effectiveness of patient selection for low- or high-intensity therapy?
A real-world study by Matthews, et al., was recently presented at the ASH 2021 meeting, in which investigators assessed outcomes associated with an intensive chemotherapy regimen of liposomal 7+3 (ie, CPX-351) compared with a lower-intensity regimen consisting of venetoclax/azacitidine.11 This study included about 800 newly diagnosed AML patients treated in both community and academic centers. Two hundred and seventeen patients received CPX-351, while 439 received venetoclax/azacitidine.
Patients receiving venetoclax/azacitidine had a median age of 75 years, as opposed to those who received CPX-351, who had a median age of 67 years. Patients who had secondary disease or prior therapy-related disease were more likely to receive CPX-351, while patients with de novo disease who were likely to do well without a liposomal cytarabine anthracycline-based regimen were more likely to be offered venetoclax/azacitidine. In this study, there were no differences in patients being selected for venetoclax/azacitidine versus CPX-351 based on comorbidity index or ECOG PS; age and type of AML were the two factors that primarily influenced treatment selection.
This study found no difference in outcomes between the two patient populations, which were differentiated primarily by age and disease type. Both overall survival and transplant rates were comparable between groups. While those who received CPX-351 experienced a greater rate of febrile neutropenia and documented infection, and an almost three-fold rate of inpatient stay compared with those who received venetoclax/azacitidine, the venetoclax/azacitidine group was also older than the CPX-351 group. The similar overall survival identified in both groups suggests that current methods of patient selection for low- and high-intensity induction therapy are largely successful in optimizing patient outcomes based on individual patient and disease factors.
In summary, selection of patients for low- or high-intensity induction therapy is far from straightforward, and clinicians must consider a host of factors including comorbidities, PS, age, and cytogenetic and molecular abnormalities. However, evidence suggests that current strategies have been successful in identifying the optimal treatment path for each patient. As research continues, it is the hope that treatment selection will be further refined based on newly identified prognostic factors.