Graft rejection is the main obstacle to the long-term survival of any transplanted organ. To date, the primary strategy for curbing rejection has been to reduce the recipient's cellular and antibody-mediated capacity to attack donor cells in the transplanted organ. This strategy has encouraged researchers to develop immunosuppressive drugs designed to reduce the recipient's T-cell population (i.e., the instigators of cellular rejection) and to focus exclusively on therapeutic measures that subdue antibody-mediated rejection (AMR) of the transplanted organ. For over twenty years, modern immunosuppressive therapies have successfully protected the allograft (i.e., the transplanted organ or tissue) from acute cellular rejection, which has in turn bolstered the perception that the patient's immune system is harmful and that immune suppression is beneficial. However, although successful against acute graft rejection, this strategy has not reduced the incidence of chronic rejection and additionally increases the long-term likelihood of infection and malignancy.
Specifically, despite some of the benefits of the above immunosuppressive strategy, it has been reported that high levels of immunoglobulin M (IgM) antibodies, a component of innate immunity, are in fact associated with a lower incidence of chronic allograft failure and longer patient survival. This implies that certain components of the recipient's innate immune system may actually be beneficial and should not be suppressed. Despite this, research into the potential benefits of enhancing the recipient's immune response has not conventionally been pursued, for several reasons. Primarily, the mechanism responsible for this potentially beneficial association has historically been unknown. Additionally, the prevailing dogma holds that the recipient's immune system is harmful and should therefore be suppressed to protect the foreign allograft from the host's own defenses. As such, strategies that bolster the patient's immune system are in direct conflict with the dominant medical standards. Indeed, a recent consensus panel has even suggested reconsidering the term AMR, on the theory that identifying donor-specific antibodies is no longer necessary to establish the presence of allograft rejection. Without a proven antibody target, the decision to prevent or treat AMR lacks a mechanistic foundation and rests solely on clinical manifestations of graft failure observed in patients whose biopsy specimens are negative for cellular rejection. Accordingly, it is evident that the success of immunosuppression in combating the early effects of acute cellular rejection has largely lulled researchers into accepting a potentially false paradigm: that harmful immune-related mechanisms must also be involved in producing the late effects seen in chronic rejection. This paradigm has served as the primary motivation for a largely single-minded effort to discover the antibody-mediated source of chronic rejection.
Heart transplantation is the gold-standard treatment for refractory advanced heart failure; however, allograft rejection continues to limit graft and patient survival. Although advances in immunosuppression and in the treatment of cardiac allograft rejection have improved one-year survival rates, late outcomes remain dismal because of chronic rejection, consistent with the issues with immunosuppressive therapy described above. Chronic cardiac allograft rejection, otherwise known as cardiac allograft vasculopathy (CAV), is the major cause of long-term morbidity and mortality in cardiac transplant recipients. Although the etiology of CAV is unknown, several immunological and non-immunological causes have been proposed, including the involvement of innate immunity, inflammation, and coagulation. Innovative clinical and basic research is urgently needed to develop evidence-based therapies that prolong the survival of cardiac transplant recipients. Elucidating the mechanisms involved in late rejection is a critical step toward identifying strategies and developing therapies to protect transplanted organs and improve clinical outcomes in patients.
CAV is the principal long-term cause of cardiac graft failure. Although modern immunosuppressive regimens have extended early survival by substantially reducing acute rejection, they have not reduced the incidence of CAV. Confirmed CAV-related deaths become prominent within 1 to 3 years post-transplant and continue to contribute significantly to mortality in subsequent years: 8% at 1 year, 20% at 3 years, 30% at 5 years, and more than 50% at 10 years. (Deaths from infection and malignancy, possible results of over-immunosuppression, also become prominent as the years progress post-transplant.) Although CAV is a risk factor for long-term mortality, its diagnosis also carries a short-term mortality risk: 10% of patients die within 1 year of diagnosis. Remarkably, the long-term survival of patients alive one year after transplantation has not improved appreciably in the last 20 years.
Accordingly, there is a need to better understand the physiological and biological mechanisms involved in chronic rejection. A significant need also exists for clinically useful early risk predictors that facilitate early identification of the onset of chronic allograft failure (e.g., to identify those patients most susceptible to developing long-term CAV and CAV-associated allograft failure), as well as for effective strategies to prevent chronic rejection, including CAV.
The development of new options for the early detection of patients at risk of CAV could prolong the survival of cardiac transplant recipients. Early identification of CAV became possible with the introduction of intravascular ultrasound (IVUS), an invasive technique usually not initiated until at least one year post-transplantation. IVUS is expensive, poses increased risks, and cannot assess the entire coronary tree. Other invasive tests (e.g., endomyocardial biopsies) and less invasive tests such as stress perfusion imaging, dobutamine echocardiography, ultrafast tomography, and MRI are not sufficiently sensitive or specific to detect early stages of the disease. In view of the same, there is a need for a novel diagnostic tool that identifies at-risk patients by detecting early depletion of natural antibodies (NAbs) to phosphorylcholine (PC), one of the key epitopes found on oxidized LDL (oxLDL) but not on native LDL, and that creates an evidential base for pursuing early vaccine therapies to prevent or ameliorate disease progression. This is particularly relevant given that 10-20% of recipients have angiographic evidence of CAV in the first year post-transplantation. The incidence of CAV increases by 10% per year, and at 5 years 50% of patients have some evidence of CAV, which is one of the leading causes of death, with a 2-year survival rate of <15% in those with extensive disease.
In view of the foregoing, a determination of the protective role that innate immunity plays in CAV would be well received by the scientific and medical communities, as such a determination may lead to diagnostic tools that identify at-risk patients and to treatments for them.