Despite significant advances in hemodialysis (HD) technology, the mortality risk of chronic HD patients remains well above that of the general population. Average remaining life expectancy in the general population is about 4 times that of dialysis patients, and adjusted all-cause mortality rates are 6.7 to 8.5 times higher in dialysis patients than in the general population. Cardiovascular disease and infectious disease are among the leading causes of death, and the overall annual mortality rate among dialysis patients in the United States is about 20%. See United States Renal Data System, USRDS 2009 Annual Data Report, National Institutes of Health.
Current epidemiologic studies seeking to investigate the determinants of mortality risk in dialysis patients usually consider either cross-sectional baseline characteristics (e.g., mean systolic blood pressure in the first 3 months after start of dialysis, serum albumin levels after 6 months) or time-dependent analyses, most commonly time-dependent Cox regression models. Patients are frequently stratified into groups based on descriptive characteristics such as tertiles. In many of these studies, the first date of dialysis is taken as the reference point.
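The stratification approach described above can be sketched in code. The following is a minimal, illustrative Python example, not the method of any particular study: it splits patients into tertiles of a hypothetical baseline characteristic (serum albumin) and computes a Kaplan-Meier survival estimate per tertile, with follow-up time measured from the first date of dialysis as the reference point. All patient data, variable names, and cut points here are invented for illustration.

```python
def tertiles(values):
    """Return the two cut points splitting values into three equal-sized groups."""
    s = sorted(values)
    n = len(s)
    return s[n // 3], s[2 * n // 3]

def assign_tertile(v, cuts):
    """Map a single baseline value to tertile 0, 1, or 2."""
    lo, hi = cuts
    if v < lo:
        return 0
    if v < hi:
        return 1
    return 2

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate for one stratum.

    times  : follow-up time per patient (e.g., months since first dialysis)
    events : 1 if death observed, 0 if censored
    Returns a list of (time, survival probability) steps at each death time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # Group tied times: count deaths and all patients leaving the risk set.
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Hypothetical cohort: baseline albumin (g/dL), follow-up months, death indicator.
albumin = [3.1, 3.5, 3.8, 4.0, 4.2, 4.5]
cuts = tertiles(albumin)
groups = [assign_tertile(a, cuts) for a in albumin]
```

In practice, such analyses typically use a dedicated survival-analysis package (e.g., `lifelines` in Python or `survival` in R), which also provides the time-dependent Cox regression models mentioned above; this sketch only shows the descriptive stratification step.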
Despite these improvements in hemodialysis technology and patient tracking, chronic hemodialysis patients continue to experience inordinately high mortality. There is therefore a need for an improved method of identifying hemodialysis patients at increased risk of death, in order to trigger earlier diagnostic and therapeutic interventions and consequently reduce patient mortality.