The use of dialyzer cartridges with hemodialysis machines to remove blood-borne toxins and by-products of metabolism has been conventional for many years. Typically, such a cartridge essentially contains a pair of chambers separated by a semipermeable membrane. Blood is perfused through the first chamber and returned to the patient. The dialysate solution is simultaneously circulated in the opposite direction through the second chamber. A concentration gradient is thereby established which causes waste products carried in the blood to migrate through the semipermeable membrane and into the dialysate solution to form the dialysate effluent.
The principle of hemodialysis has been refined extensively. A number of semipermeable hollow fiber membranes are now utilized in dialyzer cartridges to greatly increase the total membrane surface area and thereby facilitate diffusion across the membrane structure. The hollow fiber membranes are made from a variety of materials including, for example, cellulose acetate, cellulose triacetate, polyacrylonitrile, polysulfone, and regenerated cellulose, the latter being most commonly used.
One of the most basic considerations in treating a patient with hemodialysis revolves around treatment adequacy. For instance, how long should a given patient be dialyzed on a given day? A number of medically adverse effects may result from an inadvertent failure to sufficiently dialyze the patient. At the present time, the average dialysis patient has a life expectancy of only about five years. One reason these patients tend to have a short life expectancy is the deleterious effect of a chronic buildup of various toxins that either are not eliminated at all, i.e., do not pass through the hollow fibers, or are not sufficiently reduced to nontoxic levels. The identity of many of these supposed toxins is not known, although those species known to be eliminated in urine, such as urea, creatinine, phosphate, hydrogen ions, etc., are associated with serious medical consequences when permitted to accumulate in excess of normal levels.
A number of factors can have a substantial effect on treatment adequacy. For example, it is common practice in the field of hemodialysis to reuse the dialyzer cartridges. There is technology available for cleaning, disinfecting, or sterilizing used dialyzer cartridges, for example, as illustrated in U.S. Pat. No. 4,695,385. Eventually, however, an individual cartridge must be discarded because it loses its dialyzing competency. At the present time, the competency of dialyzers is difficult to assess and therefore often is not rigorously monitored; a dialyzer cartridge is often not discarded until it visually appears unclean after recleaning, or until fiber bundle volumes or ultrafiltration rates fall below a predetermined threshold. It now is known that severe dialyzer dysfunction can occur even when appearance, fiber bundle volume and ultrafiltration rates are normal, as reported by Delmez et al., "Severe dialyzer dysfunction during reuse," Kidney International, 35:244 (1989). It is also known that dialyzer competency cannot be accurately predicted by the age of the dialyzer cartridge or the number of uses.
Notwithstanding the condition of the dialyzer, one measure of adequacy of dialysis for the individual patient during a given treatment is calculated from the following equation: KT/V ≥ 1.0
V is an expression of the volume of distribution of urea, which is approximately equal to total body fluid volume. V is derived for each individual patient from data such as height, weight and sex. K is the urea clearance of the particular dialyzer in use, in milliliters (ml) of blood cleared of urea each minute. T is the treatment time. K is obtained from the typical product insert enclosed with a case of dialyzers, which contains a graph of urea clearance versus blood flow rate obtained by random testing of a sample of dialyzers from a particular manufacturing lot. Upon incorporating these values into the above equation, the minimum treatment time can be calculated for a given KT/V value. Other parameters that may be varied to achieve adequate dialysis include blood flow rate, dialysis solution flow rate, dialyzer competency, and temperature.
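The minimum treatment time calculation described above can be sketched as follows. This is an illustrative sketch only; the function name, parameter values, and unit conventions are assumptions for the example, not taken from the specification.

```python
def minimum_treatment_time(K_ml_per_min, V_liters, target_ktv=1.0):
    """Minimum treatment time T (minutes) such that KT/V reaches the target.

    K_ml_per_min: dialyzer urea clearance K, in ml of blood cleared per minute
    V_liters:     urea distribution volume V (approximately total body water)
    target_ktv:   desired KT/V value (the specification cites a threshold of 1.0)
    """
    V_ml = V_liters * 1000.0  # express V in ml so units match K
    return target_ktv * V_ml / K_ml_per_min

# Hypothetical example: K = 200 ml/min, V = 40 L, target KT/V = 1.0
t_min = minimum_treatment_time(200.0, 40.0, 1.0)
# t_min = 1.0 * 40000 / 200 = 200 minutes
```

Note that K here is the nominal value from the product insert; as discussed below, the actual clearance of an individual dialyzer, particularly a reused one, may be lower.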
It has been determined empirically that KT/V values of about 0.8 or greater are associated with low levels of morbidity. See Gotch, F. A., Sargent, J. A., Kidney International, 28:526-537, 1985. Even with the use of new dialyzers there is some risk that a unit selected from a particular lot will have a significantly lower K value than the value indicated in the product insert. The patient receiving treatment from such a dialyzer is therefore at risk of being under-dialyzed. The likelihood of under-dialysis increases upon reuse of the dialyzer cartridge because of the definite but unquantified loss of dialyzer competence with each successive use. Underdialysis also may occur because of incompetency of the patient's blood access, which can prevent the desired blood flow rates from being achieved.
Parameters other than KT/V have also been devised to assess the adequacy of dialysis. Among these are the Urea Reduction Ratio (URR) and the Solute Removal Index (SRI). URR is defined as 1 - (C_B)post/(C_B)pre, where (C_B)pre and (C_B)post are the blood urea concentrations before and after treatment. A good dialysis treatment will have a URR near one (1) while a poor dialysis treatment will have a URR near zero (0). Unfortunately, URR does not take into account generation of urea during dialysis, ultrafiltration, or the two-pool nature of removal. Consequently, SRI has been proposed as a generalized version of URR which does account for these effects. SRI is defined as the amount of urea removed during a treatment as a fraction of the total body store. Like URR, a good dialysis treatment will have an SRI value near one (1) while a poor dialysis treatment will have an SRI near zero (0). Potentially, SRI (unlike KT/V) can indicate the adequacy of a dialysis treatment irrespective of modality (i.e., peritoneal or hemodialysis) and intermittence. Neither URR nor SRI, however, has been validated as extensively as KT/V as a measure of dialysis adequacy.
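The two indices just defined can be illustrated with a short sketch. The function names and the example concentrations are hypothetical, chosen only to show the arithmetic; SRI in practice requires measuring or modeling the actual amount of urea removed, which is not computable from the two blood concentrations alone.

```python
def urea_reduction_ratio(c_pre, c_post):
    """URR = 1 - C_post / C_pre, using pre- and post-treatment
    blood urea concentrations (same units for both)."""
    return 1.0 - c_post / c_pre

def solute_removal_index(urea_removed, total_body_store):
    """SRI: urea removed during a treatment as a fraction of the
    total body store (same mass units for both quantities)."""
    return urea_removed / total_body_store

# Hypothetical example: blood urea falls from 100 to 30 mg/dl
urr = urea_reduction_ratio(100.0, 30.0)   # 0.7 -- closer to 1 is better
# Hypothetical example: 21 g of urea removed from a 30 g body store
sri = solute_removal_index(21.0, 30.0)    # 0.7 -- closer to 1 is better
```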
Although the KT/V, URR and SRI indices are indicative of urea removal and appear to correlate to therapy failure, that is not tantamount to saying that urea is a toxic metabolite. There is early literature to suggest that urea is not toxic, per se. However, urea is a major metabolite of protein catabolism and serves as a convenient marker to monitor treatment adequacy.
Urea has a molecular weight of 60 daltons while some of the other protein catabolites may be much larger. It has, therefore, become a subject of controversy whether the relationship between KT/V and morbidity established with the tighter cellulosic membranes is applicable to the more open membranes used for hemofiltration and high flux hemodialysis or to the natural peritoneal membrane.
There is a considerable body of literature on the urea kinetic model. Computer programs, programmable calculators and time-shared computer services have been developed to make urea kinetics more accessible to the dialysis clinician. It has recently been shown (Lindsay et al., 1989) that KT/V values of less than 0.8 may be associated with a low dietary protein intake that is intractable to nutritional counseling. However, increasing the KT/V to 1.0 or higher, in conjunction with nutritional counseling, is effective in improving dietary protein intake. As low dietary protein intake may be associated with increased morbidity, monitoring of KT/V and the normalized protein catabolic rate (nPCR) is a useful adjunct to other clinical assessments of the dialysis patient.
Traditional urea kinetics entails numerous measurements and is considered mathematically complex by dialysis clinicians. The various measurements required for accurate kinetic measurements are summarized in Table 1.