Liquid chromatography is an important branch of analytical chemistry. Most liquid chromatography systems include a spectrophotometric detector, and a high proportion of these detectors are of the type in which the wavelength of detection can be selected at will within a given range, i.e., the variable wavelength type of detector. Problems with variable wavelength detectors in liquid chromatography can occur if the actual wavelength of detection differs from the detector's set wavelength. For example, linearity and selectivity of detection can deteriorate if the liquid chromatographic procedure is actually being run at one wavelength while the procedure calls for detection at another. Even when a variable wavelength detector is accurately calibrated when new, its wavelength accuracy can deteriorate with time. Within the spectral range of such detectors, the low UV zone (190-225 nm) is very often used to obtain a more general response and to enhance detector sensitivity for compounds containing weakly absorbing chromophores. Unfortunately, most organic compounds exhibit steeply sloping absorption bands in this wavelength range, so even small wavelength differences, e.g., 3 nm, can have a large effect on the chromatographic results. There is therefore an important need to determine the accuracy of wavelength selection of variable wavelength liquid chromatography detectors in the low UV range.
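The magnitude of the effect described above can be illustrated with a minimal numerical sketch. All band parameters here (a Gaussian band shape, its center, width and peak absorbance) are illustrative assumptions chosen to mimic a steep low-UV absorption edge, not values for any real compound; the sketch merely shows how a 3 nm wavelength error on a steep slope translates into a large absorbance error.

```python
import math

# Hypothetical Gaussian absorption band in the low-UV region.
# CENTER_NM, SIGMA_NM and PEAK_AU are illustrative assumptions only.
CENTER_NM = 195.0   # band maximum (nm)
SIGMA_NM = 8.0      # band width (nm)
PEAK_AU = 1.0       # peak absorbance (absorbance units)

def absorbance(wavelength_nm):
    """Absorbance of the model band at a given wavelength."""
    return PEAK_AU * math.exp(
        -((wavelength_nm - CENTER_NM) ** 2) / (2 * SIGMA_NM ** 2))

# Detection nominally at 210 nm, on the band's steep long-wavelength slope.
set_nm = 210.0
actual_nm = set_nm + 3.0   # a 3 nm wavelength-accuracy error

a_set = absorbance(set_nm)
a_actual = absorbance(actual_nm)
rel_error = (a_set - a_actual) / a_set

print(f"A at set wavelength:    {a_set:.4f}")
print(f"A at actual wavelength: {a_actual:.4f}")
print(f"relative error:         {rel_error:.1%}")
```

With these assumed band parameters the 3 nm offset changes the measured absorbance by roughly half, which is why wavelength accuracy is so critical on steeply sloping low-UV bands.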
The need to check the accuracy of wavelength selection is also important for general laboratory spectrophotometers, and specific test solutions have been developed for their calibration, such as the rare earth ion solution described in U.S. Pat. No. 4,461,718 to Kaye et al. Rare earth compounds are generally considered excellent candidates for the preparation of wavelength-accuracy standards for variable wavelength spectrophotometers because solutions of rare earth ions often absorb light in very sharp bands, as shown, for example, by Moeller et al., Analytical Chemistry, Vol. 22, No. 3, March 1950, pp. 433-441, and by Stewart et al., Analytical Chemistry, Vol. 30, No. 2, February 1958, pp. 164-172. The patent and literature references above are fully incorporated herein by reference.
The prior rare earth test solutions developed for general laboratory spectrophotometers did not provide acceptable performance with variable wavelength liquid chromatography detectors because the optical characteristics of these detectors are different. For example, the optical bandpass of most of these detectors is not adjustable and varies from brand to brand (generally ranging from about 2 nm to about 6 nm). The absorption bands of many rare earth ions in solution shift in absorbance maximum as the optical bandpass varies; e.g., the maximum of terbium III dissolved in water shifts from 219.4 to 218.6 nm when the bandpass changes from 2 to 6 nm. Another problem with these detectors is that it is more difficult to locate an absorption maximum by varying the wavelength selector, since variable wavelength liquid chromatography detectors are generally designed to operate at one selected wavelength and are not designed as scanning instruments. Therefore, the absorbance maximum of a test solution for variable wavelength liquid chromatography detectors must be intense or the maximum will not be found (and this is especially true for single beam variable wavelength detectors). For example, the absorbance intensities of the well known holmium III test solution at 241.1, 278.2 and 287.5 nm are too weak for use with most variable wavelength liquid chromatographic detectors.
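The bandpass-dependent shift of an apparent absorbance maximum can be sketched numerically. A symmetric, isolated band does not shift when convolved with a symmetric slit function; the shift arises when the band is asymmetric, e.g., because of an overlapping neighbor. The model spectrum below (a sharp Gaussian near 219.4 nm with an assumed overlapping shoulder at shorter wavelength) and the triangular slit function are illustrative assumptions, not measured terbium III data; the sketch only demonstrates the mechanism by which a wider bandpass moves the apparent maximum.

```python
import math

def true_absorbance(wl):
    """Model spectrum: sharp band plus an overlapping shoulder.

    Band positions, widths and amplitudes are illustrative assumptions.
    The shoulder makes the net band asymmetric, which is what allows the
    apparent maximum to shift with bandpass.
    """
    main = 1.0 * math.exp(-((wl - 219.4) ** 2) / (2 * 1.5 ** 2))
    shoulder = 0.6 * math.exp(-((wl - 214.0) ** 2) / (2 * 3.0 ** 2))
    return main + shoulder

def triangular_slit(delta, fwhm):
    """Triangular slit function of the given FWHM (base width 2*FWHM)."""
    return max(0.0, 1.0 - abs(delta) / fwhm)

def observed_absorbance(wl, fwhm, step=0.05):
    """Numerically convolve the true spectrum with the slit function."""
    n = int(round(2 * fwhm / step))
    total = norm = 0.0
    for i in range(n + 1):
        d = -fwhm + i * step
        s = triangular_slit(d, fwhm)
        total += s * true_absorbance(wl + d)
        norm += s
    return total / norm

def apparent_max(fwhm, lo=210.0, hi=225.0, step=0.05):
    """Find the wavelength of maximum observed absorbance by scanning."""
    best_wl, best_a = lo, -1.0
    n = int(round((hi - lo) / step))
    for i in range(n + 1):
        wl = lo + i * step
        a = observed_absorbance(wl, fwhm)
        if a > best_a:
            best_a, best_wl = a, wl
    return best_wl

max_2nm = apparent_max(2.0)   # narrow-bandpass detector
max_6nm = apparent_max(6.0)   # wide-bandpass detector
print(f"apparent max, 2 nm bandpass: {max_2nm:.2f} nm")
print(f"apparent max, 6 nm bandpass: {max_6nm:.2f} nm")
```

With these assumed parameters the wider bandpass pulls the apparent maximum several tenths of a nanometer toward the shoulder, qualitatively reproducing the sub-nanometer shift reported for terbium III between 2 and 6 nm bandpasses.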