The invention generally relates to providing modified natural zeolites for use in the treatment of drinking water supplies. More specifically, the invention provides methods and compositions for effectively stabilizing arsenic and other contaminants in natural zeolites, where the stabilized natural zeolite is used in the treatment of drinking water. Embodiments of the invention comply with NSF drinking water standard 61.
Cities and towns throughout the world depend on having clean potable water supplies. The dependence on clean water has increased as the population of the world has increased, especially as industrial use of rivers and lakes has become commonplace.
The explosion of world population, and corresponding increase in fresh water use, has, therefore, resulted in a need to maximize water usage. However, the ability to maximize fresh water use has been limited by: (1) increased pollution of the fresh water supplies due to higher industrial output throughout the world (a direct result of the increased population); and (2) increased knowledge and standards for what constitutes clean water acceptable for use in farming, industry, and consumption. As a result, there is a current need to increase the efficiency in the use of water, i.e., conserve existing clean water supplies, increase the current capabilities used to remove pollutants from water supplies, and increase the effectiveness of existing and new technologies to effectively treat water and meet new standards in water quality. These concerns are especially true when the water source is a drinking water source.
Natural zeolite can be used to remove trace elements from aqueous media and has been shown to be effective for this purpose. Compositionally, natural zeolites are similar to clay minerals, i.e., both are essentially alumino-silicates. Natural zeolites generally have a rigid framework of interconnected tunnels and cages. The porous zeolite crystals host water molecules, potassium ions, sodium ions, calcium ions, and other positively charged ions. In this context, many if not all natural zeolite materials host some level of arsenic ions. In particular, the porous crystals of natural zeolite tend to trap arsenic, but not so tightly that the arsenic does not leach out when in contact with an aqueous medium. Arsenic leaching occurs even if the natural zeolite is pre-rinsed numerous times.
Originally, the drinking water standard for arsenic, set in the 1940s, was 50 parts per billion (ppb). Over the last several decades, the Environmental Protection Agency (EPA) and academia have been studying the potential health effects of arsenic intake, and in particular have focused on the health effects of arsenic in and around the EPA-set level of 50 ppb. For example, at arsenic levels of around 100 ppb there appear to be potentially serious health effects on humans, such as increased risk of certain cancers and a weakened immune system. However, at arsenic levels closer to 50 ppb and lower, the studies show conflicting results as to arsenic's effects on health, suggesting that additional studies are needed to clarify what level of arsenic is appropriate for long-term consumption in drinking water.
In the 1990s, the EPA recommended that the arsenic limit in drinking water be lowered to 10 ppb. No action was taken on the EPA's proposal until days before the Clinton administration was scheduled to leave office, at which time President Clinton approved lowering arsenic levels from 50 ppb to 10 ppb. In addition, widespread support for further lowering the standard to 5 ppb arsenic has grown among a number of environmental groups. Recently, NSF International, a non-profit, non-governmental organization, has released guidelines directed at drinking water safety (standard 61). NSF has directed, among other things, that arsenic levels within drinking water should be at or below 1 ppb.
This decline in the acceptable level of arsenic in drinking water has posed serious issues for drinking water facilities that utilize natural zeolite during contaminant removal. As noted above, most natural zeolites leach arsenic into an aqueous medium, in most cases at levels of at least 1 ppb. As such, the mere use of a natural zeolite in treating a drinking water supply causes the drinking water supply to fail NSF 61, and in some cases contributes to the drinking water supply failing the EPA 10 ppb arsenic standard.
As such, given the stringent arsenic standards now in place (via the EPA and/or NSF), the use of natural zeolite in the treatment of drinking water supplies has become untenable. Therefore, there is a need in the art for methods and compositions that limit the release of arsenic from a natural zeolite during contact with drinking water while the natural zeolite remains effective at removing other contaminants from the drinking water.
In this light, radium, a radioactive metal that occurs naturally in rocks, soils, and ground water, has become of concern to the water supplies of many population centers throughout the world, and in particular, portions of the world where the metal is found in high concentrations, e.g., Midwestern portions of the United States, Canada, Zaire, France, and portions of Russia. Of particular importance to these areas of high radium concentration is the fact that radium readily dissolves in the acidic environment of ground water, and is often found as a major natural pollutant in these water supplies.
Radium, an element of group IIA of the periodic table with 14 radioactive isotopes, continuously releases energy into the environment until a stable, non-radioactive material is formed. Conversion of radium, for example radium-226, to a stable, non-radioactive element, for example lead-206, occurs by radioactive decay, for example, through the emission of alpha particles. During the process, other radioactive isotopes, for example radon-222, form from the original radium. Radium-226 has a half-life of 1,620 years, an indication that the isotope, once in the water supply, will, for all practical purposes, remain radioactive until removed. In addition, it is important to note that radioactivity is not dependent on the physical state or chemical combination of the material, requiring a radioactive material to be physically removed from the water supply in order to free the supply of the radioactivity.
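The persistence implied by that half-life follows from the exponential decay law, N(t)/N0 = (1/2)^(t/T½). A minimal sketch of the arithmetic (the 1,620-year half-life is the value cited above; the 100-year horizon is an arbitrary illustration):

```python
# Fraction of a radioactive isotope remaining after t years,
# given its half-life, via the exponential decay law.
def fraction_remaining(t_years: float, half_life_years: float) -> float:
    return 0.5 ** (t_years / half_life_years)

# Radium-226 half-life of 1,620 years (value cited in the text).
RA226_HALF_LIFE = 1620.0

# After a full century, roughly 96% of the original radium-226 remains,
# illustrating why it stays in a water supply until physically removed.
print(round(fraction_remaining(100, RA226_HALF_LIFE), 3))  # → 0.958
```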
The level of radioactivity in a water supply is determined by measuring the different characteristics of energy released within the water. Radioactivity is usually measured in units called “curies” (Ci) and its metric multiples and fractions, for example the megacurie, kilocurie, millicurie, microcurie, and picocurie. It is well established that a curie is 3.7×1010 disintegrations per second. With regard to drinking water, radioactivity is extremely low and is measured in picocuries (one picocurie equals one-trillionth of a curie) per liter (pCi/L) or per gram (pCi/g) of tested material.
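The unit relationships above can be expressed directly. A small sketch converting an activity in picocuries per liter into disintegrations per second per liter (using 1 Ci = 3.7×10^10 disintegrations per second; the 5 pCi/L input is an example value):

```python
# 1 curie (Ci) = 3.7e10 disintegrations per second (dps).
DPS_PER_CURIE = 3.7e10
PICO = 1e-12  # one picocurie is one-trillionth of a curie

def pci_per_liter_to_dps(pci_per_l: float) -> float:
    """Convert an activity in pCi/L to disintegrations per second per liter."""
    return pci_per_l * PICO * DPS_PER_CURIE

# 5 pCi/L corresponds to ~0.185 disintegrations per second per liter.
print(round(pci_per_liter_to_dps(5.0), 3))  # → 0.185
```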
There are several known steps used in determining the level of radium in a water supply. Typically, the first step is to perform a “short-term gross alpha test” (gross meaning total) on a sample of the water supply. Most naturally occurring radioactive elements emit alpha particles as they decay, and radium is no exception. Detection of alpha particles in the water signals the presence of specific radioactive substances, indicates that further testing may be required, and suggests that radium is likely present in the water supply (although other alpha-emitting radioactive materials may be present in the water, radium represents a major element of concern due to its widespread distribution, especially in the regions of the world discussed above).
The United States Environmental Protection Agency (EPA) has established Maximum Contaminant Levels (MCL) for combined radium-226 and radium-228, and for other gross alpha emissions in drinking water. These MCLs are based on current standards of safety with regard to alpha and beta radiation, based on the relative risk of the emissions to the safety of the population consuming the water. As such, the MCL represents the maximum permissible level of, in this case, alpha emissions that ensures the safety of the water over a lifetime of consumption, taking into consideration feasible treatment technologies for removing radium and other alpha emitters from the water and the monitoring capabilities for these same materials. The MCL for combined radium-226 and radium-228 is 5 pCi/L of water. In addition, the MCL for gross alpha in drinking water is 15 pCi/L (note that specific MCLs for radium-224 or other specific alpha emitters have not been established).
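A water sample's standing against the two limits above can be checked mechanically. A minimal sketch (the function and parameter names are illustrative, not taken from any regulatory tooling):

```python
# EPA MCLs cited in the text: 5 pCi/L combined radium-226/228,
# and 15 pCi/L gross alpha.
COMBINED_RADIUM_MCL_PCI_L = 5.0
GROSS_ALPHA_MCL_PCI_L = 15.0

def violates_mcl(radium_226_pci_l: float, radium_228_pci_l: float,
                 gross_alpha_pci_l: float) -> bool:
    """Return True if either MCL is exceeded (illustrative check only)."""
    combined_radium = radium_226_pci_l + radium_228_pci_l
    return (combined_radium > COMBINED_RADIUM_MCL_PCI_L
            or gross_alpha_pci_l > GROSS_ALPHA_MCL_PCI_L)

print(violates_mcl(3.0, 1.5, 10.0))  # combined radium 4.5 pCi/L → False
print(violates_mcl(3.0, 2.5, 10.0))  # combined radium 5.5 pCi/L → True
```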
Presently, there are a number of water sources that violate the EPA's MCL for radium. This remains the case even though these water sources are processed through state-of-the-art water treatment facilities. For example, as of May of 2001, approximately 200 water treatment facilities in a 20-state area were in violation of the mandated MCL for radium. In particular, Illinois had almost 100 facilities in violation of the EPA's standards. It is believed that the number of radium standard violations is likely to reach 500-1000 facilities once a more comprehensive determination of radium levels is performed throughout the United States.
Presently, drinking water treatment facilities in the United States are searching for ways to lower radium levels to comply with the MCL in a cost-effective manner (this applies worldwide as well, where many countries are attempting to lower radium levels in their drinking water supplies). State-of-the-art solutions include point-of-use technologies, such as reverse osmosis or carbon adsorption filters. Larger scale solutions include relatively expensive ion-exchange resins that require the spent resins to be recovered and the radium to be isolated from the resin and disposed of in highly concentrated form, i.e., as high-level waste. As is well known in the art, high-level waste must be disposed of at highly regulated licensed sites, at exorbitant cost.
As such, there is a need for a relatively inexpensive system for removing radium from water that allows for the disposal of the collected radium at low-level radioactive waste sites. Note that low-level waste sites are typically characterized as receiving waste having less than 10,000 pCi/g in the material. The inability to remove radium in a condition suitable for low-level radioactivity disposal has traditionally been a major drawback of existing radium removal technology. Against this backdrop the present invention was developed.
Ammonia contamination of water resources has proven to be extremely problematic. High levels of ammonia commonly occur in wastewater, and occasionally drinking water, as a result of well contamination by industrial and agricultural processes. Presently, there is a trend to lower the ammonia discharge limits for facilities toward a range of 2 to 4 parts per million (ppm) from a previous range of 10 to 15 ppm.
Conventional ammonia removal technology has focused on additional aeration at wastewater treatment plant lagoons. In general, this remedy has proven ineffective. In contrast, a number of new technologies, focused on other wastewater related problems, have had the side-effect of lowering ammonia discharges. For example, activated sludge wastewater plants are being constructed to eliminate a full range of biological contaminants and have the added benefit of decreased ammonia discharges to 2 ppm or less. These plants, however, are expensive and not required in areas where the only problem is high ammonia levels. Further, technologies such as Sequence Batch Reactors (SBRs), Rotating Biological Filters (RBFs), and Trickle Filters are also used to solve non-ammonia related wastewater cleanup problems, with ammonia reduction as an added benefit. However, these newer technological options require entirely new facilities or expensive rebuilds at existing facilities.