Rapid, voluminous hemorrhage instigates a cascade of events that is almost impossible to reverse without immediate and effective intervention. According to the Centers for Disease Control and Prevention, motor vehicle trauma is the leading cause of death for Americans under age 64, with more than 40,000 victims per year (National Vital Statistics System. National Center for Health Statistics. CDC (2003)). The number of people who die from motor vehicle-related injuries has not changed appreciably over the past 10 years. Historically, more than half of those severely injured with concomitant hemorrhage die. While it has been shown that immediate intervention is the best method to limit patient mortality (Regel G et al., Acta Anaesthesiol Scand Suppl 1997; 110:71-6), the methods for controlling bleeding in the prehospital setting have not substantively changed for centuries (Zimmerman L M and Veith I. Great Ideas in the History of Surgery. Norman Publishing, San Francisco, Calif. (1993)). Tourniquets and recent innovations in hemostatic pressure dressings have been effective for treating surface and extremity wounds, but there are few options for hemorrhage of the head, neck, chest, and abdomen.
On the battlefield, seventy percent of ballistic injuries result in death within the first hour, due primarily to the massive blood loss associated with penetrating trauma. In Vietnam, five thousand deaths resulted from bleeding from the extremities; it was estimated that twenty percent of these casualties could have been avoided with better first aid (Neel, S., "Medical Support of the U.S. Army in Vietnam 1965-1970," Department of the Army, Washington D.C. (1991)). Surprisingly, the percentage of wounded who survive the first hour has not changed since the U.S. Civil War. In Iraq and Afghanistan, as in all previous wars, extremity wounds are the predominant injury, and deaths due to hemorrhage represent the majority of those killed in action in Iraq (Peake J B. N Engl J Med 2005; 352(3):219-22).
Many products exist that effectively treat hemorrhage from the extremities (e.g. tourniquets and hemostatic pressure dressings), as well as some that work systemically. Numerous large animal preclinical and human clinical trials have been conducted and published, and many comparative investigations of hemostatic agents have been undertaken (Pusateri A E et al., J Trauma 2003; 55(3):518-26; Alam H B et al., J Trauma 2003; 54(6):1077-82; King K et al., Mil Med 2004; 169(9):716-20). Coagulation adjuvants include mineral-based granules (e.g. QuikClot® hemostatic agent) (Turner S A et al., J Biomed Mater Res. 2002; 63(1):37-47; Pusateri A E et al., J Trauma 2004; 57(3):555-62; Robinson K, J Emerg Nurs 2004; 30(2):160-1; Alam H B et al., J Trauma 2004; 56(5):974-83), numerous dressings (Rothwell S W et al., Thromb Res 2003; 108(5-6):335-40; Alencar de Queiroz A A et al., J Biomed Mat Res A 2003; 64(1):147-54; Vournakis J N et al., J Surg Res 2003; 113(1):1-5; Connolly R J, J Trauma 2004; 57(1 Suppl):S26-8; King D R et al., J Trauma 2004; 57(4):756-9), clotting factors (Martinowitz U et al., J Trauma 2001; 50(4):721-9; Schreiber M A et al., J Trauma 2002; 53(2):252-9), and surgical approaches (Jaskille A et al., J Trauma 2005; 59(6):1305-8; Takasu A et al., J Trauma 2004; 56(5):984-90). Results have been mixed and are dependent on the model used. The zeolite-based QuikClot® hemostatic agent has performed well in some large animal trials, as has the chitosan-based HemCon® hemostatic bandage. In one extensive comparative trial, fibrin dressings were clearly superior (Pusateri A E et al., J Trauma 2003; 55(3):518-26). However, each of these hemostats has its challenges and limitations. QuikClot® hemostatic agent is known to produce localized heating, which has been shown to damage tissue.
Neither the HemCon® hemostatic bandage nor the QuikClot® hemostatic agent is intended for internal injuries, and both must be removed from the wound site; fibrin bandages are expensive and not yet approved by the Food and Drug Administration.
None of these materials, with the exception of a fibrin foam product still in development, can be applied without clear access to the site of hemorrhage, yet treating bleeding without such access is a major goal of first responders. There therefore remains a great need for optimal materials that can control serious hemorrhage and promote wound healing.
The use of materials derived from keratin in medicine is not new. The earliest documented use of keratin in medicine comes from a Chinese herbalist named Li Shi-Zhen (Ben Cao Gang Mu. Materia Medica, a dictionary of Chinese herbs, written by Li Shi Zhen (1518-1593)). Over a 38-year period, he wrote a collection of 800 books known as the Ben Cao Gang Mu. These books were published in 1596, three years after his death. Among the more than 11,000 prescriptions described in these volumes is a substance known as Xue Yu Tan, also known as Crinis Carbonisatus, which consists of ground ash from pyrolyzed human hair. The stated indications for Xue Yu Tan were accelerated wound healing and blood clotting.
In the early 1800s, when proteins were still being called albuminoids (albumin was a well-known protein at that time), many different kinds of proteins were being discovered. Around 1849, the word "keratin" appeared in the literature to describe the material that made up hard tissues such as animal horns and hooves (keratin comes from the Greek "keras," meaning horn). This new protein intrigued scientists because it did not behave like other proteins; for example, the normal methods used for dissolving proteins were ineffective with keratin. Although methods such as burning and grinding had been known for some time, many scientists and inventors were more interested in dissolving hair and horns in order to make better products.
During the years from 1905 to 1935, many methods were developed to extract keratins using oxidative and reductive chemistries (Breinl F and Baudisch O, Z physiol Chem 1907; 52:158-69; Neuberg C, U.S. Pat. No. 926,999, Jul. 6, 1909; Lissizin T, Biochem Bull 1915; 4:18-23; Zdenko S, Z physiol Chem 1924; 136:160-72; Lissizin T, Z physiol Chem 1928; 173:309-11). By the late 1920s, many techniques had been developed for breaking down the structures of hair, horns, and hooves, but scientists were puzzled by the behavior of some of these purified proteins. They soon concluded that many different forms of keratin were present in these extracts, and that the hair fiber must be a complex structure, not simply a strand of protein. In 1934, a key research paper was published that described different types of keratins, distinguished primarily by their different molecular weights (Goddard D R and Michaelis L, J Biol Chem 1934; 106:605-14). This seminal paper demonstrated that there were many different keratin homologs, and that each played a different role in the structure and function of the hair follicle.
Earlier work at the University of Leeds and the Wool Industries Research Association in the United Kingdom had shown that wool and other fibers were made up of an outer cuticle and a central cortex. Building on this information, scientists at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Australia conducted many of the most fundamental studies on the structure and composition of wool. Using X-ray diffraction and electron microscopy, combined with oxidative and reductive chemical methods, CSIRO produced the first complete diagram of a hair fiber (Rivett D E et al., "Keratin and Wool Research," The Lennox Legacy, CSIRO Publishing; Collingwood, VIC, Australia; 1996).
In 1965, CSIRO scientist W. Gordon Crewther and his colleagues published the definitive text on the chemistry of keratins (Crewther W G et al., The Chemistry of Keratins. Anfinsen C B Jr et al., editors. Advances in Protein Chemistry 1965. Academic Press. New York:191-346). This chapter in Advances in Protein Chemistry contained references to more than 640 published studies on keratins. Once scientists knew how to extract keratins from hair fibers and purify and characterize them, the number of derivative materials that could be produced from keratins grew exponentially. In the decade beginning in 1970, methods to form extracted keratins into powders, films, gels, coatings, fibers, and foams were developed and published by several research groups throughout the world (Anker C A, U.S. Pat. No. 3,642,498, Feb. 15, 1972; Kawano Y and Okamoto S, Kagaku To Seibutsu 1975; 13(5):291-223; Okamoto S, Nippon Shokuhin Kogyo Gakkaishi 1977; 24(1):40-50). All of these methods made use of the oxidative and reductive chemistries developed decades earlier.
In 1982, Japanese scientists published the first study describing the use of a keratin coating on vascular grafts as a way to eliminate blood clotting (Noishiki Y et al., Kobunshi Ronbunshu 1982; 39(4):221-7), as well as experiments on the biocompatibility of keratins (Ito H et al., Kobunshi Ronbunshu 1982; 39(4):249-56). Soon thereafter, in 1985, two researchers from the UK published a review article speculating on the prospect of using keratin as the building block for new biomaterials development (Jarman T and Light J, World Biotech Rep 1985; 1:505-12). In 1992, the development and testing of a host of keratin-based biomaterials was the subject of a doctoral thesis by French graduate student Isabelle Valherie (Valherie I and Gagnieu C. Chemical modifications of keratins: Preparation of biomaterials and study of their physical, physiochemical and biological properties. Doctoral thesis. Inst Natl Sci Appl Lyon, France 1992). Japanese scientists followed in 1993 with a commentary on the prominent position keratins could take at the forefront of biomaterials development (Various Authors, Kogyo Zairyo 1993; 41(15) Special issue 2:106-9).
Taken together, the aforementioned body of published work illustrates the unique chemical, physical, and biological properties of keratins. However, there remains a great need for optimized keratin preparations for use in biomedical applications, particularly for the treatment of bleeding and the treatment of wounds.