Technical Field
The present invention relates to a health care device that includes physiological data acquisition, and to methods of analysis and use for the device.
Related Background Art
As sensors for physiological data and the systems for acquiring and handling that data have improved, the amount of physiological data available to caregivers has expanded. It is now common practice to acquire data continuously from electronic sensors attached to patients. Nonlimiting examples of such sensors include temperature probes, probes sensitive to movement to detect breathing, sensors that detect electrical signals from the patient such as electroencephalograms (EEG) and electrocardiograms (ECG), and sensors for blood chemistry such as blood oxygen detectors and blood glucose monitors. The data is typically acquired versus time. The signal from the sensors is often a voltage or current measurement that is passed through an analog-to-digital converter to provide a numeric intensity measurement versus time. The analyses look for variations or patterns in the acquired data that are indicative of a disease or abnormal state. In many cases, such as in the case of electroencephalogram and electrocardiogram data, the data represents repeating waveform patterns. The analysis uses filtering and transform techniques to extract waveform morphology, fundamental frequencies and patterns in the acquired data. The data may be acquired over periods of time from seconds to months. The sensors and data acquisition may be used for patients who are not moving, such as those confined to a bed or those in an intensive care unit of a hospital, or the sensors may be attached to ambulatory patients, with data collected continuously as the patient moves about in his or her normal life routines.
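As a concrete illustration of the digitized signal described above, the following sketch simulates a sampled sensor voltage and estimates its fundamental frequency from rising threshold crossings. This is a minimal, hypothetical sketch and not part of the claimed invention; the function names, the 250 Hz sampling rate, and the noiseless sinusoid standing in for a periodic physiological waveform are all illustrative assumptions.

```python
import math

def sample_signal(duration_s, fs, f_hz):
    """Simulate an ADC stream: a noiseless periodic voltage sampled at fs Hz.
    (Stands in for the digitized sensor signal described in the text.)"""
    n = int(duration_s * fs)
    return [math.sin(2 * math.pi * f_hz * t / fs) for t in range(n)]

def fundamental_frequency(samples, fs, threshold=0.0):
    """Estimate the fundamental frequency from rising threshold crossings,
    a minimal stand-in for the filtering/transform step in the text."""
    crossings = [i for i in range(1, len(samples))
                 if samples[i - 1] < threshold <= samples[i]]
    if len(crossings) < 2:
        return None
    # Average spacing between rising crossings gives the period in samples.
    period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
    return fs / period

# 1.2 Hz corresponds to about 72 cycles per minute, a plausible heart rate.
sig = sample_signal(duration_s=10, fs=250, f_hz=1.2)
f = fundamental_frequency(sig, fs=250)
```

In practice the digitized signal would be noisy and the waveform far from sinusoidal, so real analyses use filtering and more robust morphology extraction; the crossing count here only illustrates the intensity-versus-time representation.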
A common feature of the data analysis for such physiological information is to look for anomalies that may indicate either a disease state or a critical state where a caregiver intervention is required to aid the patient. The latter are common in intensive care unit situations. The large amount of data being acquired from a large number of patients has required the development of automated routines to evaluate the collected data. Frequently the analysis is used to provide an automated response, such as in the case of insulin dosing systems responsive to automated blood glucose measurements or in the case of pacemakers, where an external electrical stimulus is provided upon detection of an irregularity in the patient's heartbeat. The physiological data analysis is also frequently used to trigger alarms indicating immediate action is required, such as in intensive care unit monitoring of an at-risk patient. A common failure of all of these analyses is that false alarms are common. It has been reported that in electrocardiogram data collected in an intensive care unit as much as 86% of the alarms were false alarms.
The data analysis typically involves looking for patterns in the data that are indicative of a disease or abnormal state. Automated algorithms are applied to measure, for example in the case of an electrocardiogram, the heart rate, variations in the heart rate and the shapes of the repeating waveforms. Algorithms are typically tested against a standard database of acquired data that includes cases where diagnoses of the state of the patients have been independently confirmed. Heretofore algorithms have been tested one at a time and optimized for accuracy and sensitivity to a particular condition. The goal has been to find a single algorithm that will provide the sensitivity and accuracy for all patients. No such algorithm has been found, and indeed variations in patients and conditions make such a Holy Grail algorithm unlikely. Caution has dictated setting the sensitivity of the algorithm to a very high value, so as to not miss disease or emergency states. This procedure has resulted in algorithms that, when applied to the general population of patients, produce errors, especially in the form of excessive false positive results for disease or emergency responses. Algorithms optimized for a database have been found, on average, to produce excessive errors when applied to individual patients, errors that must be reconciled by a trained technician.
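The testing procedure described above amounts to scoring an algorithm's detections against independently confirmed diagnoses. The sketch below, with hypothetical names and data, computes the two figures of merit at issue: sensitivity (fraction of true events detected) and specificity (fraction of non-events correctly passed over), along with the false positive count that drives technician workload.

```python
def evaluate(predictions, labels):
    """Score boolean detection outputs against confirmed diagnoses.
    Both arguments are sequences of booleans: True means event present.
    Assumes at least one true event and one non-event in the labels."""
    tp = sum(p and l for p, l in zip(predictions, labels))          # true positives
    tn = sum(not p and not l for p, l in zip(predictions, labels))  # true negatives
    fp = sum(p and not l for p, l in zip(predictions, labels))      # false positives
    fn = sum(not p and l for p, l in zip(predictions, labels))      # false negatives
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "false_positives": fp}

# Hypothetical example: six records, two confirmed events, three detections.
report = evaluate(predictions=[True, True, True, False, False, False],
                  labels=[True, True, False, False, False, False])
```

Tuning for very high sensitivity, as the text describes, drives the first number toward 1.0 while the false positive count grows, which is the tradeoff that later must be reconciled by a trained technician.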
The current state of the art for detecting cardiac events in ambulatory patients involves either running a single algorithm on a patient-attached device or running a single algorithm on servers that receive a full disclosure data stream from an ambulatory patient-attached device. In some cases, a technician reviews every beat of one or two days of full disclosure ECG using a semi-automated algorithm that assists the technician in this review. Electrocardiogram data acquired over a period of days is typically referred to as a “Holter scan” and provides detailed information on the actual number of beats of each morphology, the number of abnormal beats, and the exact length and type of arrhythmic episodes. Ambulatory algorithms are typically tuned on a small data set to be as sensitive as practical on the entire patient population, and these performance numbers are published using a specific standard so that physicians can compare the performance of different algorithms on a standard small data set constructed to reflect what the algorithm would encounter in the real world. This usually results in a large number of false positive events that a technician must deal with in order to reach acceptable levels of sensitivity. These false positive events require technician time to review and increase the cost of providing ambulatory monitoring services. In addition, device-side algorithms and server-side algorithms typically do not provide quantifiable beat counts as a Holter scan would. They also do not typically provide interpretive statements, which the technician applies after reviewing and possibly correcting the event presented by the algorithm.
Patients may also present with distinctly different cardiac signals depending on their disease state, the normal amplitude of the electrical activity of their heart, and the orientation of their heart in their chest cavity; these and other idiosyncrasies present challenges to detecting events with high specificity. Currently, a single algorithm must take into account all of the possible signals it may encounter from any patient in order for the algorithm to provide adequate sensitivity and diagnostic yield. This generally results in large numbers of false positives, and typical efforts to reduce the number of false positives (increase specificity) usually result in some loss of sensitivity, that is, the algorithm could miss real events.
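The sensitivity-versus-specificity tradeoff described above can be made concrete by sweeping a detection threshold over per-event scores. The sketch below is purely illustrative; the scores, labels, and thresholds are invented for the example and do not represent any particular detection algorithm.

```python
def sens_spec(preds, labels):
    """Sensitivity and specificity of boolean detections against true labels."""
    tp = sum(p and l for p, l in zip(preds, labels))
    fn = sum(l and not p for p, l in zip(preds, labels))
    fp = sum(p and not l for p, l in zip(preds, labels))
    tn = sum(not p and not l for p, l in zip(preds, labels))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical detector confidence scores for eight candidate events;
# labels mark the three real events. A score above threshold is a detection.
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
labels = [True, True, False, True, False, False, False, False]

tradeoff = {thr: sens_spec([s >= thr for s in scores], labels)
            for thr in (0.25, 0.55, 0.75)}
# A low threshold keeps sensitivity at 1.0 but specificity falls;
# a high threshold removes false positives at the cost of a missed real event.
```

This is the dynamic the text describes: a single fixed operating point cannot simultaneously maximize both figures across all patients, which motivates methods that reduce false positives without sacrificing sensitivity.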
Improved methods that maintain sensitivity while reducing false positive results are needed. The discussions here will demonstrate the techniques applied specifically to electrocardiogram data, but those skilled in the art will readily see the applicability to other similar time-varying physiological data.