Analysis of raw test data typically requires filtering and smoothing before an effective analysis can be performed. In many cases, researchers look only for visual evidence of smoothness, for instance, shapes that are generally pleasing to the eye. However, data requirements can be more stringent. For instance, to be useful to a program, a curve must be numerically smooth, a property that may be difficult to attain without solving complicated linear equations.
In the experimental data context, physical measurements introduce error that can be both random and specific to a domain. Existing methods for handling this error have proven problematic: in many cases, averaging in time shifts the data, and the resulting model has poorly defined derivatives.
Due to the difficulty of obtaining good, smooth data, researchers have resorted to hand calculations to preserve data integrity. These hand calculations rely on visual cues, but they cannot produce the quality of data or derivatives needed to support modeling, such as simulation or dynamic data analysis.
One common approach to both filtering and smoothing of raw data is to slide a window across multiple data points and apply one of a variety of schemes, linear or non-linear, to smooth the data within the window. A typical linear method parametrically sets the size of the window, thereby controlling the number of coefficients used. The main complaint about this method is that it causes a shift in the sampling phase. Another approach applies least-squares analysis to the fit error in order to determine a curve that represents the data. This method is not sensitive to domain constraints, such as the minimal temperature delta that is physically meaningful for a particular metal alloy. Many other approaches require knowledge of and insight into higher-order mathematics, such as analysis and filtering in frequency space; such methods presuppose that the researcher can distinguish noise from signal within the data stream.
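The two conventional approaches described above can be illustrated with a minimal sketch. The function names, window size, and sample data here are hypothetical and chosen only for illustration; the moving average shows the sampling-phase shift, and the line fit shows a least-squares analysis of the fit error that knows nothing about domain constraints.

```python
def moving_average(samples, window):
    """Sliding-window (causal) moving average. Each output value
    averages `window` raw samples, which smears sharp features and
    shifts the apparent phase of the smoothed signal."""
    return [
        sum(samples[i:i + window]) / window
        for i in range(len(samples) - window + 1)
    ]

def least_squares_line(xs, ys):
    """Least-squares fit of a straight line y = a + b*x, minimizing
    the squared fit error over all points. Note that nothing here
    enforces domain constraints such as a minimal physically
    meaningful delta."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# A step in the raw data: the moving average spreads the step across
# the window, so the transition appears earlier and more gradual than
# it actually occurred.
data = [0.0] * 5 + [1.0] * 5
smoothed = moving_average(data, window=3)
```

In the smoothed output, the abrupt step at the sixth raw sample becomes a ramp beginning two samples earlier, which is the phase shift complained of above.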
Therefore, there exists a need for an improved data filtering and smoothing process that can be used in a variety of environments, that appeals to intuition, and that provides properties of benefit to the researcher.