Computer algebra systems executed on computing devices such as personal computers and graphing calculators are configured to graph mathematical concepts such as functions, equations, and parametric functions. When creating these graphs, particularly on computing devices that have high resolution displays, many computer algebra systems use sampling to reduce the number of data points required to render a graph, which in turn increases the speed at which the graph can be rendered. This sampling can result in approximation errors when the computer algebra system makes incorrect assumptions about the continuity of the curve or surface between sample points. Perhaps the most common of these approximation errors appears as a line or surface connecting two points or regions that are in fact separated by an asymptote and thus should not be connected.
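The sampling behavior described above can be sketched in a few lines. The following is an illustrative sketch only, not the method of any particular computer algebra system: the function names (`sample`, `segments`) and the jump-magnitude threshold are assumptions introduced here for demonstration. Naively connecting consecutive samples of f(x) = 1/(x − 1) produces a spurious segment across the vertical asymptote at x = 1, while breaking the polyline at large jumps in the sampled values avoids it.

```python
# Illustrative sketch (assumed names and heuristic, not a specific
# system's implementation) of how uniform sampling produces the
# asymptote-crossing artifact, and one simple way to suppress it.

def sample(f, x0, x1, n):
    """Return n evenly spaced (x, f(x)) sample points on [x0, x1]."""
    step = (x1 - x0) / (n - 1)
    return [(x0 + i * step, f(x0 + i * step)) for i in range(n)]

def segments(points, jump_threshold=None):
    """Group consecutive samples into polyline segments.

    With jump_threshold=None every neighbouring pair is connected,
    reproducing the approximation error; with a finite threshold the
    polyline is broken wherever the jump in f(x) exceeds it.
    """
    segs, current = [], [points[0]]
    for prev, cur in zip(points, points[1:]):
        if jump_threshold is not None and abs(cur[1] - prev[1]) > jump_threshold:
            segs.append(current)   # end the segment before the jump
            current = []
        current.append(cur)
    segs.append(current)
    return segs

# The sample grid below deliberately avoids landing on x == 1 exactly.
pts = sample(lambda x: 1.0 / (x - 1.0), 0.0, 2.0, 10)
naive = segments(pts)                       # one polyline crossing the asymptote
split = segments(pts, jump_threshold=10.0)  # broken into two branches
```

With the naive connection the renderer draws a single polyline whose middle segment spans the asymptote, which is exactly the artifact of FIG. 1; the threshold-based break yields two separate branches instead. A fixed threshold is only a crude heuristic: it misclassifies legitimately steep but continuous curves, which is why more careful continuity analysis is desirable.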
A first example of such an approximation error in a two dimensional graph is illustrated in FIG. 1. The approximation error in FIG. 1 is due to an incorrect assumption that was made about the continuity of the curve between two points on opposite sides of an asymptote. The result is that the graph of the approximated curve, shown in solid lines, greatly deviates from the actual curve, shown in dotted lines. A second example of such an approximation error in a three dimensional graph of a surface is shown in FIG. 2. The approximation error of FIG. 2 appears as a plane connecting two disjoint parts of the surface. It will be appreciated that there is no such vertical plane in the actual surface, but a plane is displayed in the graph due to the approximation error.
These approximation errors can be frustrating to students, engineers, and scientists alike, and present a barrier to understanding mathematical concepts. Further, these errors may undesirably erode trust in the accuracy of computer algebra systems.