An oscilloscope is an electronic instrument used to detect, analyze and display voltage signals. A logic analyzer is an electronic instrument used to detect, analyze and display digital voltage signals. (Generally, digital voltage signals may be regarded as a sub-set of voltage signals. Typically, a digital voltage signal conveys binary information, that is, the signal is either logic "high" or logic "low". For instance, according to one convention, if the signal voltage is above a given threshold voltage value it is "high", whereas if it is below the threshold value it is "low". In conventional binary logic, "high" represents a "1" and "low" represents a "0".) Both instruments acquire input signals and display their representative waveforms. The instruments' display screens are essentially windows through which users can view acquired signal waveforms.
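The threshold convention described above can be sketched as follows. This is a minimal illustration only; the 1.4 V threshold and the sample values are assumptions chosen for the example, not values taken from the text.

```python
# Illustrative sketch: classifying sampled voltages as logic "high" (1)
# or "low" (0) against a threshold. The 1.4 V threshold is an assumed
# example value, not a figure from the text.
THRESHOLD_V = 1.4

def to_logic_level(voltage):
    """Return 1 if the voltage is above the threshold, else 0."""
    return 1 if voltage > THRESHOLD_V else 0

samples = [0.2, 3.1, 2.8, 0.4, 0.1, 3.3]  # assumed sample voltages
bits = [to_logic_level(v) for v in samples]
print(bits)  # [0, 1, 1, 0, 0, 1]
```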
The oscilloscope displays an input voltage signal's waveform as a function of voltage-versus-time, typically with voltage amplitude measured along a vertical axis and time measured along a horizontal axis. Like an oscilloscope, the logic analyzer displays a digital signal's waveform as a function of voltage-versus-time, with voltage amplitude measured along a vertical axis and time measured along a horizontal axis. Unlike an oscilloscope, however, the logic analyzer does not display the signal with high vertical (that is, voltage) resolution. Rather, for a given signal the logic analyzer waveform will resemble a square wave and therefore be either "high" or "low". Since the signal carries binary information, that is, 1's and 0's, it is the signal's transition above or below the threshold value, rather than its absolute value, that is important. Therefore, the logic analyzer's square-wave waveform display mode is both adequate and ideal for digital signals.
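Because only the transitions matter, a logic analyzer need only note where the bit stream changes level. A minimal sketch of that idea, with an assumed bit stream for illustration:

```python
# Illustrative sketch: for a digital signal only transitions across the
# threshold matter, so it suffices to record the sample indices at which
# the logic level changes.
def transitions(bits):
    """Return indices at which the logic level changes from the prior sample."""
    return [i for i in range(1, len(bits)) if bits[i] != bits[i - 1]]

print(transitions([0, 0, 1, 1, 1, 0]))  # [2, 5]
```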
Certainly, the oscilloscope, as opposed to the logic analyzer, is the more general-purpose instrument, and most engineers, particularly electrical engineers, are familiar with its function and operation. The operational user-interface for the oscilloscope comprises two major controls: seconds-per-division, also known as range, and delay. Typically, the display screen of an oscilloscope is divided into equal-size divisions along both the voltage amplitude and the time axes. The seconds-per-division control allows the user to change the amount of time displayed across the time axis of the display screen. The total time across the screen is known as the range. For any given display, decreasing seconds-per-division increases display resolution in a manner analogous to increasing the magnification of a microscope. The delay control positions the display window in time relative to the trigger event. The trigger defines the conditions which the input signal must meet before it is acquired by the instrument. By adjusting the oscilloscope's delay, the user can move the waveform display window forward or back in time relative to the trigger.
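The relationship among seconds-per-division, range and delay can be sketched as below. The ten-division screen width is an assumption for illustration (common on real instruments, but not stated in the text).

```python
# Illustrative sketch: seconds-per-division times the number of screen
# divisions gives the range (total time across the screen); delay shifts
# the displayed window relative to the trigger event at t = 0.
DIVISIONS = 10  # assumed screen width in divisions

def display_window(seconds_per_division, delay):
    """Return (start, end) of the displayed time span, relative to trigger."""
    range_s = seconds_per_division * DIVISIONS  # total time across screen
    return (delay, delay + range_s)

print(display_window(2.0, 0.0))   # (0.0, 20.0) -- window starts at trigger
print(display_window(2.0, -5.0))  # (-5.0, 15.0) -- window moved back in time
```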
The function and operation of the logic analyzer, however, are not as familiar to most engineers. Typically, the logic analyzer displays the digital waveforms of many individual digital signals at once, such as the signals on the multiple lines of a data-bus or an address-bus. The logic analyzer samples an input signal, digitizes the samples, stores the digitized samples in memory and maps the stored digital values into a representative square-wave waveform on the display screen. The typical logic analyzer can show sixteen digital waveforms at once. Thus, on such an analyzer the signals on the sixteen lines of a 16-bit address bus would be shown as sixteen individual square-wave waveforms positioned from the top to the bottom of the display screen. The user of the logic analyzer can then view the sixteen waveforms simultaneously in a single display. Typically, the user is interested in visually comparing the transitions from high to low of numerous waveforms, such as a comparison of the transitions on certain address lines of a microprocessor with the transitions on certain control lines.
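A plausible way to picture the storage and display scheme just described: each sample of a sixteen-line bus can be held as one 16-bit word, one bit per channel, and each channel then drawn as its own square-wave row. The sample words below are assumed values for illustration only.

```python
# Illustrative sketch: one stored 16-bit word per sample, one bit per
# channel; each channel is rendered as a square-wave row ("--" high,
# "__" low). Sample words are assumed example data.
CHANNELS = 16

def channel_bits(words, channel):
    """Extract one channel's bit stream from stored sample words."""
    return [(w >> channel) & 1 for w in words]

words = [0x0001, 0x0003, 0x0002, 0x0000]  # four stored samples (assumed)
for ch in range(2):  # show the first two of the sixteen channels
    row = ''.join('--' if b else '__' for b in channel_bits(words, ch))
    print(f"ch{ch:02d} {row}")
```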
The major operational user-interface controls for a logic analyzer are: sample-period, magnification, magnify-about, magnify-about marker-movement, start/center/end, and hardware-delay. The sample-period control allows the user to set the sample period of the sampling circuit which acquires and digitizes samples of the input signal. The magnification control allows the user to select various magnifications of the display window. The magnify-about control allows the user to select one of two markers (usually an "x" and an "o") located in the display window; magnification can only occur about the location of a magnify-about marker. The magnify-about marker-movement control allows the user to position magnify-about markers in the display window. The start/center/end control allows the user to define where, in the digitized samples that are stored in memory, the trigger condition is located. (Similar to an oscilloscope trigger, the logic analyzer trigger defines a condition which the input signal(s) must meet before the instrument begins acquiring samples of input signal(s). The logic analyzer user can define the trigger using so-called edge, pattern and glitch controls.) Finally, the hardware-delay control allows the user to specify a certain delay time from trigger that the sample acquisition hardware will wait before acquiring samples. These typical logic analyzer operational controls are fully explained in the Hewlett-Packard 1984 Operating and Programming Manual for the Model 1630A/D/G Logic Analyzer. These controls give the user great flexibility, but they also require a certain degree of understanding of the logic analyzer's sampling hardware function and the schemes for storage of digitized samples in memory. The relatively large number of controls therefore can make the logic analyzer seem confusing to a person who is familiar with the oscilloscope.
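The effect of the start/center/end control can be sketched as a split of sample memory into pre-trigger and post-trigger portions. The 1024-sample memory depth is an assumption for illustration, not a figure from the text.

```python
# Illustrative sketch: the start/center/end control places the trigger at
# the start, middle, or end of sample memory, which determines how many
# stored samples precede versus follow the trigger. Memory depth is an
# assumed example value.
MEMORY_DEPTH = 1024  # assumed number of stored samples

def trigger_split(position):
    """Return (pre_trigger, post_trigger) sample counts for a trigger position."""
    fractions = {"start": 0.0, "center": 0.5, "end": 1.0}
    pre = int(MEMORY_DEPTH * fractions[position])
    return (pre, MEMORY_DEPTH - pre)

print(trigger_split("start"))   # (0, 1024) -- all samples after trigger
print(trigger_split("center"))  # (512, 512)
print(trigger_split("end"))     # (1024, 0) -- all samples before trigger
```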