1. Field of the Invention
This invention relates generally to rotating magnetic storage devices, and more particularly to analog to digital conversion of signals related to those storage devices.
2. Description of the Related Art
Analog-to-digital converters (ADCs) are widely used in digital devices to interface between the analog and digital worlds. An ADC converts an analog signal, such as a voltage or a current, into a digital signal that can be further processed, stored, and decimated using digital computers. For example, ADCs are used in communications, appliances, signal processing, computers, and other fields that require conversion of analog signals into digital form.
As is well known in the art, the ADC encodes an analog input signal into a digital output signal of a predetermined bit length, N. The encoding of the analog input V_A into a digital output signal of N bits is typically approximated as a binary fraction of a full-scale output voltage, V_SS. Hence, the output of the converter corresponds to an N-bit digital word D given as:

D = V_A / V_SS = (B_1 / 2^1) + (B_2 / 2^2) + . . . + (B_N / 2^N),
where B_1, B_2, . . . , B_N are the binary bit coefficients, each having a value of either one or zero. In this setting, the coefficient B_1 represents the most significant bit, while B_N represents the least significant bit of the digital word. The binary bit coefficients are obtained from the output of the ADC.
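The binary-fraction encoding above can be sketched as follows. This is an illustrative model only; the function name and values are hypothetical and not taken from any particular converter:

```python
def encode(v_a, v_ss, n_bits):
    """Quantize analog input v_a against full-scale v_ss into an n_bits word.

    Returns the digital word D such that D / 2**n_bits approximates
    V_A / V_SS; the coefficient B_1 (the MSB) is the top bit of D.
    """
    frac = min(max(v_a / v_ss, 0.0), 1.0)  # clamp the binary fraction to [0, 1]
    d = int(frac * (1 << n_bits))
    return min(d, (1 << n_bits) - 1)       # full scale maps to all ones

# A 3.3 V input against a 5.0 V full scale on a 10-bit converter:
print(encode(3.3, 5.0, 10))  # → 675, i.e. 675/1024 ≈ 0.66 = 3.3/5.0
```

Each bit B_k contributes v_ss / 2^k to the reconstructed voltage, which is why the MSB carries half of full scale.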
Conventional ADCs often use successive approximation techniques to convert an analog signal into a digital signal. In successive approximation methods, the analog input voltage is approximated one bit at a time to arrive at the output digital signal. For example, for a 10-bit result, ten approximations take place, one to set each of the bits.
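The one-bit-per-trial loop of a successive approximation converter can be sketched as below; it mimics the comparator-plus-DAC decision sequence, with all names and values chosen for illustration only:

```python
def sar_convert(v_in, v_ref, n_bits):
    """Successive approximation: decide one bit per trial, MSB first."""
    d = 0
    for bit in range(n_bits - 1, -1, -1):  # from B_1 (MSB) down to B_N (LSB)
        trial = d | (1 << bit)             # tentatively set this bit
        # Compare the input against the DAC output for the trial code;
        # keep the bit only if the input is at least that large.
        if v_in >= trial * v_ref / (1 << n_bits):
            d = trial
    return d

# Ten trials produce the 10-bit result, one trial per bit:
print(sar_convert(3.3, 5.0, 10))  # → 675
```

Each iteration halves the remaining uncertainty, so N trials pin down an N-bit word.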
Typically, conventional ADCs allot a fixed number of clock cycles to every bit, so that the same amount of time is used for each bit approximation. In practice, however, the number of clock cycles needed to set a bit usually differs from bit to bit. These ADCs therefore select the per-bit period based on the worst-case bit in the approximation. For example, the most significant bit may require 16 clock cycles to convert; those 16 cycles are then used to set each subsequent bit, even though the remaining bits may not require as many clock cycles. This means that a substantial amount of time is wasted on the remaining bits, and converting an analog signal to digital data may consume significantly more time than necessary.
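The cost of sizing every bit for the worst case can be seen with a small back-of-the-envelope comparison. The per-bit cycle counts below are hypothetical, chosen only to illustrate that the MSB typically needs the longest settling time:

```python
# Hypothetical settling requirements (clock cycles) for each bit of a
# 10-bit conversion, MSB first; the MSB settles the largest DAC step.
needed = [16, 12, 10, 8, 7, 6, 5, 4, 3, 2]

fixed_scheme = max(needed) * len(needed)  # every bit billed at the worst case
tailored = sum(needed)                    # each bit gets only what it needs

print(fixed_scheme, tailored)  # → 160 73
```

Under these assumed numbers, tailoring the clock cycles to each bit cuts the conversion from 160 cycles to 73, which is the motivation for the per-bit timing described next.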
Thus, what is needed is an ADC that can convert individual data bits in clock cycles tailored to the timing requirements of each individual bit, thereby reducing conversion time. What is also needed is an ADC whose per-bit conversion times are programmable to accommodate the differing timing requirements of different converters due to manufacturing process variations.