Inertial sensors such as gyroscopes and accelerometers are used in a variety of applications to detect and measure inertial motion in one or more directions. In the design of some inertial navigation systems (INS), for example, such devices are used to sense slight variations in the linear and rotational motion of an object traveling through space. Typically, such motion is sensed by detecting and measuring the displacement of a resonating structure such as a number of cantilevered beams or interdigitated comb fingers. In an inertial sensor employing a MEMS-type gyroscope and accelerometer, for example, a number of oscillating proof masses can be used to sense displacement and acceleration in response to movement of the device about an input or “rate” axis. In some designs, one or more of the gyroscopes and/or accelerometers can be provided as part of an inertial measurement unit (IMU) that can be used to measure inertial motion and acceleration in multiple directions.
Inertial sensors are often used in environments that inherently subject them to significant vibration. When mounted on aircraft and weapons, for example, significant vibration-induced bias errors can occur as a result of the constantly changing vibratory environment, affecting the sensor's ability to detect and measure subtle changes in motion. Such environments are especially problematic for systems employing microelectromechanical systems (MEMS) sensors, which typically rely on vibratory mechanisms for rate and acceleration sensing. In a commonly used MEMS resonant beam accelerometer having a nominal one milli-g accuracy, for example, the presence of a constantly changing vibratory environment may produce bias shifts on the order of several milli-g. For those inertial sensors exhibiting vibration sensitivity, the most common effect is a slowly varying, low-frequency error component that changes as a function of the applied vibration spectrum. Other errors may be present, however, depending on the application.
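One way such a vibration-induced bias can arise is through rectification of the vibration by a small even-order nonlinearity in the sensor's response: a zero-mean vibration input then averages to a nonzero output. The sketch below is a minimal illustration of this mechanism, assuming a hypothetical quadratic nonlinearity coefficient `k2`; the numbers are illustrative, not those of any particular sensor.

```python
import math

def accel_output(a_true, k2=0.01):
    """Illustrative sensor model: ideal response plus a small
    second-order nonlinearity (a_true in g, k2 in g/g^2)."""
    return a_true + k2 * a_true ** 2

def mean_bias(vib_amp_g, k2=0.01, n=10000):
    """Average the sensor output over one full vibration cycle with
    zero true static acceleration; any nonzero mean is rectified bias."""
    total = 0.0
    for i in range(n):
        a = vib_amp_g * math.sin(2 * math.pi * i / n)
        total += accel_output(a, k2)
    return total / n

# A 1 g-amplitude sine through a k2 = 0.01 g/g^2 nonlinearity
# rectifies to roughly k2/2 = 0.005 g, i.e. 5 milli-g of bias.
bias = mean_bias(1.0)
```

Because the rectified bias scales with the square of the vibration amplitude, a change in the vibration spectrum directly shifts the bias, consistent with the milli-g-level shifts described above.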
To overcome bias shifts resulting from vibration changes in the environment, many inertial sensors employ a sensor model and calibration process that is independent of the actual operating environment. In the case of an inertially guided weapon launched from an aircraft equipped with an INS, for example, an in-flight transfer alignment and calibration procedure is typically performed prior to release of the weapon. During this period, velocity differences (or related quantities) between the aircraft INS and the weapon INS may be processed by a Kalman filter to initialize the attitude and heading of the weapon INS and to estimate errors in the IMU, including any gyroscope and/or accelerometer bias errors. The vibration spectrum present at the weapon IMU is strongly driven by its captive-carry environment, such as a wing-store station or weapon bay. During captive carry, the weapon's inertial sensor bias errors will be affected by this vibration environment. After release of the weapon, however, a significantly different vibration environment will typically exist, leading to gyroscope and accelerometer bias shifts that can cause vibration-induced errors in the sensor output. In some situations, other vibratory and non-vibratory factors can also lead to such bias shifts in the sensor. Depending on their magnitude, these bias shifts can negate any benefit of the pre-launch weapon IMU calibration.
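The velocity-matching step described above can be sketched as a two-state Kalman filter whose states are the velocity-difference error and a constant accelerometer bias. All numerical values below (time step, noise levels, and the assumed 0.005 m/s², roughly 0.5 milli-g, true bias) are illustrative assumptions, not parameters of any actual aircraft or weapon system.

```python
import numpy as np

dt = 0.1                        # velocity-match interval, s (assumed)
F = np.array([[1.0, dt],        # velocity error integrates the bias
              [0.0, 1.0]])      # bias modeled as a random constant
H = np.array([[1.0, 0.0]])      # only the velocity difference is observed
Q = np.diag([1e-6, 1e-10])      # small process noise (assumed values)
R = np.array([[0.01]])          # measurement noise variance, (m/s)^2

x = np.zeros((2, 1))            # initial estimate: no error, no bias
P = np.diag([1.0, 1e-2])        # initial uncertainty (assumed)

rng = np.random.default_rng(0)
true_bias = 0.005               # assumed true bias, m/s^2 (~0.5 milli-g)
dv_true = 0.0
for _ in range(600):            # one minute of captive-carry matching
    dv_true += true_bias * dt                # bias drives velocity error
    z = dv_true + rng.normal(0.0, 0.1)       # noisy velocity difference
    # Kalman predict step
    x = F @ x
    P = F @ P @ F.T + Q
    # Kalman update step
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

estimated_bias = float(x[1, 0])  # converges toward the true bias
```

The sketch also suggests why a post-release bias shift is harmful: the filter calibrates the bias present during captive carry, and a different bias after release invalidates that estimate.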