The present invention relates generally to wheel slip/slide control systems for alternating current induction motor powered traction vehicles, such as locomotives or transit vehicles, and, more particularly, to a method for calibrating wheel diameter during operation of a vehicle to enable detection of wheel slip or slide.
Locomotives and transit vehicles as well as other large traction vehicles are commonly powered by electric traction motors coupled in driving relationship to one or more axles of the vehicle. Locomotives and transit vehicles generally have at least four axle-wheel sets per vehicle with each axle-wheel set being connected via suitable gearing to the shaft of a separate electric motor commonly referred to as a traction motor. In the motoring mode of operation, the traction motors are supplied with electric current from a controllable source of electric power (e.g., an engine-driven traction alternator) and apply torque to the vehicle wheels which exert tangential force or tractive effort on the surface on which the vehicle is traveling (e.g., the parallel steel rails of a railroad track), thereby propelling the vehicle in a desired direction along the right of way. Alternatively, in an electrical braking mode of operation, the motors serve as axle-driven electrical generators, torque is applied to their shafts by their respectively associated axle-wheel sets which then exert braking effort on the surface, thereby retarding or slowing the vehicle's progress. In either case, good adhesion between each wheel and the surface is required for efficient operation of the vehicle.
It is well known that maximum tractive or braking effort is obtained if each powered wheel of the vehicle is rotating at such an angular velocity that its actual peripheral speed is slightly higher (motoring) or slightly lower (braking) than the true vehicle speed (i.e., the linear speed at which the vehicle is traveling, usually referred to as "ground speed" or "track speed"). The difference between wheel speed and track speed is referred to as "slip speed." There is a relatively low limit value of slip speed at which peak tractive or braking effort is realized. This value, commonly known as maximum "creep speed," is a variable that depends on track speed and rail conditions. So long as the maximum creep speed is not exceeded, the vehicle will operate in a stable microslip or creep mode. If wheel-to-rail adhesion tends to be reduced or lost, some or all of the vehicle wheels may slip excessively, i.e., the actual slip speed may be greater than the maximum creep speed. Such a wheel slip condition, which is characterized in the motoring mode by one or more spinning axle-wheel sets and in the braking mode by one or more sliding or skidding axle-wheel sets, can cause accelerated wheel wear, rail damage, high mechanical stresses in the drive components of the propulsion system, and an undesirable decrease of tractive (or braking) effort.
Many different systems are disclosed in the prior art for automatically detecting and recovering from undesirable wheel slip conditions. Typically, differential speeds between axle-wheel sets, the rate of change of wheel speed, or a combination of these two measurements are used to detect wheel slip. Wheel speeds are monitored and, if found to exceed predetermined differentials or rates of change, power to the motors is reduced in an attempt to bring the wheel speed to a value at which traction is regained.
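The two threshold comparisons just described can be sketched as follows. This is a minimal illustration rather than any particular prior-art system: the threshold values, the function name, and the use of the slowest axle as an approximation of track speed are all assumptions made for the example.

```python
# Illustrative sketch of the two prior-art slip-detection criteria:
# (1) differential speed between axle-wheel sets, and
# (2) rate of change of wheel speed.
# Threshold values below are hypothetical, not taken from any real system.

DIFF_THRESHOLD_RPM = 20.0      # assumed differential-speed limit (wheel RPM)
RATE_THRESHOLD_RPM_S = 50.0    # assumed rate-of-change limit (wheel RPM/s)

def detect_slip(axle_rpms, prev_axle_rpms, dt):
    """Return indices of axles flagged as slipping.

    axle_rpms / prev_axle_rpms: wheel RPM per axle at the current and
    previous sample instants; dt: sample interval in seconds.
    """
    reference = min(axle_rpms)  # slowest axle approximates true track speed
    flagged = []
    for i, (rpm, prev) in enumerate(zip(axle_rpms, prev_axle_rpms)):
        differential = rpm - reference
        rate = (rpm - prev) / dt
        if differential > DIFF_THRESHOLD_RPM or rate > RATE_THRESHOLD_RPM_S:
            flagged.append(i)
    return flagged
```

A spinning axle shows up both as a differential against its neighbors and as an abnormal acceleration; a control system detecting either condition would reduce motor power until traction is regained.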
In general, locomotive speed or tangential wheel speed can be calculated from measured motor rotor revolutions per minute ("RPM") values given the diameter of the associated wheel. Conventionally, a speed sensor or revolution counter is coupled to sense the rotational speed of an output shaft of each drive motor. The sensed speed is then converted to a value representative of wheel RPM by multiplying the sensed value in RPM by the gear ratio between the drive motor shaft and the wheel/axle set. Tangential wheel speed is then calculated from wheel RPM. For example, a standard 42 inch locomotive wheel has a circumference C equal to π times diameter D, or about 131.95 inches, so that one wheel revolution advances the vehicle by 131.95 inches, assuming zero slip. From this it can be readily determined that a wheel RPM of 200 will produce a locomotive speed of about 25 MPH or, more precisely, about 24.9899 MPH. If the actual wheel diameter is 41.5 inches, the true velocity is 24.6924 MPH, so the assumed 42 inch diameter introduces an error of about 0.3 MPH. Because the control system regulates based on the assumed ideal diameter, this speed error produces slip, leading to a loss of tractive effort as well as additional wear on the wheels and rails. More importantly, if wheel calibration is in error, the control system will derate (reduce the available tractive or braking effort) when it is not necessary, since the system will detect a speed error indicative of a wheel slip or slide.
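The conversion just described, from sensed motor RPM through the gear ratio to tangential wheel speed, can be sketched as follows. The helper name and interface are hypothetical; per the passage above, the gear ratio is taken here as wheel revolutions per motor shaft revolution, so that multiplying the sensed RPM by the gear ratio yields wheel RPM.

```python
import math

def wheel_speed_mph(motor_rpm, gear_ratio, wheel_diameter_in):
    """Convert sensed motor shaft RPM to vehicle speed in MPH, assuming zero slip.

    gear_ratio: wheel revolutions per motor shaft revolution, so that
    multiplying the sensed RPM by the gear ratio gives wheel RPM.
    """
    wheel_rpm = motor_rpm * gear_ratio
    # One wheel revolution advances the vehicle by pi * D inches.
    inches_per_minute = wheel_rpm * math.pi * wheel_diameter_in
    return inches_per_minute * 60.0 / 63360.0  # 63,360 inches per mile
```

With a gear ratio of 1 (i.e., passing wheel RPM directly) and a wheel RPM of 200, a 42 inch wheel gives about 24.9899 MPH while a 41.5 inch wheel gives about 24.6924 MPH, reproducing the roughly 0.3 MPH error noted above.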
The need for wheel diameter calibration has been recognized in the art. Typically, a locomotive is provided with an auxiliary ground speed sensor such as a radar unit (similar to the type used by police for monitoring automobile speed) or a satellite-based sensor (generally referred to as a global positioning system or GPS sensor). The ground speed signal from one of these sensors is compared to the speed determined from the motor shaft RPM sensor value, and any error is corrected by adjusting the calculated value of wheel diameter. One problem with the prior art systems is that the comparison or calibration could be performed only when the locomotive was in a coast mode, i.e., the traction motors were not energized for either powering or braking of the locomotive. Further, it was generally necessary for the locomotive to remain in such a coast mode for an extended, continuous time in order to complete the calibration. However, there are many instances in which the opportunity to operate a locomotive for an extended period in a coast mode is simply impractical. Accordingly, it would be advantageous to provide a wheel diameter calibration system which does not require coast mode operation and which does not require an extended, continuous time to achieve calibration.
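The prior-art calibration step, comparing a radar or GPS ground speed against the RPM-derived speed and adjusting the stored wheel diameter accordingly, can be sketched as follows. The function name and interface are hypothetical, and, as the passage above notes, the computation is only valid when actual slip is essentially zero, which is why the prior art restricts calibration to the coast mode.

```python
import math

def calibrate_wheel_diameter(ground_speed_mph, wheel_rpm):
    """Solve ground_speed = wheel_rpm * pi * D * 60 / 63360 for diameter D (inches).

    Valid only when actual slip is essentially zero (e.g., coasting),
    since any slip makes wheel RPM inconsistent with true ground speed.
    """
    inches_per_hour = ground_speed_mph * 63360.0  # 63,360 inches per mile
    return inches_per_hour / (wheel_rpm * math.pi * 60.0)
```

For example, a measured ground speed of 24.6924 MPH at a wheel RPM of 200 yields a diameter of about 41.5 inches, correcting the assumed 42 inch value from the earlier example.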