Today's electric vehicle drives require increasing levels of torque and power to meet aggressive vehicle acceleration and grade-climbing requirements. To be commercially practical, electric vehicle drives must minimize cost, size, and weight. During the last decade, as higher voltage semiconductor devices have become available, industry has increased drive system power density by raising the system voltage (i.e., the dc link voltage) from approximately 100 V to 300 V, thereby allowing higher voltage-rated, but lower current-rated, power semiconductors to be used in the inverter. This trade-off advantageously minimizes the size and cost of electric drive systems. Except for the battery, the inverter is the most expensive subsystem in the total electric drive system, and the power semiconductor switches are generally the most expensive components in the inverter. At the typical voltage levels of these drives, power semiconductor cost increases more rapidly as a function of current than of voltage.
Increasing the dc system voltage has resulted in significant performance improvements while reducing costs. However, increasing the dc system voltage also requires the battery to be designed for relatively high voltage (typically 300 V nominal), which is accomplished both by designing lower current capacity cells and by connecting large numbers of small cells (e.g., 2 V) in series, disadvantageously reducing battery reliability and life due to cell-to-cell capacity mismatch. The larger the number of cells connected in series, the greater the probability of cell-to-cell variations. Battery weight constraints also limit the number of series strings of cells that can be connected in parallel, further reducing reliability.
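The reliability penalty of long series strings can be illustrated with a minimal sketch, not taken from the source: if cells are assumed to fail independently (a simplification), a series string works only when every cell works, so string reliability decays geometrically with cell count. The cell reliability value and cell counts below are hypothetical numbers chosen for illustration, derived from the 2 V cell and 100 V / 300 V figures mentioned above.

```python
def string_reliability(cell_reliability: float, n_cells: int) -> float:
    """Probability that a series string of n_cells identical,
    independently failing cells is fully functional (r ** n)."""
    return cell_reliability ** n_cells

# Hypothetical figures: 2 V cells, each 99.9% reliable over some interval.
cell_r = 0.999
r_300v = string_reliability(cell_r, 300 // 2)  # ~150 cells in series
r_100v = string_reliability(cell_r, 100 // 2)  # ~50 cells in series

print(f"300 V string reliability: {r_300v:.3f}")
print(f"100 V string reliability: {r_100v:.3f}")
```

Under these assumed numbers the 300 V string is markedly less reliable than the 100 V string, illustrating why tripling the series cell count compounds the effect of cell-to-cell variation.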
Accordingly, it is desirable to provide a solution to the high-voltage battery reliability problem and to increase efficiency in ac electric drive systems by decoupling the energy storage system from the dc link voltage.