With the advent of battery-powered portable devices and the mandatory adoption of power factor correction, noninverting buck-boost converters are attracting considerable attention. Conventional two-switch or four-switch noninverting buck-boost converters select their operating mode by comparing the input and output voltage magnitudes. This mode-selection criterion can cause large output voltage transients in the region where the input and output voltages are close to each other. Because of the voltage drops introduced by parasitic components, simply comparing the magnitudes of the input and output voltages is not sufficient for mode selection. In addition, the difference between the minimum and maximum effective duty cycles of the controller output and those of the switching device produces a discontinuity at the instant of mode change. Moreover, the different output-voltage-versus-duty-cycle characteristics of the buck and boost operating modes also contribute to the output voltage transients. In this paper, the effect of the discontinuity in the effective duty cycle, which arises from the device switching times at the mode change, is analyzed, and a technique to compensate for the output voltage transient caused by this discontinuity is proposed. To further mitigate output transients and obtain a linear input/output voltage characteristic in both buck and boost modes, linearization of the DC gain of the large-signal model in boost operation is also analyzed. Analytical, simulation, and experimental results are presented to validate the proposed theory.
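As a minimal illustration of why the two modes behave differently around the transition point, consider the ideal lossless continuous-conduction-mode DC gains (these textbook relations are assumed here for context and are not taken from the paper; the paper's large-signal model and linearization may differ):
\begin{align}
  M_{\mathrm{buck}}(D)  &= \frac{V_o}{V_{in}} = D, \\
  M_{\mathrm{boost}}(D) &= \frac{V_o}{V_{in}} = \frac{1}{1-D},
\end{align}
so the buck-mode gain is linear in the duty cycle $D$ while the boost-mode gain is not. One hedged sketch of a linearizing pre-distortion is to map a commanded gain $M^{*}$ to the boost duty cycle as
\begin{equation}
  D_{\mathrm{boost}} = 1 - \frac{1}{M^{*}},
\end{equation}
which makes the output approximately linear in the commanded gain and continuous with the buck-mode characteristic at $M^{*} = 1$ (where $D_{\mathrm{buck}} = 1$ and $D_{\mathrm{boost}} = 0$), up to the minimum/maximum effective duty-cycle limits discussed above.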