Abstract

In recent years, digital current-mode control has gained popularity due to its technical benefits. However, designers face difficulties in controller design because of the delays introduced by the analog-to-digital (A/D) converter and the controller computation. In a software-based digital controller implementation, the cumulative delay of A/D conversion and controller computation may be comparable to the switching period. This brief shows that the one-cycle delay inherent in digital implementation in fact enhances the stability margin of a mixed-signal current-mode-controlled buck converter. By deriving a discrete-time model, we obtain the stability region in the parameter space and show that, if the controller gain is set within a specific range, the software-based implementation can achieve stable periodic (period-1) behavior without ramp compensation even when the duty ratio exceeds 0.5. The theoretical results are validated on a buck converter prototype with a digital controller realized on an FPGA device.
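To illustrate the qualitative point, the following is a minimal sketch, not the paper's model: a simplified discrete-time map of a buck converter's inductor current under a hypothetical proportional digital current controller whose duty update takes effect one switching cycle later (the one-cycle delay). All parameter values and the control law here are illustrative assumptions.

```python
# Illustrative sketch (assumed model, not the paper's): sampled inductor
# current of a buck converter with a proportional digital current
# controller; the duty computed from sample n is applied in cycle n+1.

Vin, Vout = 20.0, 12.0     # input/output voltage [V] -> steady duty 0.6 (> 0.5)
L = 100e-6                 # inductance [H] (assumed)
T = 10e-6                  # switching period [s] (assumed)
Iref = 5.0                 # current reference [A] (assumed)
K = 0.2                    # controller gain [1/A] (assumed, inside stable range)

m1 = (Vin - Vout) / L      # current rise slope while the switch is on [A/s]
m2 = Vout / L              # current fall slope while the switch is off [A/s]
Dnom = Vout / Vin          # nominal steady-state duty ratio

i = 4.0                    # initial current sample [A]
d = Dnom                   # duty applied during the first cycle

for _ in range(100):
    # Controller samples i now, but the result only acts NEXT cycle:
    d_next = min(max(Dnom + K * (Iref - i), 0.0), 1.0)
    # End-of-cycle current using the previously computed duty d:
    i = i + m1 * d * T - m2 * (1.0 - d) * T
    d = d_next

print(i, d)  # settles on the period-1 orbit: i -> Iref, d -> Dnom
```

Linearizing this toy map gives the characteristic equation λ² − λ + K(Vin/L)T = 0, so the loop is stable for 0 < K(Vin/L)T < 1: with the gain in that range, the trajectory converges to a period-1 orbit at a duty ratio above 0.5 with no compensating ramp, mirroring (under these assumptions) the behavior the abstract describes.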
