Abstract
In recent years, digital current-mode control has gained popularity due to its technical benefits. However, designers face difficulties in controller design because of the delays introduced by analog-to-digital (A/D) conversion and controller computation. In a software-based digital controller implementation, the cumulative delay of A/D conversion and controller computation may be comparable to the switching period. This brief shows that the one-cycle delay in digital implementation in fact enhances the stability margin of a mixed-signal current-mode controlled buck converter. By deriving a discrete-time model, we obtain the stability region in the parameter space and show that, if the controller gain is set within a specific range, the software-based implementation can achieve stable periodic (period-1) behavior without ramp compensation even when the duty ratio exceeds 0.5. The theoretical results are validated on a buck converter prototype with a digital controller realized on an FPGA device.
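For context, the result above contrasts with the classical analog peak current-mode controller, whose period-1 orbit is well known to lose stability via period doubling at duty ratios above 0.5 unless a compensating ramp is added. The sketch below is illustrative only (it is not the discrete-time model derived in this brief, and all parameter values are hypothetical): it iterates the standard one-cycle valley-current map and checks the textbook small-signal criterion, under which perturbations scale by -m2/m1 per cycle.

```python
# Illustrative sketch of the classical sampled-data map for analog peak
# current-mode control (NOT the model derived in this brief).
# All numerical values are hypothetical.

def peak_cmc_map(i_k, i_ref, m1, m2, T):
    """One switching-period map: inductor current rises at slope m1 until it
    hits the peak reference i_ref, then falls at slope m2 for the remainder
    of the period T; returns the valley current at the next cycle start."""
    t_on = (i_ref - i_k) / m1          # on-time needed to reach the peak
    return i_ref - m2 * (T - t_on)

def is_period1_stable(m1, m2):
    """Textbook small-signal criterion without ramp compensation:
    perturbations scale by -m2/m1 per cycle, so the period-1 orbit is
    stable iff m2/m1 < 1, i.e. duty ratio D < 0.5 (D = m2/(m1 + m2))."""
    return m2 / m1 < 1.0

T = 1.0
m1, m2 = 1.0, 2.0                      # D = 2/3 > 0.5 -> unstable without ramp
i_ref = 1.0
i_star = i_ref - m1 * m2 * T / (m1 + m2)   # period-1 fixed point of the map

i = 0.30                               # valley current perturbed from i_star
trajectory = [i]
for _ in range(4):                     # a few cycles; perturbation doubles each
    i = peak_cmc_map(i, i_ref, m1, m2, T)
    trajectory.append(i)
```

Running this shows the deviation from the fixed point growing each cycle for D > 0.5, which is exactly the case where the brief's result says a suitably tuned software-based (one-cycle-delayed) controller can still maintain stable period-1 operation.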
Published in: IEEE Transactions on Circuits and Systems II: Express Briefs