Abstract
Voltage ripple in single-phase ac–dc converters is usually disregarded when designing the dc-link voltage control system, since large dc-link electrolytic capacitors are typically employed. However, replacing electrolytic capacitors with film capacitors has been widely considered as a means of increasing reliability. Due to cost constraints, such a replacement usually involves low capacitance and may be combined with fast dc-bus voltage controllers. This paper shows that the conventional linear time-invariant (LTI) model may not represent the real converter behavior under reduced capacitance and/or faster controllers, leading to poor system performance or even instability. In such cases, the linear time-periodic (LTP) model is therefore strongly recommended for stability and transient analysis as a tool for controller design. Experimental results confirm that stability margins are obtained precisely from the LTP model but may not be from the LTI model. Finally, this paper also compares the LTI and LTP models with respect to the control gain and the dc-bus capacitance. It is shown graphically that the phase margin predicted by the LTI model diverges from that of the LTP model for low dc-bus capacitances and high control gains.
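As an illustrative sketch of the kind of LTP stability analysis the abstract refers to (not the paper's actual converter model), the standard tool is Floquet theory: a linear time-periodic system dx/dt = A(t)x with period T is asymptotically stable if and only if all eigenvalues of its monodromy matrix (the Floquet multipliers) lie strictly inside the unit circle. The system matrix, damping, and ripple amplitude below are hypothetical placeholders chosen only to demonstrate the computation.

```python
# Hedged sketch: Floquet stability test for a hypothetical LTP system
# dx/dt = A(t) x with period T. This is NOT the paper's converter model;
# a damped Mathieu-type oscillator stands in for the periodically
# rippling dc-link dynamics.
import numpy as np
from scipy.integrate import solve_ivp

T = 2 * np.pi   # period of A(t) (assumed example value)
eps = 0.1       # ripple-like periodic perturbation strength (assumed)

def A(t):
    # Damped oscillator with periodically varying stiffness.
    return np.array([[0.0, 1.0],
                     [-(1.0 + eps * np.cos(t)), -0.2]])

def monodromy(A, T):
    # Integrate the matrix ODE dPhi/dt = A(t) Phi, Phi(0) = I,
    # over one period; Phi(T) is the monodromy matrix.
    def rhs(t, phi):
        return (A(t) @ phi.reshape(2, 2)).ravel()
    sol = solve_ivp(rhs, (0.0, T), np.eye(2).ravel(),
                    rtol=1e-9, atol=1e-12)
    return sol.y[:, -1].reshape(2, 2)

M = monodromy(A, T)
mults = np.linalg.eigvals(M)          # Floquet multipliers
stable = bool(np.all(np.abs(mults) < 1.0))
print("Floquet multiplier magnitudes:", np.abs(mults), "stable:", stable)
```

An LTI small-signal analysis would instead freeze A(t) at its average and check eigenvalue real parts; the Floquet multipliers capture the periodic ripple coupling that such an average discards, which is why the two analyses can disagree for large ripple (low capacitance) or fast controllers.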