Abstract

Compact heat exchangers have attracted increasing attention in recent years, particularly in demanding applications that require high temperatures, high pressures, and/or high power densities. For decades, the heat exchanger (HX) community has held that flow maldistribution is a key factor limiting HX effectiveness, i.e., that reducing the degree of flow maldistribution (MALD) helps increase effectiveness. Significant effort has therefore been devoted to optimizing header geometry to minimize flow maldistribution. This work was initially motivated by the same premise, with the original goal of identifying the HX header design with the lowest maldistribution. However, by systematically constructing a comprehensive maldistribution matrix, the analysis revealed that HX effectiveness is not actually determined by the MALD but is instead dominated by the degree of maldistribution mismatch (MISM). This conclusion was also generalized theoretically, showing that matching the local heat capacity rates of the two streams is the key to achieving maximum performance: the MISM tracks this information locally, whereas the MALD provides only a global approximation of the maldistribution itself. From this new perspective, flow maldistribution need not be avoided but should instead be matched between the two fluid streams to improve HX performance. We demonstrated that by carefully designing the header geometry to match the velocity profiles of the two fluids in a 2 MW printed circuit heat exchanger (PCHE) with molten salt and supercritical carbon dioxide (sCO2) as the heat transfer fluids, the HX could achieve higher effectiveness even when the maldistribution increased. A technoeconomic study using a concentrating solar power (CSP) system as an example showed that this new HX design paradigm could reduce CSP capital costs by as much as 16.6%.
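The central claim, that effectiveness is governed by local heat-capacity-rate matching rather than by the degree of maldistribution itself, can be illustrated with a toy channel-discretized ε-NTU model. This is a hedged sketch, not the paper's actual PCHE model: the channel count, per-channel UA, specific heats, and flow profiles below are arbitrary assumptions chosen only to contrast matched and mismatched maldistribution at equal per-stream MALD.

```python
import math

def counterflow_eff(ntu, cr):
    """epsilon-NTU relation for a counterflow channel (cr = Cmin/Cmax)."""
    if abs(cr - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)
    e = math.exp(-ntu * (1.0 - cr))
    return (1.0 - e) / (1.0 - cr * e)

def hx_effectiveness(m_hot, m_cold, ua_per_channel=2.0, cp=1.0):
    """Overall effectiveness of a bank of parallel counterflow channels.

    m_hot[i], m_cold[i]: per-channel mass flows; each channel pair is
    treated as an independent counterflow HX with the same (assumed) UA.
    """
    q_total = 0.0
    for mh, mc in zip(m_hot, m_cold):
        ch, cc = cp * mh, cp * mc              # local heat capacity rates
        cmin, cmax = min(ch, cc), max(ch, cc)
        eps = counterflow_eff(ua_per_channel / cmin, cmin / cmax)
        q_total += eps * cmin                  # channel duty per unit inlet dT
    cmin_total = min(cp * sum(m_hot), cp * sum(m_cold))
    return q_total / cmin_total

uniform    = hx_effectiveness([1.0, 1.0], [1.0, 1.0])
matched    = hx_effectiveness([1.5, 0.5], [1.5, 0.5])  # same profile on both sides
mismatched = hx_effectiveness([1.5, 0.5], [0.5, 1.5])  # same MALD, opposite profiles
```

With identical per-stream maldistribution magnitude, the matched case keeps Cr = 1 in every channel and loses little effectiveness relative to uniform flow, while the mismatched case degrades it substantially, consistent with MISM rather than MALD driving performance.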
