The receiver in a concentrating solar thermal system is a costly component, and its contribution to system-level techno-economics must be carefully evaluated. This study examined the strategy of adding ‘blades’ to conventional tubular receivers, in the form of additional tube banks perpendicular to the primary aperture surface, to improve light trapping and reduce receiver size. An integrated optical–thermal model incorporating geometrical configuration, flow-path design and aiming strategy was used to assess optical and thermal performance, followed by an economic analysis of the receiver's dynamic annual performance. A 65 MW_th polar heliostat field and a molten nitrate salt working fluid were assumed, and design iterations were conducted to maximise overall optical–thermal receiver efficiency. The final design achieved a 30% reduction in optical and thermal losses, equivalent to a 2.5% increase in efficiency. Although the final design can sustain a higher peak flux, its size could not be reduced without incurring excessive spillage loss, and the added blades did not increase the intercepted radiation. As a result, in the techno-economic evaluation, the approximately doubled tube mass increased the receiver cost by 16%–25%, and savings of only 1–3 USD/MWh_e were achieved in the levelised cost of electricity (LCOE). This study provides the first LCOE evaluation of bladed receivers, confirming that bladed receivers in their current form are not promising. However, it offers insight into the benefits of the related ‘STAR’ concept, in which vertical blades added to a cylindrical receiver increase the receiver aperture while decreasing the tube cost per unit aperture area.
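
For context, a minimal sketch of the standard fixed-charge-rate LCOE definition is given below; the study's exact formulation (discount rate, lifetime, O&M treatment) is not stated here, so the symbols are illustrative assumptions rather than the authors' model.

\begin{equation}
  \mathrm{LCOE} = \frac{\mathrm{FCR} \cdot C_{\mathrm{capital}} + C_{\mathrm{O\&M}}}{E_{\mathrm{annual}}},
  \qquad
  \mathrm{FCR} = \frac{r(1+r)^{n}}{(1+r)^{n}-1}
\end{equation}

Here $r$ is an assumed discount rate, $n$ the plant lifetime in years, $C_{\mathrm{capital}}$ the total installed cost (of which the receiver is one part), $C_{\mathrm{O\&M}}$ the annual operating cost, and $E_{\mathrm{annual}}$ the net annual electricity yield. Under this kind of formulation, a 16%–25% increase in receiver cost changes $C_{\mathrm{capital}}$ only modestly at the system level, which is consistent with the blades' efficiency gain translating into just 1–3 USD/MWh_e of LCOE savings.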