As a result of the rapid development of instrumentation and software capabilities, the optimization and control of complex energy systems can now take advantage of highly sophisticated engineering techniques, such as CFD calculations and correlation algorithms based on artificial intelligence concepts. However, even the most advanced numerical predictions still rely on strong simplifications of the exact transport equations. Likewise, the output of a neural network, or of any other refined data-processing device, is ultimately based on a long record of observed past responses. Therefore, the implementation of modern diagnosis tools generally requires a large amount of experimental data in order to achieve an adequate validation of the method. A paradox thus arises, since the validation data cannot be less accurate or complete than the predictions sought. Several alternatives exist to remedy this situation. In contrast to laboratory work or well-instrumented pilot plants, the information obtained in the full-scale installation offers the advantages of realism and low cost. This paper presents a case study of a large, pulverized-coal-fired utility boiler, discussing both the evaluation of customary measurements and the adoption of supplementary instruments. The general outcome is that knowledge of combustion and heat transfer performance can be significantly improved at a reasonable cost. Based on this experience and its results, a general methodology is outlined for this kind of analysis.