This paper describes a low-cost detection system for laser-induced breakdown spectroscopy (LIBS) based on a simple spectrograph that employs a conventional diffraction grating and a non-intensified, non-gated, non-cooled 1024-pixel complementary metal-oxide-semiconductor (CMOS) linear sensor array covering the spectral range from about 250 to 390 nm. The system was used with a 1064 nm, 5 ns pulse duration Nd:YAG laser source to analyze steel samples, integrating 300 analysis pulses (35 mJ each). Compared with the integration of only 50 analysis pulses, this led to signal-to-noise ratio gains of approximately 3 and 16 for Mn and Fe peaks, respectively, and to emission intensity gains of about 5.3. The acquired emission spectra were used for Mn determination, in the range from 0.214 to 0.608% m/m as previously determined by ICP OES, evaluating univariate (at discrete wavelengths) and multivariate (over different spectral ranges) calibration strategies. The best results, obtained with a PLS calibration model in the spectral range from 292.9 to 294.5 nm (which contains Mn emission peaks), gave relative errors of prediction of the Mn concentration, for samples not employed in the calibration, from 0.3 to 7.3%; these are similar to or better than those reported for Mn determination in steel using higher-cost detection systems. This successful analytical application demonstrates that low-cost detection systems can perform very well in specific applications, and that their limited resolution and sensitivity can be at least partially compensated by the use of chemometrics and the integration of analysis pulses.
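
As an illustration of the two data-treatment steps summarized above (accumulating analysis pulses to raise the signal-to-noise ratio, then fitting a PLS calibration on a narrow Mn spectral window), the sketch below uses scikit-learn's PLSRegression on synthetic data. It is not the authors' code: the shot-resolved spectra, the injected Mn peak, and the number of PLS components are all illustrative assumptions; only the 1024-pixel coverage (~250-390 nm), the 292.9-294.5 nm window, and the 0.214-0.608% m/m Mn range come from the abstract.

```python
# Minimal sketch of pulse integration + windowed PLS calibration for LIBS.
# Synthetic data only; shapes and parameters are assumptions, not the paper's.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
wavelengths = np.linspace(250.0, 390.0, 1024)   # nm; assumed linear dispersion

def integrate_pulses(single_shot_spectra):
    """Sum shot-resolved spectra (n_shots x 1024) into one integrated spectrum.

    Summing N shots grows coherent signal ~N while random noise grows ~sqrt(N),
    so the expected SNR improvement is ~sqrt(N).
    """
    return single_shot_spectra.sum(axis=0)

# --- Synthetic stand-in data: 10 samples x 300 shots x 1024 pixels ----------
shots = rng.normal(loc=1.0, scale=0.2, size=(10, 300, 1024))
X = np.array([integrate_pulses(s) for s in shots])          # (10, 1024)

# Reference Mn contents (% m/m) drawn from the range reported in the abstract,
# plus a synthetic "Mn peak" near 293.9 nm whose height scales with content.
y = rng.uniform(0.214, 0.608, size=10)
peak = np.exp(-0.5 * ((wavelengths - 293.9) / 0.2) ** 2)
X = X + 50.0 * y[:, None] * peak[None, :]

# --- PLS calibration restricted to the 292.9-294.5 nm Mn window -------------
window = (wavelengths >= 292.9) & (wavelengths <= 294.5)
pls = PLSRegression(n_components=3)                          # component count is a guess
pls.fit(X[:, window], y)

y_pred = pls.predict(X[:, window]).ravel()
rel_err = 100.0 * np.abs(y_pred - y) / y                     # relative error of prediction, %
print(f"mean relative error on synthetic data: {rel_err.mean():.1f}%")
```

In practice the model would be fit on calibration samples and evaluated on held-out samples (as the paper does for its 0.3-7.3% prediction errors); the in-sample error printed here is only a sanity check of the synthetic pipeline.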