In this letter, we study the performance of a noisy Gallager B decoder used to decode irregular low-density parity-check (LDPC) codes. We derive the final bit error rate (BER) as a function of both the transmission noise and the processing errors within the decoder. We allow decoder components associated with different computational units (i.e., bit and check nodes of varying degrees) to have different processing error rates. We formulate an optimization problem that distributes the available processing resources across the components of the noisy decoder so as to minimize the BER. Simulations demonstrate that the optimal resource allocation derived from our analysis outperforms uninformed (random) resource assignment.
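For concreteness, the following is a minimal sketch of the kind of density-evolution recursion such an analysis involves, for hard-decision Gallager B decoding over a binary symmetric channel; all notation here is assumed for illustration rather than taken from the letter: $p_0$ is the channel crossover probability, $p^{(\ell)}$ and $q^{(\ell)}$ are the bit-to-check and check-to-bit message error probabilities at iteration $\ell$, $\lambda_i$ and $\rho_j$ are the edge-perspective degree distributions, $b_i$ is the degree-$i$ majority threshold, and $\epsilon_{v,i}$, $\epsilon_{c,j}$ are the per-message processing error rates of degree-$i$ bit nodes and degree-$j$ check nodes.
\[
q^{(\ell)} = \sum_{j} \rho_j \left[ \bigl(1 - 2\epsilon_{c,j}\bigr)\,\frac{1 - \bigl(1 - 2p^{(\ell)}\bigr)^{j-1}}{2} + \epsilon_{c,j} \right],
\qquad
p^{(\ell+1)} = \sum_{i} \lambda_i \Bigl[ \bigl(1 - 2\epsilon_{v,i}\bigr)\, f_i\bigl(q^{(\ell)}\bigr) + \epsilon_{v,i} \Bigr],
\]
where
\[
f_i(q) = (1 - p_0) \sum_{t=b_i}^{i-1} \binom{i-1}{t} q^{t} (1-q)^{\,i-1-t} + p_0 \sum_{t=0}^{b_i-1} \binom{i-1}{t} (1-q)^{t} q^{\,i-1-t}
\]
is the error probability of a message leaving a degree-$i$ bit node before its own processing noise is applied. In such a formulation, the final BER is read off once the recursion converges (or after a fixed iteration budget), and the resource-allocation problem amounts to choosing the $\epsilon_{v,i}$ and $\epsilon_{c,j}$ subject to a total resource constraint so as to minimize that BER.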