The aim of this research is to investigate source coding: the representation of an information source's output using a finite rate of R bits/symbol. The performance of optimum quantisers subject to an entropy constraint has been studied. The definitive result in this area is Shannon's source coding theorem: a source with entropy H can be encoded with arbitrarily small error probability at any rate R (bits/source output) as long as R > H. Conversely, if R < H, the error probability is bounded away from zero, regardless of the complexity of the encoder and decoder employed. In this context, the main objective for engineers is to design the optimum code. Unfortunately, the rate-distortion theorem does not provide a recipe for such a design. The theorem does, however, provide the theoretical limit, so that we know how close we are to the optimum, and a full understanding of the theorem helps set the direction for achieving it. In this research, we have investigated the performance of two practical scalar quantisers, the Lloyd-Max quantiser and a uniformly defined one, together with a well-known entropy coding scheme, Huffman coding, against the theoretically attainable optimum performance given by Shannon's limit. It has been shown that our uniformly defined quantiser can demonstrate superior performance, and the improvements are more noticeable at higher bit rates.
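To make the comparison concrete, the following is a minimal, self-contained sketch, not the experimental setup of this research, contrasting a uniform quantiser with a Lloyd-Max quantiser fitted by Lloyd's algorithm on Gaussian samples. The sample count, the number of levels, and the support interval [-4, 4] are illustrative assumptions; the sketch reports mean-squared distortion for each quantiser and the empirical entropy of the quantiser output, which lower-bounds the rate any lossless entropy coder (such as Huffman coding) can achieve on that output.

```python
import math
import random

def uniform_quantise(samples, levels, lo, hi):
    """Map each sample to the centre of one of `levels` equal-width cells on [lo, hi]."""
    step = (hi - lo) / levels
    out = []
    for x in samples:
        i = min(levels - 1, max(0, int((x - lo) / step)))
        out.append(lo + (i + 0.5) * step)
    return out

def lloyd_max_quantise(samples, levels, iters=50):
    """Lloyd's algorithm (one-dimensional k-means): alternate nearest-neighbour
    partitioning with centroid (conditional-mean) codebook updates."""
    lo, hi = min(samples), max(samples)
    centres = [lo + (i + 0.5) * (hi - lo) / levels for i in range(levels)]
    for _ in range(iters):
        cells = [[] for _ in range(levels)]
        for x in samples:
            j = min(range(levels), key=lambda k: (x - centres[k]) ** 2)
            cells[j].append(x)
        # Keep the old centre if a cell happens to be empty.
        centres = [sum(cell) / len(cell) if cell else centres[j]
                   for j, cell in enumerate(cells)]
    return [min(centres, key=lambda c: (x - c) ** 2) for x in samples]

def mse(samples, recon):
    """Mean-squared distortion between source samples and reconstruction."""
    return sum((x - y) ** 2 for x, y in zip(samples, recon)) / len(samples)

def entropy_bits(recon):
    """Empirical entropy (bits/symbol) of the quantiser output alphabet."""
    counts = {}
    for y in recon:
        counts[y] = counts.get(y, 0) + 1
    n = len(recon)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(5000)]
levels = 8
u = uniform_quantise(samples, levels, -4.0, 4.0)
lm = lloyd_max_quantise(samples, levels)
```

For a fixed number of levels, Lloyd-Max minimises mean-squared distortion, so it should beat the uniform quantiser on `mse`; under an entropy constraint, however, the comparison is made at equal output entropy rather than equal level count, which is where a uniform quantiser becomes competitive at higher rates.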