Information granules are generic building blocks supporting the processing realized in granular computing and facilitating communication with the environment. In this paper, we are concerned with a fundamental problem of encoding–decoding of information granules. The essence of the problem is outlined as follows: given a finite collection of granular data X1, X2, …, XN (sets, fuzzy sets, etc.), construct an optimal codebook composed of information granules A1, A2, …, Ac, where typically c ≪ N, so that any Xk represented in terms of the Ai's and then decoded (reconstructed) with the help of this codebook leads to the lowest decoding error. A fundamental result is established, which states that in the proposed encoders and decoders, when an encoding–decoding error is present, the information granule produced by decoding is of a higher type than the original information granules (say, if Xk is an information granule of type-1, then its decoded version becomes an information granule of type-2). It is worth noting that since the encoding–decoding process is not lossless (in general, with the exception of a few special cases), the lossy nature of the method is manifested by the emergence of information granules of a higher type than that of the original data being processed. For instance, when encoding–decoding numeric data (viz., information granules of type-0), the losses that occur are quantified in terms of intervals, fuzzy sets, probabilities, rough sets, etc., so that the result becomes an information granule of type-1. Given the nature of the constructed result when Xk is an interval or a fuzzy set, the optimized performance index engages a distance between the bounds of the interval-valued membership function.
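The abstract does not reproduce the encoder and decoder formulas. As a minimal sketch, assuming possibility/necessity measures over a discretized domain, triangular membership functions for the codebook, and sup–min reconstruction (all of which are illustrative assumptions, not necessarily the paper's exact construction), the type elevation can be made concrete: a type-1 fuzzy set comes back as a pair of lower/upper membership bounds, i.e., an interval-valued (type-2) granule.

```python
import numpy as np

def tri(x, a, b, c):
    # triangular membership function evaluated on a discretized domain x
    return np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b)))

def possibility(x_mf, a_mf):
    # Poss(X, A) = sup_u min(X(u), A(u)): degree of overlap of X and A
    return float(np.max(np.minimum(x_mf, a_mf)))

def necessity(x_mf, a_mf):
    # Nec(X, A) = inf_u max(1 - X(u), A(u)): degree of inclusion of X in A
    return float(np.min(np.maximum(1.0 - x_mf, a_mf)))

def encode(x_mf, codebook):
    # represent X by its (necessity, possibility) degrees w.r.t. each Ai
    return [(necessity(x_mf, a), possibility(x_mf, a)) for a in codebook]

def decode(codes, codebook):
    # sup-min reconstruction: necessity degrees yield a lower membership
    # bound and possibility degrees an upper one, so the decoded granule is
    # interval-valued (type-2), i.e., the type elevation noted in the text
    lower = np.max([np.minimum(n, a) for (n, _), a in zip(codes, codebook)], axis=0)
    upper = np.max([np.minimum(p, a) for (_, p), a in zip(codes, codebook)], axis=0)
    return lower, upper
```

For a normal fuzzy set X, each necessity degree is bounded by the corresponding possibility degree, so the reconstructed lower bound never exceeds the upper bound; the gap between the two bounds is exactly the kind of quantity a distance-based performance index can penalize.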
We develop encoding and decoding mechanisms by engaging the theory of possibility and fuzzy relational calculus, and we show that the decoded information granule is either a granular interval or an interval-valued fuzzy set. The optimization of the codebook is realized with the aid of particle swarm optimization (PSO). A series of experiments is reported with the intent to illustrate the details of the encoding–decoding mechanisms and to show that the PSO algorithm can efficiently optimize the granular codebook.
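To make the optimization step tangible, here is a minimal sketch of how PSO could tune codebook prototypes for numeric (type-0) data. The triangular granules, the center-of-gravity decoding, the synthetic data, and all PSO constants (inertia 0.7, both acceleration coefficients 1.5) are illustrative assumptions rather than the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def reconstruct(z, centers, width=3.0):
    # encode a number z by its membership in triangular granules centered at
    # `centers`, then decode via the center of gravity of activated granules
    u = np.maximum(0.0, 1.0 - np.abs(z - centers) / width)
    if u.sum() == 0.0:
        return float(centers.mean())  # z not covered by any granule
    return float(u @ centers / u.sum())

def codebook_error(centers, data):
    # squared reconstruction error accumulated over the whole data set
    return float(sum((z - reconstruct(z, centers)) ** 2 for z in data))

def pso(objective, dim, n_particles=20, iters=100, lo=0.0, hi=10.0):
    # bare-bones global-best PSO over the prototype positions
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
    g = pbest[np.argmin(pbest_val)].copy()
    g_val = float(pbest_val.min())
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        if vals.min() < g_val:
            g_val = float(vals.min())
            g = pos[np.argmin(vals)].copy()
    return g, g_val

# synthetic type-0 data grouped around three regions of [0, 10]
data = np.concatenate([2.0 + 0.3 * rng.standard_normal(30),
                       5.0 + 0.3 * rng.standard_normal(30),
                       8.0 + 0.3 * rng.standard_normal(30)])
best_centers, best_err = pso(lambda c: codebook_error(c, data), dim=3)
```

The residual `best_err` is not zero: numeric inputs come back only approximately, and in the paper's framing this irreducible loss is precisely what gets quantified by a type-1 granule (interval, fuzzy set, etc.) around each reconstruction.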