Underdetermined data arise in many areas of computer science. It is of interest to determine which relations of information theory can be extended to underdetermined data, and in what form. In [1] (a brief exposition can be found in [2]), the entropy properties of underdetermined data were investigated; it was found that some of them coincide with properties of the Shannon entropy, some are modifications of them, and some are considerably different. In this paper, we analyze the entropy addition rule H(X) + H(Y|X) = H(XY), which plays an important role in information theory. For general underdetermined data, we show that a modification of this rule, called the generalized entropy addition rule, holds. Some special cases in which the usual entropy addition rule holds can be found in [3].

Let M = {0, 1, …, m − 1}, and let every subset T ⊆ M, T ≠ ∅, be assigned a symbol a_T. The alphabet of all symbols a_T is denoted by A, and its subalphabet {a_0, a_1, …, a_{m−1}}, whose symbols correspond to the singletons T, is denoted by A_0. Symbols in A_0 are called basic, while symbols in A are called underdetermined. A specification of a_T is any basic symbol a_i with i ∈ T. Consider underdetermined sources X that independently generate symbols a_T ∈ A with probabilities p(a_T) ≥ 0. If p(a_T) = 0 for all a_T ∉ A_0, the source is called completely determined. The specification operation for a source X is defined by a set of transition probabilities p(a_i | a_T), T ⊆ M, i ∈ T, with Σ_{i ∈ T} p(a_i | a_T) = 1 for every T. Applying it to X gives a completely determined source X′ that generates symbols a_i with probabilities p′(a_i) = Σ_{T ∋ i} p(a_T) p(a_i | a_T). The source X′ is called a specification of X. To distinguish the symbols a_i of X and X′, a p_{iT} ∈
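To make the specification operation concrete, the following Python sketch computes the distribution p′(a_i) = Σ_{T ∋ i} p(a_T) p(a_i | a_T) of a specified source X′ and its Shannon entropy. It is only an illustration of the definitions above; the function names, data layout, and the numerical example are mine and are not taken from the paper.

```python
from math import log2

def shannon_entropy(dist):
    """Shannon entropy (in bits) of a distribution given as {symbol: probability}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def specify(p_T, p_spec):
    """Apply a specification to an underdetermined source X.

    p_T    : {frozenset T: p(a_T)} for nonempty T ⊆ M
    p_spec : {(i, frozenset T): p(a_i | a_T)} with sum over i in T equal to 1
    Returns {i: p'(a_i)}, the distribution of the completely determined source X'.
    """
    p_prime = {}
    for T, pT in p_T.items():
        for i in T:
            # Accumulate p(a_T) * p(a_i | a_T) over all T containing i.
            p_prime[i] = p_prime.get(i, 0.0) + pT * p_spec[(i, T)]
    return p_prime

# Hypothetical example with M = {0, 1, 2}: the source emits the basic symbol a_0
# and the underdetermined symbols a_{0,1} and a_{0,1,2}.
p_T = {
    frozenset({0}): 0.5,
    frozenset({0, 1}): 0.3,
    frozenset({0, 1, 2}): 0.2,
}
# One possible specification: resolve each underdetermined symbol uniformly over T.
p_spec = {(i, T): 1.0 / len(T) for T in p_T for i in T}

p_prime = specify(p_T, p_spec)
print("p'(a_i):", p_prime)                 # distribution of the specification X'
print("H(X'):", shannon_entropy(p_prime))  # Shannon entropy of X'
```

Different choices of the transition probabilities p(a_i | a_T) yield different specifications X′ of the same underdetermined source X, and hence different entropies H(X′); the uniform resolution used above is just one admissible choice.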