Variability in the multivariate Poisson distribution is a topic that has been neglected in the literature. We propose to measure this joint variability by means of the multivariate entropy. We use the fact that any p-variate entropy H(X_1, X_2, …, X_p) can be partitioned into (p + 1) components, namely H(X_1, X_2, …, X_p) = Σ_{i=1}^{p} H(X_i) − T(X_1, X_2, …, X_p). The expression for the mutual information T(X_1, X_2, …, X_p) of the p-variate Poisson distribution was obtained in Guerrero [Am. Stat. Ass. 1993 Proc.]. For a Poisson distribution with parameter θ, no closed-form expression for the entropy H(X_i) is known; we propose an accurate polynomial approximation to H(X_i). Numerical studies are performed to determine the region of the parameter space within which the approximation is accurate. When the parameters of the process are unknown, we estimate the entropy by plugging in the usual maximum likelihood estimator of θ. A large-sample approximation to the distribution function of the estimated entropy H(X) is given. These results are extended to the general p-variate case, yielding two accurate expressions for the entropy of the multivariate Poisson distribution.
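Because H(X_i) has no closed form, in practice it must be approximated numerically. The sketch below illustrates one standard route (direct truncation of the series −Σ_k p(k) log p(k)); it is not the paper's polynomial approximation, only a baseline against which such an approximation could be checked. The function name and tolerance are illustrative choices.

```python
import math

def poisson_entropy(theta, tol=1e-12):
    """Approximate the entropy H(X) of X ~ Poisson(theta) by truncating
    the series H(X) = -sum_k p(k) * log p(k).

    Illustrative sketch only; the paper's closed polynomial
    approximation to H(X_i) is not reproduced here.
    """
    entropy = 0.0
    k = 0
    log_p = -theta  # log p(0) = -theta
    while True:
        p = math.exp(log_p)
        entropy -= p * log_p
        k += 1
        # Recursion: log p(k) = log p(k-1) + log(theta) - log(k)
        log_p += math.log(theta) - math.log(k)
        # Stop once past the mode and the remaining mass is negligible
        if p < tol and k > theta:
            break
    return entropy
```

For independent components the mutual information term T(X_1, …, X_p) vanishes, so the joint entropy reduces to the sum Σ H(X_i) of such univariate values, which is one simple sanity check on the decomposition above.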