A theory of giant quantum attenuation of ultrasound in bismuth is developed. The theory successfully explains the following experimental results in strong magnetic fields (H ≲ 100 kG): (i) when two attenuation peaks, one due to electrons and the other due to holes, coincide as a function of magnetic field, the attenuation is exceptionally large at temperatures around 1 K and decreases rapidly with increasing temperature; (ii) in contrast, an isolated attenuation peak shows only a weak temperature dependence; (iii) the line shape of an isolated hole peak is highly asymmetric. The theory includes both intraband and interband impurity scattering as well as acoustic-phonon scattering, and takes account of Coulomb correlation effects through electron-electron, hole-hole, and electron-hole two-body distribution functions. The attractive electron-hole correlation is found to play a crucial role in producing the large attenuation of result (i). For result (ii), the electron-hole correlation is ineffective because of the large difference in Fermi velocities, and acoustic-phonon scattering proves important. Finally, result (iii) is attributed to the small density of states of the reservoir Landau subbands in the strong-magnetic-field regime. In contrast to previous theories, the present theory requires no phase transition to account for result (i).