Humpback whale song consists of sequences of frequency-modulated sounds whose exact purpose remains unknown. Tracking multiple individuals simultaneously may provide insights into song function and assist in population estimation; unfortunately, during the winter breeding season off Hawaii so many whales produce overlapping songs that identifying the same individual on underwater sensors spaced a few kilometers apart becomes challenging. Here we present a triangulation technique that uses three bottom-mounted DIFAR acoustic vector sensors to track multiple animals simultaneously. A time-frequency representation of the dominant azimuth (“azigram”) of the acoustic energy is computed from estimates of the active intensity (i.e., the conjugate product of pressure and particle velocity). By defining a set of azimuthal sectors, azigrams from each sensor can be subdivided into a series of binary images, with each image associated with energy propagating from a particular azimuthal sector. Spectrogram correlation methods applied to binary images from different sensors yield individual song fragments that can be used to mask the original azigram, yielding the azimuth of the fragment from each sensor, and thus the singer’s position. The technique, which has also been demonstrated on coral reef fish, is illustrated using singer data collected in 2020 off Maui, Hawaii.
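The azigram construction described above can be sketched in a few lines: the pressure and particle-velocity channels are transformed to the time-frequency domain, the active intensity is formed as the real part of the conjugate product of pressure and velocity, and the dominant azimuth in each cell follows from a four-quadrant arctangent of its horizontal components. This is a minimal illustrative sketch, not the authors' implementation; the function name, parameters, and the plane-wave synthetic signal are assumptions for demonstration.

```python
import numpy as np
from scipy.signal import stft

def azigram(p, vx, vy, fs, nperseg=256):
    """Time-frequency dominant-azimuth map ("azigram") from a vector sensor.

    p        : pressure time series
    vx, vy   : east and north particle-velocity time series
    Returns frequency bins, frame times, and azimuth (degrees, 0-360,
    measured clockwise from north) in each time-frequency cell.
    Illustrative sketch only; channel scaling/calibration is omitted.
    """
    f, t, P = stft(p, fs, nperseg=nperseg)
    _, _, Vx = stft(vx, fs, nperseg=nperseg)
    _, _, Vy = stft(vy, fs, nperseg=nperseg)
    # Active intensity: real part of the conjugate product of
    # pressure and particle velocity, per time-frequency cell.
    Ix = np.real(np.conj(P) * Vx)  # east component
    Iy = np.real(np.conj(P) * Vy)  # north component
    az = np.degrees(np.arctan2(Ix, Iy)) % 360.0
    return f, t, az

# Synthetic check: a 100 Hz plane wave arriving from 60 degrees true.
fs = 1000.0
t = np.arange(0.0, 2.0, 1.0 / fs)
s = np.sin(2 * np.pi * 100.0 * t)
true_az = 60.0
p = s
vx = np.sin(np.radians(true_az)) * s  # eastward velocity component
vy = np.cos(np.radians(true_az)) * s  # northward velocity component
f, frames, az = azigram(p, vx, vy, fs)
# The azimuth estimate at the tone's frequency bin should be near 60 degrees.
bin_idx = np.argmin(np.abs(f - 100.0))
est = np.median(az[bin_idx, :])
```

Masking such an azigram with a binary song-fragment image, as the abstract describes, then amounts to taking the azimuth values only in the time-frequency cells the fragment occupies.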