The use of ever higher frequencies in telecommunication systems requires investigating the effects of the troposphere on electromagnetic wave propagation, and particular attention must be paid to the attenuation induced by raindrops along the radio path. For this purpose the direct measurement approach is, in many cases, neither feasible nor convenient, and the use of indirect measuring techniques is now common practice. In this respect, meteorological radars operating at nonattenuating frequencies can be of great help owing to their ability to collect data over wide areas in a short time. The drawback is the incomplete knowledge of the relation between radar reflectivity (Z) and the quantity of direct interest, so that an external source of “calibration” is required. In this paper we show that radiometers are suitable instruments for calibrating single-parameter radars and, specifically, for deriving, on an event basis, the relation that converts Z into specific attenuation (α). To this end, brightness temperatures at 13 GHz collected by a radiometer along the 30.6° slant path toward the Olympus satellite, direct attenuation measurements, and radar reflectivity profiles have been jointly processed. First, the choice of the average effective medium temperature used to estimate attenuation from brightness measurements has been addressed. Second, the best α–Z relation on an event basis has been evaluated through a best-fit procedure that forces the radar-predicted attenuation to agree with the radiometer-“measured” values. The results are very encouraging and confirm that radiometers can be used as an external source of radar calibration.
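The two steps summarized above can be sketched in code. The first function applies the standard radiative-transfer inversion that converts a measured sky brightness temperature into path attenuation, given an assumed average effective medium temperature; the second fits the scale coefficient of an α = aZᵇ relation so that the radar-predicted path attenuation agrees, in a least-squares sense, with the radiometer-derived one. All function names, the fixed exponent b, and the default temperature values are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

def attenuation_from_brightness(t_sky, t_med=275.0, t_cos=2.7):
    """Path attenuation (dB) from sky brightness temperature t_sky (K),
    using an assumed average effective medium temperature t_med (K) and
    the cosmic background t_cos (K)."""
    return 10.0 * np.log10((t_med - t_cos) / (t_med - t_sky))

def fit_alpha_z(z_profiles, dr, a_radiometer, b=0.7):
    """Fit a in alpha = a * Z**b (b fixed here for illustration) so that
    the radar-predicted path attenuation, obtained by integrating alpha
    along each reflectivity profile with range step dr (km), matches the
    radiometer-derived attenuations a_radiometer (dB) in a least-squares
    sense. Returns the fitted (a, b)."""
    # Path-integrated attenuation per unit coefficient a, one per profile.
    pred_unit = np.array([np.sum(z ** b) * dr for z in z_profiles])
    # Closed-form least-squares scale factor.
    a = np.sum(pred_unit * a_radiometer) / np.sum(pred_unit ** 2)
    return a, b
```

A brightness temperature well below t_med yields a small attenuation, while t_sky approaching t_med drives the estimate toward saturation, which is why the choice of t_med matters for heavy-rain events, as the abstract notes.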