The attenuation of longitudinal ultrasonic waves was measured in the frequency range from 25 to 700 MHz in a single crystal of high-purity niobium. The ratio of the attenuation coefficients in the superconducting and normal states, αs/αn, was measured as a function of the magnetic field, H, at various constant temperatures. For magnetic fields near the upper critical field, Hc2, the result was found to consist of two parts: in the immediate vicinity of Hc2 the attenuation coefficient obeyed the relation 1 − (αs/αn) = A(Hc2 − H), whereas at fields somewhat below Hc2 the result was described by 1 − (αs/αn) = C(Hc2 − H)^1/2, the relation of Maki's pure-limit theory. The linear dependence on H has, at present, no plausible explanation. The observed value of C varied rather steeply around the frequency corresponding to ql ≃ 1, where q is the impressed wave number and l is the electron mean free path. The dip in the attenuation just above the lower critical field was also investigated. The depth of the dip was found to depend on frequency as well as temperature, and the frequency dependence could not be accounted for by existing proposals. In addition, the superconducting energy gap at 0 K in zero magnetic field was found to be smaller at higher frequencies.
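The two field dependences reported above, holding in adjacent regimes below the upper critical field, can be written out as follows (A and C are the temperature- and frequency-dependent coefficients named in the text):

```latex
% Immediate vicinity of the upper critical field H_{c2}:
1 - \frac{\alpha_s}{\alpha_n} = A\,(H_{c2} - H)

% Fields somewhat below H_{c2} (Maki's pure-limit theory):
1 - \frac{\alpha_s}{\alpha_n} = C\,(H_{c2} - H)^{1/2}
```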