Technical advances in 3D imaging have contributed to quantifying and understanding biological variability and complexity. However, small, dry‐sensitive objects are not easy to reconstruct using common and easily available techniques such as photogrammetry, surface scanning, or micro‐CT scanning. Here, we use cephalopod beaks as an example, as their size, thickness, transparency, and dry‐sensitive nature make them particularly challenging. We developed a new underwater photogrammetry protocol in order to add these types of biological structures to the panel of photogrammetric possibilities.

We used a camera with a macrophotography mode in a waterproof housing fixed in a tank with clear water. The beak was painted and fixed on a colored rotating support. Three angles of view, two acquisitions, and around 300 pictures per specimen were taken to reconstruct a full 3D model. These models were compared with others obtained with micro‐CT scanning to verify their accuracy.

The models can be obtained quickly and cheaply compared with micro‐CT scanning and have sufficient precision for quantitative interspecific morphological analyses. Our work shows that underwater photogrammetry is a fast, noninvasive, efficient, and accurate way to reconstruct 3D models of dry‐sensitive objects while conserving their shape. While the reconstruction of the shape is accurate, some internal parts cannot be reconstructed with photogrammetry as they are not visible; in contrast, these structures are visible in reconstructions based on micro‐CT scanning. The mean difference between the two methods is very small (10⁻⁵ to 10⁻⁴ mm) and is significantly lower than the differences between meshes of different individuals.

This photogrammetry protocol is portable, easy to use, fast, and reproducible. Micro‐CT scanning, in contrast, is time‐consuming, expensive, and nonportable.
This protocol can be applied to reconstruct the 3D shape of many other dry‐sensitive objects such as shells of shellfish, cartilage, plants, and other chitinous materials.
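The mesh comparison reported above (a mean difference between photogrammetric and micro‐CT reconstructions) can be illustrated as a symmetrised mean nearest‐neighbour distance between two point clouds sampled from the meshes. The following is a minimal Python/NumPy sketch, not the study's actual pipeline: the function name is hypothetical and the point clouds are synthetic stand‐ins for sampled mesh surfaces.

```python
import numpy as np

def mean_nearest_distance(cloud_a, cloud_b):
    """Mean distance from each point of cloud_a to its nearest point in cloud_b."""
    # Brute-force pairwise distances; fine for small clouds,
    # a KD-tree would be preferable for dense mesh samples.
    diffs = cloud_a[:, None, :] - cloud_b[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    return dists.min(axis=1).mean()

# Synthetic stand-ins for two reconstructions of the same specimen:
# the "micro-CT" cloud is the "photogrammetry" cloud plus tiny noise.
rng = np.random.default_rng(0)
photo_cloud = rng.random((200, 3))
ct_cloud = photo_cloud + rng.normal(0.0, 1e-4, size=(200, 3))

# Symmetrised mean difference between the two reconstructions
d = 0.5 * (mean_nearest_distance(photo_cloud, ct_cloud)
           + mean_nearest_distance(ct_cloud, photo_cloud))
```

With noise far smaller than the typical point spacing, `d` stays on the order of the injected perturbation, mirroring how a small method‐to‐method difference can be distinguished from larger individual‐to‐individual differences.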