Primates have evolved sophisticated, visually guided reaching behaviors for interacting with dynamic objects, such as insects, during foraging.1,2,3,4,5 Reaching control under dynamic natural conditions requires active prediction of the target's future position to compensate for visuo-motor processing delays and to enhance online movement adjustments.6,7,8,9,10,11,12 Past reaching research in non-human primates has mainly focused on seated subjects performing repeated ballistic arm movements to either stationary targets or targets that instantaneously change position during the movement.13,14,15,16,17 However, these approaches impose task constraints that limit the natural dynamics of reaching. A recent field study highlights predictive aspects of visually guided reaching during insect prey capture in wild marmoset monkeys.5 To examine the dynamics of similar natural behavior in a laboratory context, we developed an ecologically motivated, unrestrained reach-to-grasp task involving live crickets. We used multiple high-speed video cameras to capture the movements of common marmosets (Callithrix jacchus) and crickets stereoscopically and applied machine vision algorithms for marker-free object and hand tracking. In contrast to estimates from traditional constrained reaching paradigms, we find that reaching for dynamic targets can operate at remarkably short visuo-motor delays of approximately 80 ms, rivaling the speeds typical of the oculomotor system during closed-loop visual pursuit.18 Multivariate linear regression modeling of the kinematic relationship between hand and cricket velocity revealed that predictions of the target's expected future location can compensate for visuo-motor delays during fast reaching. These results suggest a critical role of visual prediction in facilitating online movement adjustments when reaching for dynamic prey.
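To make the analysis idea concrete, the following is a minimal sketch of how a visuo-motor delay can be estimated by regressing hand velocity on time-shifted cricket velocity and scanning the lag that maximizes the fit. It uses synthetic data, an assumed 250 Hz frame rate, and ordinary least squares; it is an illustration of the general lagged-regression approach, not the authors' actual analysis pipeline or parameter choices.

```python
# Sketch: estimate a visuo-motor delay by multivariate linear regression of
# hand velocity on cricket velocity across candidate lags (synthetic data;
# frame rate and lag range are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(0)
fs = 250.0                  # assumed video frame rate (Hz)
n_frames = 2000
true_delay = 20             # frames; 20 / 250 Hz = 80 ms

# Smooth synthetic 3D cricket velocity: white noise low-pass filtered
# with a moving-average kernel.
kernel = np.ones(25) / 25
noise = rng.normal(size=(n_frames + 24, 3))
cricket_vel = np.column_stack(
    [np.convolve(noise[:, d], kernel, mode="valid") for d in range(3)]
)

# Hand velocity follows the cricket with a fixed lag, a linear gain,
# and additive measurement noise.
gain = np.array([[0.8, 0.1, 0.0],
                 [0.0, 0.9, 0.1],
                 [0.1, 0.0, 0.7]])
hand_vel = np.zeros_like(cricket_vel)
hand_vel[true_delay:] = cricket_vel[:-true_delay] @ gain.T
hand_vel += 0.2 * rng.normal(size=hand_vel.shape)

def lagged_r2(hand, cricket, lag):
    """R^2 of a multivariate linear regression predicting hand velocity
    from cricket velocity shifted by `lag` frames (cricket leads hand)."""
    if lag > 0:
        x, y = cricket[:-lag], hand[lag:]
    else:
        x, y = cricket, hand
    X = np.column_stack([x, np.ones(len(x))])        # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((y - y.mean(axis=0)) ** 2)
    return 1.0 - ss_res / ss_tot

lags = np.arange(0, 50)                              # 0 to ~196 ms at 250 Hz
r2 = np.array([lagged_r2(hand_vel, cricket_vel, lag) for lag in lags])
best = lags[np.argmax(r2)]
print(f"best-fitting delay: {best} frames = {1000 * best / fs:.0f} ms")
```

In this toy example the regression fit peaks at the lag used to generate the data, illustrating how a short (~80 ms) visuo-motor delay could be read off from the kinematic relationship between prey and hand velocity.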