Abstract

This paper presents a multivariate dataset of 2866 food flipping movements, performed by 4 chefs and 5 home cooks, with different grilled foods and two utensils (spatula and tweezers). The 3D trajectories of strategic points on the utensils were tracked using optoelectronic motion capture. The pinching force of the tweezers and the bending force and torsion torque of the spatula were also recorded, together with videos and the subjects' gaze. These data were collected using a custom experimental setup that allowed the execution of flipping movements with freshly cooked food while keeping the sensors away from the dangerous cooking area. In addition, the 2D position of the food was computed from the videos. The action of flipping food is gaining the attention of both researchers and manufacturers of foodservice technology. The reported dataset contains valuable measurements (1) to characterize and model flipping movements as performed by humans, (2) to develop bio-inspired methods to control a cooking robot, or (3) to study new algorithms for human action recognition.
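To make the multivariate structure of the dataset concrete, the sketch below shows one plausible in-memory representation of a single flipping movement. The class name, field names, shapes, and identifiers are illustrative assumptions, not the dataset's actual file schema.

```python
# A minimal sketch of one plausible record layout for a single flipping
# movement. All names, shapes, and units below are illustrative assumptions;
# the dataset's actual schema may differ.
from dataclasses import dataclass
from typing import Optional
import numpy as np


@dataclass
class FlippingMovement:
    subject_id: str              # e.g., "chef_01" or "home_cook_03" (hypothetical IDs)
    utensil: str                 # "spatula" or "tweezers"
    outcome: str                 # "success" or "fail", from the reviewed videos
    markers_3d: np.ndarray       # (n_samples, n_points, 3) tracked utensil points
    force: np.ndarray            # (n_samples,) pinching or bending force [N]
    torsion: Optional[np.ndarray]  # (n_samples,) spatula torsion torque [N*m]; None for tweezers
    gaze_2d: np.ndarray          # (n_samples, 2) gaze point in the video frame [px]
    food_2d: np.ndarray          # (n_samples, 2) food position estimated from video [px]


# Example: build a dummy record with 100 samples and 4 tracked points.
movement = FlippingMovement(
    subject_id="chef_01",
    utensil="spatula",
    outcome="success",
    markers_3d=np.zeros((100, 4, 3)),
    force=np.zeros(100),
    torsion=np.zeros(100),
    gaze_2d=np.zeros((100, 2)),
    food_2d=np.zeros((100, 2)),
)
print(movement.utensil, movement.markers_3d.shape)
```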

Highlights

  • Flipping food, typically when cooking, is not straightforward, either for humans[1] or robots[2].

  • Regarding the acquisition of forces and torques and the general acquisition protocol, the methods described below include a detailed and expanded version of the description given in our related work[1].

  • We reviewed all the movements in the videos and, for each subject, we created a file listing the movements performed by that subject with their corresponding “success” or “fail” labels (see the parsing sketch below).

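As a minimal, hedged sketch of how such per-subject label files could be consumed, the snippet below parses a hypothetical CSV with one movement per row. The file name and column names ("movement_id", "label") are assumptions, since the actual file format is not detailed here.

```python
# Hedged sketch: parse a hypothetical per-subject label file.
# The file name and column names are assumptions; the dataset's
# actual format may differ.
import csv


def load_labels(path: str) -> dict:
    """Map each movement identifier to its "success"/"fail" label."""
    labels = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            labels[row["movement_id"]] = row["label"]
    return labels


# Example usage (assuming a file "chef_01_labels.csv" exists):
# labels = load_labels("chef_01_labels.csv")
# success_rate = sum(v == "success" for v in labels.values()) / len(labels)
```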


Background & Summary

Flipping food, typically when cooking, is not straightforward, either for humans[1] or robots[2]. This movement is applied to different foods (e.g., meat, vegetables, bread, eggs)[2,3,4] with heterogeneous characteristics (e.g., greasy, adherent, deformable) that influence the movement's stability, duration and outcome[1,2]. In [17], two recorded actions (both with a spatula) are somewhat related to food flipping: “scrape a piece of dough from a table” and “flip bread on a pan”. The former resembles the first elementary movement that some subjects execute when flipping meat, i.e., scraping (to unstick the meat from the surface). Additional data were derived offline to complement the acquired dataset: 2D trajectories of the food, estimated from the videos using a custom Computer Vision method (a sketch of one possible approach is given below). These data were intended to support the analysis of the kinematic/kinetic data. The scientific community may find our dataset useful to model flipping movements for robotic applications and to study methods of human action recognition[9,10,11].
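The custom Computer Vision method itself is not detailed here, so the following is only a minimal sketch of one plausible approach (HSV color segmentation with OpenCV) for estimating the food's 2D position in each video frame. The function name and color thresholds are assumptions that would need tuning for real footage; this is not the authors' method.

```python
# Minimal sketch of ONE plausible way to estimate the food's 2D position
# from a video, via HSV color segmentation with OpenCV. NOT the authors'
# custom method; thresholds below are placeholder assumptions.
import cv2
import numpy as np


def track_food_2d(video_path, hsv_lo=(5, 60, 60), hsv_hi=(25, 255, 255)):
    """Return a list of (x, y) centroids, or None, for each frame."""
    cap = cv2.VideoCapture(video_path)
    positions = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
        # Keep the largest connected blob, assumed to be the food item.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            c = max(contours, key=cv2.contourArea)
            m = cv2.moments(c)
            if m["m00"] > 0:
                positions.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
                continue
        positions.append(None)  # food not detected in this frame
    cap.release()
    return positions
```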
