FuseBot: mechanical search of rigid and deformable objects via multi-modal perception

Abstract: Mechanical search is a robotic problem in which a robot needs to retrieve a target item that is partially or fully occluded from its camera. State-of-the-art approaches to mechanical search either require an expensive search process to find the target item, or they require the item to be tagged with a radio-frequency identification (RFID) tag, making their approach beneficial only to tagged items in the environment. We present FuseBot, the first robotic system for RF-Visual mechanical search that enables efficient retrieval of both RF-tagged and untagged items in a pile. Rather than requiring all target items in a pile to be RF-tagged, FuseBot leverages the mere existence of an RF-tagged item in the pile to benefit both tagged and untagged items. Our design introduces two key innovations. The first is RF-Visual Mapping, a technique that identifies and locates RF-tagged items in a pile and uses this information to construct an RF-Visual occupancy distribution map. The second is RF-Visual Extraction, a policy formulated as an optimization problem that minimizes the number of actions required to extract the target object by accounting for the probabilistic occupancy distribution, the expected grasp quality, and the expected information gain from future actions. We built a real-time, end-to-end prototype of our system on a UR5e robotic arm with in-hand vision and RF perception modules. We conducted over 200 real-world experimental trials to evaluate FuseBot and compare its performance to a state-of-the-art vision-based system named X-Ray (Danielczuk et al., in: 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2020). Our experimental results demonstrate that FuseBot improves efficiency over X-Ray by more than 40% in terms of the number of actions required for successful mechanical search. Furthermore, in comparison to X-Ray's success rate of 84%, FuseBot achieves a success rate of 95% in retrieving untagged items, demonstrating for the first time that the benefits of RF perception extend beyond tagged objects in the mechanical search problem.

Open Access
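For illustration, the action-selection step of a policy like RF-Visual Extraction can be sketched as a greedy scoring loop over candidate grasps that trades off occupancy probability, grasp quality, and information gain. The function names, weights, and the `occupancy_map` interface below are assumptions for exposition, not the paper's actual formulation.

```python
import numpy as np

def select_extraction_action(candidate_grasps, occupancy_map,
                             w_occ=1.0, w_grasp=0.5, w_info=0.25):
    """Pick the next grasp by combining (illustrative sketch only):
    - probability the target lies in the region the grasp would clear,
    - expected grasp quality,
    - expected information gain about the target's location."""
    best_action, best_score = None, -np.inf
    for grasp in candidate_grasps:
        # Probability mass the (hypothetical) RF-Visual occupancy map assigns
        # to the target being under this grasp's footprint.
        p_target = occupancy_map.probability_in_region(grasp.footprint)
        # Expected grasp quality, e.g. from a grasp-quality predictor (assumed interface).
        q_grasp = grasp.expected_quality
        # Expected reduction in uncertainty about the target after this action.
        info_gain = occupancy_map.expected_information_gain(grasp.footprint)
        score = w_occ * p_target + w_grasp * q_grasp + w_info * info_gain
        if score > best_score:
            best_action, best_score = grasp, score
    return best_action
```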
Multi-directional Interaction Force Control with an Aerial Manipulator Under External Disturbances

Abstract: To improve the accuracy and robustness of interactive aerial robots, knowledge of the forces acting on the platform is of utmost importance. The robot should distinguish interaction forces from external disturbances in order to comply with the former and reject the latter. This is challenging because disturbances can be of different natures (physical contact, aerodynamics, modeling errors) and can be applied at different points of the robot. This work presents a new extended Kalman filter (EKF)-based estimator for both external disturbances and interaction forces. The estimator fuses information from the system's dynamic model and its state with wrench measurements from a force-torque sensor. This allows for robust interaction control at the tool's tip even in the presence of external disturbance wrenches acting on the platform. We employ the filter estimates in a novel hybrid force/motion controller to perform force tracking not only along the tool direction, but from any platform orientation, without losing the stability of the pose controller. The proposed framework is extensively tested on an omnidirectional aerial manipulator (AM) performing push-and-slide operations and transitioning between different interaction surfaces while subject to external disturbances. The experiments are conducted with the AM equipped with two different tools, a rigid interaction stick and an actuated delta manipulator, showing the generality of the approach. Moreover, the estimation results are compared to a state-of-the-art momentum-based estimator, clearly showing the superiority of the EKF approach.

Open Access
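As a rough illustration of the kind of filter the abstract describes, the snippet below sketches a Kalman update for a single 3-D external force modeled as a random walk and observed through a force-torque sensor residual. The state layout, noise values, and measurement model are assumptions for exposition; the paper's estimator also handles torques and separates interaction wrenches from disturbance wrenches.

```python
import numpy as np

class ExternalForceEKF:
    """Minimal sketch: estimate a 3-D external force on the platform,
    modeled as a random walk and observed through the residual between the
    force-torque sensor reading and the model-predicted force. Illustrative only."""

    def __init__(self, q=1e-2, r=1e-1):
        self.x = np.zeros(3)          # estimated external force [N]
        self.P = np.eye(3)            # estimate covariance
        self.Q = q * np.eye(3)        # process noise (random-walk strength)
        self.R = r * np.eye(3)        # measurement noise of the F/T sensor

    def predict(self):
        # Random-walk model: the force is assumed constant between updates,
        # so only the covariance grows.
        self.P = self.P + self.Q

    def update(self, force_measured, force_model):
        # Measurement residual: sensed force minus the force predicted by the
        # system's dynamic model (gravity, commanded thrust, etc.).
        z = force_measured - force_model
        H = np.eye(3)                 # the residual observes the force directly
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(3) - K @ H) @ self.P
        return self.x
```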