Abstract

The development of assistive robots is gaining momentum in the robotics and biomedical fields. This paper presents an assistive robotic system for object manipulation to aid people with physical disabilities. The robotic arm design is imported into a simulated environment and tested in a virtual world. This research includes the development of a versatile design and testing platform for robotic applications that accounts for joint torque requirements, workspace restrictions, and control tuning parameters. Live user inputs and camera feeds are used to test the robot's motion in the virtual environment. The environment and user interface are built with the Robot Operating System (ROS). Live brain-computer interface (BCI) commands from a trained user are successfully harvested and used as an input signal to select a goal point from 3D point cloud data and to calculate a goal position for the robot's mobile base that places the goal point within the robot arm's workspace. The resulting platform allows quick design iterations to meet different application criteria and tuning of controllers for the desired motion.
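As a rough illustration of the final step described above (computing a mobile base goal that places a selected point inside the arm's workspace), the sketch below shows one way the planar geometry could be worked out. The function name, the fixed arm-reach value, and the standoff distance are illustrative assumptions and are not taken from the paper, which performs this calculation within its ROS-based pipeline.

```python
import numpy as np

def base_goal_from_point(goal_point, base_pose, arm_reach=0.8, standoff=0.5):
    """Return a planar base goal (x, y, yaw) that places `goal_point`
    within the arm's horizontal reach.

    goal_point: (x, y, z) selected from the 3D point cloud.
    base_pose:  current base pose (x, y, yaw).
    arm_reach, standoff: assumed values for illustration only.
    """
    gx, gy, _gz = goal_point
    bx, by, _yaw = base_pose

    # Planar direction from the current base position toward the goal point.
    direction = np.array([gx - bx, gy - by])
    distance = np.linalg.norm(direction)
    if distance < 1e-6:
        return base_pose  # base already at the goal; keep current pose

    unit = direction / distance

    # Stop `standoff` metres short of the goal so the point lies inside
    # the arm workspace (the standoff must not exceed the arm reach).
    target_xy = np.array([gx, gy]) - unit * min(standoff, arm_reach)

    # Orient the base to face the goal point from its new position.
    yaw_goal = float(np.arctan2(gy - target_xy[1], gx - target_xy[0]))
    return (float(target_xy[0]), float(target_xy[1]), yaw_goal)


# Example: goal point selected from the cloud, base currently at the origin.
print(base_goal_from_point((2.0, 1.0, 0.4), (0.0, 0.0, 0.0)))
```

In practice such a goal pose would be sent to the mobile base's navigation stack, after which the arm plans a reach to the selected point; the snippet only captures the geometric idea.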
