Abstract

This study presents the design and fabrication of a small family robot that can be placed on a table and interacts with people through voice and speech. The robot offers four main capabilities: simple dialogue, photo taking, remote monitoring, and timed reminders. It is equipped with four sound sensors, a microphone, a webcam, a speaker, and a 7-inch LCD screen. Three motors rotate through different angles to produce the robot's body postures. The four sound sensors detect the direction of a sound source, and the motor in the base turns the robot's body to face that direction. The microphone receives the user's speech, and a speech recognition system enables the robot to hold simple dialogues. Depending on the dialogue content, the LCD screen shows a corresponding facial expression, such as happiness, anger, or sadness, and the body adopts a matching posture. The robot can also take photos on the user's command; the photos are saved to a cloud drive over the internet, and users retrieve them by scanning QR codes shown on the LCD screen. The robot's rotation can likewise be controlled remotely to monitor or photograph objects in front of it. Finally, in the timed-reminder mode, the robot reminds users to do something with voice, music, and motions when the set time is reached.
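The turn-toward-the-sound behavior described above can be sketched roughly as follows. This is a hypothetical illustration, not the paper's implementation: the sensor mounting angles, the loudest-sensor heuristic, and the function names are all assumptions.

```python
# Sketch of the sound-localization step: four sound sensors face four
# assumed directions; the base motor turns the body toward the sensor
# reporting the strongest level. All angles and APIs are hypothetical.

SENSOR_ANGLES = [0, 90, 180, 270]  # assumed mounting angles, in degrees

def direction_of_loudest(levels):
    """Return the mounting angle of the sensor with the highest reading."""
    loudest = max(range(len(levels)), key=lambda i: levels[i])
    return SENSOR_ANGLES[loudest]

def shortest_turn(current_deg, target_deg):
    """Signed rotation in (-180, 180] that faces the body toward target."""
    delta = (target_deg - current_deg) % 360
    if delta > 180:
        delta -= 360
    return delta

# Example: body faces 0 degrees, the 270-degree sensor is loudest,
# so the base motor should rotate -90 degrees (the shorter way around).
turn = shortest_turn(0, direction_of_loudest([0.1, 0.2, 0.3, 0.9]))
```

A real system would compare inter-sensor time delays or smoothed energy levels rather than a single instantaneous reading, but the loudest-sensor rule conveys the idea of mapping sensor readings to a base-motor rotation.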
