Abstract

We present a versatile, behavioral, rule-based animation system built around autonomous humanoid actors whose behavior is driven by synthetic sensors for perceiving the virtual environment. We combine the following in a consistent approach: L-systems with a behavioral production-rule system; a particle system; an acoustic environment model, including a speech recognition module; a virtual life network; and a humanoid library. Together, these components create a structured, real-time virtual environment that high-level autonomous humanoids and interactive users can easily share.
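
For concreteness, the following is a minimal sketch, not taken from the paper, of the parallel string rewriting that an L-system-based production-rule system builds on. The function name, grammar symbols, and rules are illustrative assumptions, not the authors' grammar.

    from typing import Dict

    def rewrite(axiom: str, rules: Dict[str, str], iterations: int) -> str:
        """Apply context-free L-system productions to every symbol in parallel."""
        s = axiom
        for _ in range(iterations):
            # Symbols without a production are copied unchanged.
            s = "".join(rules.get(symbol, symbol) for symbol in s)
        return s

    # Hypothetical turtle-graphics grammar: 'F' = grow, '+'/'-' = turn, '['/']' = branch.
    plant_rules = {"F": "F[+F]F[-F]F"}
    print(rewrite("F", plant_rules, 2))

In a behavioral setting of the kind the abstract describes, the rewritten symbols would be interpreted not only as geometry but as actions, with productions conditioned on sensor input.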

