Recent advances in user interfaces and mobile computing make it possible to create new experiences that enhance the way we acquire, interact with, and display information in the world around us using virtual characters. Virtual reality (VR) is a 3D computer-simulated environment that gives the user the experience of being physically present in a real or computer-generated world; augmented reality (AR), on the other hand, is a live direct or indirect view of a physical environment whose elements are augmented (or supplemented) by computer-generated sensory input. Both technologies use interactive devices—harnessing the latest advances in computer vision and glasses or head-mounted displays with embedded mobile hardware—to immerse the user and achieve an enhanced sense of presence. A common issue in all of these systems is interpolation error arising from the different linear- and quaternion-algebraic methods used when (a) tracking the user's position and orientation (translation and rotation) via computer vision; (b) tracking via mobile sensors; (c) tracking gesture input that lets the user interactively edit the augmented scene (translation, rotation, and scale); and (d) blending animations of the virtual characters that augment the mixed-reality scene (translation and rotation). In this work, we propose an efficient method for robust authoring (rotation) of augmented reality scenes using Euclidean geometric algebra (EGA) rotors, and we propose two fast animation-blending methods using GA and conformal geometric algebra (CGA). We also compare the efficiency of different GA code generators—(a) the Gaigen library, (b) libvsr, and (c) Gaalop—on our animation-blending methods, and compare them with alternative animation-blending techniques, (a) quaternions and (b) dual quaternions, so that a future user of GA libraries can choose the one that delivers the best performance.
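For readers unfamiliar with the quaternion baseline the abstract compares against, the standard rotation-blending primitive is spherical linear interpolation (slerp) between unit quaternions. Below is a minimal, self-contained sketch for illustration only; it is not the authors' implementation, and the function and variable names are our own.

```python
import math

def slerp(q0, q1, t):
    """Spherically interpolate between unit quaternions q0 and q1.

    Quaternions are (w, x, y, z) tuples; t in [0, 1].
    """
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:
        # Negate one endpoint so interpolation takes the shorter arc.
        q1 = tuple(-c for c in q1)
        dot = -dot
    dot = min(dot, 1.0)                  # guard acos against rounding
    theta = math.acos(dot)               # angle between the quaternions
    if theta < 1e-6:
        return q0                        # nearly identical: avoid divide-by-zero
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))
```

For example, interpolating halfway between the identity and a 90° rotation about the z-axis yields a 45° rotation about z. GA rotors play an analogous role in the methods proposed here, with the blending expressed in (conformal) geometric algebra instead.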