Abstract

We present an approach for precomputing data-driven models of interactive physically based deformable scenes. The method permits real-time hardware synthesis of nonlinear deformation dynamics, including self-contact and global illumination effects, and supports real-time user interaction. We use data-driven tabulation of the system's deterministic state space dynamics, and model reduction to build efficient low-rank parameterizations of the deformed shapes. To support runtime interaction, we also tabulate impulse response functions for a palette of external excitations. Although our approach simulates particular systems under very particular interaction conditions, it has several advantages. First, parameterizing all possible scene deformations enables us to precompute novel reduced coparameterizations of global scene illumination for low-frequency lighting conditions. Second, because the deformation dynamics are precomputed and parameterized as a whole, collisions are resolved within the scene during precomputation so that runtime self-collision handling is implicit. Optionally, the data-driven models can be synthesized on programmable graphics hardware, leaving only the low-dimensional state space dynamics and appearance data models to be computed by the main CPU.
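The abstract's "low-rank parameterizations of the deformed shapes" amounts to reconstructing a full deformed shape from a small vector of reduced coordinates. A minimal sketch of that evaluation step, with illustrative dimensions and names (`mean_shape`, `basis`, `synthesize` are not from the paper):

```python
import numpy as np

# Illustrative sizes: a mesh with n_verts vertices (3 coords each),
# reduced to r modal coordinates.
n_verts, r = 1000, 8
rng = np.random.default_rng(0)

mean_shape = rng.standard_normal(3 * n_verts)   # mean deformed shape
basis = rng.standard_normal((3 * n_verts, r))   # low-rank basis (columns = modes)

def synthesize(q):
    """Reconstruct a full deformed shape from reduced coordinates q."""
    return mean_shape + basis @ q

x = synthesize(np.zeros(r))  # zero reduced coordinates give the mean shape
```

Because only the r-vector `q` changes per frame, this matrix-vector product is what can be offloaded to programmable graphics hardware, leaving the low-dimensional dynamics on the CPU.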

Highlights

  • Deformation is an integral part of our everyday world, and a key aspect of animating virtual creatures, clothing, fractured materials, surgical biomaterials, and realistic natural environments

  • Appearance variables Y(t) are dependent variables defined in terms of the deformed state; they describe our reduced appearance model but do not affect the deformation dynamics


  • We note that the data reduction process is a black-box step, but that we use least-squares projection (PCA) since it provides an optimal description of small vibrations [Shabana 1990] and can be effective for nonlinear dynamics [Krysl et al. 2001]

Summary

Introduction

Deformation is an integral part of our everyday world, and a key aspect of animating virtual creatures, clothing, fractured materials, surgical biomaterials, and realistic natural environments. Our method sacrifices the quality of continuous control in favor of simple discrete (impulsive) interaction. This allows us to avoid learning and (local) interpolation by using sampled data-driven IRF primitives that can be directly “played back” at runtime; this permits complex dynamics, such as nonsmooth contact and self-collisions, to be reproduced, and avoids the need to generalize motions from possibly incomplete data.

Our contribution: In this paper we introduce a precomputed, data-driven state space modeling approach for generating real-time dynamic deformable models using black-box offline simulators. This avoids the cost of traditional runtime computation of dynamic deformable models when it is not absolutely necessary. The reduced phase space dynamics model supports the precomputation and data reduction of complex radiance transfer global illumination models for real-time deformable scenes. We consider nonlinear, viscoelastic dynamic deformation; all models are attached to a rigid support and reach equilibria in finite time (due to damping and collisions).
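The "played back" IRF primitives described above can be sketched as a simple runtime state machine: each impulse in the interaction palette indexes a precomputed trajectory of reduced coordinates, which is then replayed frame by frame until the model reaches equilibrium. The names (`irf_table`, `PlaybackModel`, the impulse labels) are hypothetical, and real trajectories would come from the offline simulator rather than random data:

```python
import numpy as np

# Palette of impulses -> tabulated reduced-coordinate trajectories q(t),
# recorded offline after applying one impulse each (random stand-ins here).
rng = np.random.default_rng(2)
r, n_frames = 6, 120
irf_table = {
    "poke_left":  rng.standard_normal((n_frames, r)),
    "poke_right": rng.standard_normal((n_frames, r)),
}

class PlaybackModel:
    def __init__(self, rest_state):
        self.frames = None          # currently active IRF trajectory
        self.t = 0                  # playback cursor
        self.rest = rest_state

    def apply_impulse(self, name):
        # Runtime interaction: switch to the tabulated response.
        self.frames, self.t = irf_table[name], 0

    def step(self):
        # Advance one frame; hold the final (equilibrium) state afterwards,
        # mirroring the finite-time equilibria noted above.
        if self.frames is None:
            return self.rest
        q = self.frames[min(self.t, n_frames - 1)]
        self.t += 1
        return q

model = PlaybackModel(np.zeros(r))
model.apply_impulse("poke_left")
q0 = model.step()  # first frame of the tabulated response
```

Because the trajectories were resolved (including self-collisions) during precomputation, runtime playback involves no collision handling at all, only table lookup.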

Data-Driven Deformation Modeling
Data-driven State Spaces
Model Reduction Details
Dynamics Precomputation Process
Impulse Response Functions
Reduced Global Illumination Model
Radiance Transfer for Low-frequency Lighting
Dimensional Model Reduction
Runtime Synthesis
Low-rank Model Evaluation
Results
Summary and Discussion