This paper introduces a technique for combining performance-based animation with a physical model in order to synthesize complex interactions in an animated scene. The approach is to previsualize the interaction of the final, integrated scene online, while the performance is being recorded. To accomplish this goal, we propose a framework that unifies kinematic playback of motion capture with dynamic motion synthesis. The proposed method augments a real-time recording of a human actor with dynamics-based response, modifying the motion data according to the character's conditions in the scene. The system unifies the kinematic and dynamic aspects of the final motion while giving the user control over the outcome, both temporally and spatially across the character's body. Examples of complex interactions interleaved with intelligent response underscore the power of the technique, along with multi-person captures in which remote users interact physically in a shared virtual world.
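The core idea of unifying kinematic playback with dynamic response can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a PD servo drives the simulated character toward the mocap target pose, and a per-joint, time-varying weight `w` (a hypothetical parameter) blends the kinematic and dynamic poses, giving the user the temporal and spatial control described above.

```python
import numpy as np

def pd_torque(q, qdot, q_target, kp=300.0, kd=20.0):
    # PD servo driving a simulated joint toward the mocap target pose;
    # gains kp, kd are illustrative, not values from the paper.
    return kp * (q_target - q) - kd * qdot

def blended_pose(q_kin, q_dyn, w):
    # Per-joint blend weight w in [0, 1]:
    #   w = 1 -> pure kinematic playback of the recorded performance
    #   w = 0 -> pure dynamics-based response
    w = np.asarray(w, dtype=float)
    return w * np.asarray(q_kin, dtype=float) + (1.0 - w) * np.asarray(q_dyn, dtype=float)

# Example: the arm (first joint) stays kinematic while the torso
# (second joint) responds fully dynamically, e.g. during an impact.
pose = blended_pose(q_kin=[1.0, 2.0], q_dyn=[3.0, 4.0], w=[1.0, 0.0])
```

In a full system, `w` would be ramped smoothly per joint around contact events so the character transitions from playback into physical response and back without popping.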
Figures: interacting with a simulated virtual creature; two actors interacting remotely.
Performance capture with physical interaction
Nam Nguyen, Nkenge Wheatland, David Brown, Brian Parise, C. Karen Liu, Victor Zordan
ACM SIGGRAPH/Eurographics Symposium on Computer Animation (SCA), 2010.