Abstract:
By combining motion capture data and dynamic simulation, realistic character animations can be generated that respond interactively to contact forces in the environment. However, previous character animation generation methods spend so much time searching the database for appropriate motion capture sequences that the motion generation process cannot run online, and animators must make many manual adjustments to achieve the final realistic animation. The authors present a parallel algorithm with two processes and employ an artificial neural network to predict and pre-classify the recovery motion database, reducing the size of the search region. The artificial neural network is trained offline on a set of recovery motion capture sequences, and the database is classified according to the characters' recovery motion strategies. The network takes several key DOFs of the character's body segments as input and outputs the subset label of the recovery motion sequence database. In addition, the matching algorithm for searching motion capture sequences is improved. Experiments demonstrate that characters can be controlled and switched naturally between motion capture and dynamic simulation control modes, and that the system generates reactive virtual character animation in real time.
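The following is a minimal sketch of the pre-classification idea described above, not the authors' implementation: a small neural network classifier is trained offline to map a vector of key DOF values to a database subset label, so that the online matching step only searches that subset. The feature dimensions, number of subsets, network size, and synthetic training data are all illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical offline training set: each row is a vector of key DOF values
# (e.g. joint angles of selected body segments) sampled from recovery motion
# capture sequences; labels are the indices of the pre-classified subsets of
# the recovery motion database. Sizes are arbitrary for illustration.
rng = np.random.default_rng(0)
n_sequences, n_key_dofs, n_subsets = 200, 12, 4
X_train = rng.uniform(-np.pi, np.pi, size=(n_sequences, n_key_dofs))
y_train = rng.integers(0, n_subsets, size=n_sequences)

# Offline training of the classifier on the recovery motion capture data.
net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

# Online use: the character's current key DOFs are fed to the network, which
# returns the label of the database subset to search, so the motion-matching
# step scans only that subset instead of the whole database.
current_dofs = rng.uniform(-np.pi, np.pi, size=(1, n_key_dofs))
subset_label = int(net.predict(current_dofs)[0])
print(f"Search recovery motion subset #{subset_label}")
```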