This project focuses on expressing different personality traits in animal characters' animation through procedural modifications. Previous research achieved a similar goal for human motion using animation adjustments based on Laban Movement Analysis. Because animals have different skeletal structures, we will focus on a few animals people are familiar with, such as cats and dogs. The theory developed for human movement can be adapted to animals with various changes. For example, introversion can be expressed in humans with a slumped posture and reduced activity; bending an animal's spine to achieve a similar effect would require rotating a different set of bones. Animals also use their ears and tails expressively, so these body parts should be exploited to convey the desired traits. The modifications will be applied to neutral motions, and we will perform a user study to show the effectiveness of the implemented system. We plan to use Unity, but Unreal or Three.js is also acceptable.
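As a minimal sketch of the procedural idea, the snippet below bends spine bones and droops the ears and tail in proportion to an introversion score. The bone names, the Euler-pitch pose representation, and the scaling constants are illustrative assumptions, not part of the proposal.

```python
# Hypothetical bone sets for a quadruped rig; names are assumptions.
SPINE_BONES = ["spine_01", "spine_02", "spine_03"]
EAR_BONES = ["ear_left", "ear_right"]
TAIL_BONES = ["tail_01", "tail_02"]

def apply_introversion(pose, introversion):
    """Hunch the spine and droop ears/tail in proportion to an
    introversion score in [0, 1]; pose maps bone name -> pitch (degrees)."""
    adjusted = dict(pose)
    for bone in SPINE_BONES:
        adjusted[bone] = pose[bone] - 10.0 * introversion   # bend back downward
    for bone in EAR_BONES + TAIL_BONES:
        adjusted[bone] = pose[bone] - 25.0 * introversion   # droop ears and tail
    return adjusted

neutral = {b: 0.0 for b in SPINE_BONES + EAR_BONES + TAIL_BONES}
shy = apply_introversion(neutral, 0.8)  # spine pitched -8°, ears/tail -20°
```

In an engine such as Unity, the same offsets would be applied per frame on top of the neutral clip's local bone rotations.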
Gestures accompany people's speech in conversations. Existing works can synthesize co-speech gesture animation from input speech but do not consider the mood or personality of the generated motion. While procedural approaches can add personality to generated animations after the fact, incorporating mood or personality directly into the generative model should be more effective. The input can be a speech recording, as in previous research, optionally accompanied by the dialogue text, together with the target mood or personality. We expect to use a data-driven model; however, a procedural approach is also acceptable. For example, certain speech patterns can be mapped to specific gestures, with mood or personality determining the extent of the motion: an angry or extraverted agent may move its hands faster and cover more space. In a procedural gesture system, these traits can control the range, speed, and other characteristics of the hand motion.
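The procedural fallback described above can be sketched as a simple post-process on gesture keyframes: amplitude and playback speed grow with extraversion. The keyframe format and the linear scaling rule are assumptions for illustration.

```python
def stylize_gesture(keyframes, extraversion):
    """Scale a neutral gesture's spatial range and playback speed.
    keyframes: list of (time, (x, y, z) wrist position) pairs;
    extraversion in [0, 1], where 0.5 leaves the gesture unchanged."""
    amp = 0.5 + extraversion     # amplitude multiplier in [0.5, 1.5]
    speed = 0.5 + extraversion   # faster playback for extraverts
    rest = keyframes[0][1]       # treat the first pose as the rest position
    styled = []
    for t, pos in keyframes:
        # Push each keyframe farther from the rest pose and compress time.
        scaled = tuple(rest[i] + amp * (pos[i] - rest[i]) for i in range(3))
        styled.append((t / speed, scaled))
    return styled

neutral = [(0.0, (0.0, 1.0, 0.0)), (0.5, (0.3, 1.4, 0.2)), (1.0, (0.0, 1.0, 0.0))]
energetic = stylize_gesture(neutral, 1.0)  # wider and faster hand motion
```

In a data-driven model, the same traits would instead enter as conditioning inputs rather than as a post-process.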
Inverse Kinematics aims to find plausible body poses given target end effector locations. Among the many techniques for solving Inverse Kinematics, data-driven methods are one possibility: a neural network can learn the relationship between end effector positions and skeleton configurations (bone rotations). Laban Movement Analysis is a way of describing high-level aspects of human motion through low-level measurements. Given a body pose, the corresponding Laban parameters can be computed, and they can also serve as inputs to the neural network. This project aims to select descriptive Laban features and build a data-driven Inverse Kinematics system that receives end effector positions and desired Laban parameter values and outputs a plausible body pose. We will interpolate end effector positions to prepare different animations and run a user study to evaluate realism and how accurately the desired Laban features are represented.
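To make the "low-level measurements" concrete, the sketch below computes two simple Laban-style descriptors from a sequence of joint positions: a Time-Effort-like average joint speed and a Shape-like bounding-box volume. Real systems use richer Effort/Shape/Space measurements; this particular feature choice is an assumption.

```python
import math

def laban_features(frames, dt):
    """frames: list of poses, each a list of (x, y, z) joint positions;
    dt: seconds between frames. Returns (time_effort, shape_volume):
    average joint speed and average axis-aligned bounding-box volume."""
    speeds, volumes = [], []
    for prev, cur in zip(frames, frames[1:]):
        # Mean per-joint displacement between consecutive frames, over dt.
        step = [math.dist(p, q) for p, q in zip(prev, cur)]
        speeds.append(sum(step) / (len(step) * dt))
    for pose in frames:
        xs, ys, zs = zip(*pose)
        volumes.append((max(xs) - min(xs)) * (max(ys) - min(ys)) * (max(zs) - min(zs)))
    return sum(speeds) / len(speeds), sum(volumes) / len(volumes)

# Two-joint toy skeleton over two frames, 0.1 s apart.
feats = laban_features([[(0, 0, 0), (1, 1, 1)], [(0, 0, 1), (1, 1, 2)]], dt=0.1)
# → (10.0, 1.0)
```

Descriptors like these, computed per training pose, would be concatenated with the end effector targets as network inputs.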
We have a dataset of human animations matched with personality scores. We want to expand this dataset using animations in the wild: any public video (MediaPipe and similar systems can extract 3D animations from 2D videos) or animation can serve as new data, but such samples arrive without style, personality, or semantic labels. The first task in this project is to gather a set of animations from various sources. We then want to label the new samples in terms of which semantic action they contain and which personality traits they convey. This step will build on the existing labeled animations and on action recognition networks. Off-the-shelf action recognition models can provide the action labels, but personality labeling requires either a clustering model or a different data-driven approach. We will validate the estimated labels with a user study and publish the dataset for future studies.
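One simple data-driven baseline for the personality-labeling step is to propagate labels from the existing dataset to unlabeled clips by nearest neighbour in a motion-feature space. The feature extraction is assumed to exist; the two-dimensional features and label names below are toy values.

```python
import math

def propagate_labels(labeled, unlabeled):
    """labeled: list of (feature_vector, personality_label) pairs from the
    existing dataset; unlabeled: list of feature vectors for in-the-wild
    clips. Assigns each unlabeled clip the label of its nearest neighbour."""
    def nearest(feat):
        return min(labeled, key=lambda item: math.dist(item[0], feat))[1]
    return [nearest(f) for f in unlabeled]

# Toy example with hand-picked 2D motion features.
labeled = [((0.9, 0.8), "extraverted"), ((0.1, 0.2), "introverted")]
labels = propagate_labels(labeled, [(0.85, 0.7), (0.2, 0.1)])
# → ["extraverted", "introverted"]
```

A clustering model would replace the nearest-neighbour rule with cluster assignments, with each cluster labeled via its overlap with the existing dataset; either way, the user study validates the propagated labels.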