This paper presents a parameterized facial muscle model for creating realistic facial animations. The model supports a more general and flexible approach to facial expression animation because it is not tied to a specific facial topology and is driven by a limited set of parameters. It is grounded in the Facial Action Coding System (FACS), which describes facial expressions in terms of Action Units (AUs), each corresponding to the action of specific muscles. A minimum set of parameters is defined for each AU, so that a wide range of facial expressions can be simulated.
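As a rough illustration of the AU-to-parameter mapping, the sketch below represents a few Action Units as small sets of muscle activations. The AU numbers and their descriptive names follow standard FACS usage, but the muscle names, data layout, and contraction values are assumptions for illustration, not the paper's actual parameter tables.

```python
# Hypothetical sketch: a few FACS Action Units as minimal parameter sets.
# AU numbers/names follow FACS; muscle names and values are illustrative.
from dataclasses import dataclass

@dataclass
class MuscleActivation:
    muscle: str         # name of the modelled muscle (assumed naming)
    contraction: float  # normalized contraction, 0.0 (rest) .. 1.0 (full)

ACTION_UNITS = {
    1:  [MuscleActivation("frontalis_inner", 0.6)],   # AU1: inner brow raiser
    4:  [MuscleActivation("corrugator", 0.7)],        # AU4: brow lowerer
    12: [MuscleActivation("zygomatic_major", 0.8)],   # AU12: lip corner puller
}

def expression(aus):
    """Combine the parameter sets of several AUs into one activation list."""
    return [act for au in aus for act in ACTION_UNITS[au]]

# A toy smile-like combination built from AU1 + AU12.
print(expression([1, 12]))
```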
The paper discusses the structure and function of the facial muscles, including their attachment points and the types of movement they produce. It then develops a muscle model that can be applied to a variety of facial topologies by combining two muscle types: linear/parallel muscles and sphincter muscles. The model's parameters drive the facial actions defined by FACS.
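To make the linear muscle idea concrete, here is a minimal sketch of one way such a muscle could displace mesh vertices: vertices inside the muscle's radius of influence are pulled toward its fixed attachment point, with the pull fading toward the edge of that radius. The cosine falloff, function signature, and constants are assumptions for illustration, not the paper's exact formulation.

```python
# Minimal sketch of a linear muscle deforming mesh vertices (assumed
# formulation). Vertices within `radius` of the fixed attachment point
# are pulled toward it, scaled by contraction and a smooth falloff.
import numpy as np

def apply_linear_muscle(vertices, attachment, contraction, radius):
    """vertices: (N, 3) array; attachment: (3,) point; returns a new (N, 3) array."""
    out = vertices.copy()
    for i, v in enumerate(vertices):
        offset = attachment - v
        dist = np.linalg.norm(offset)
        if 0.0 < dist < radius:
            # 1.0 at the attachment, 0.0 at the edge of the influence zone.
            falloff = np.cos(dist / radius * np.pi / 2.0)
            out[i] = v + contraction * falloff * offset
    return out

# Example: contract a cheek-like muscle by 30% over a small patch of vertices.
patch = np.array([[0.0, 0.0, 0.0], [0.5, 0.2, 0.0], [1.0, 0.4, 0.1]])
print(apply_linear_muscle(patch, attachment=np.array([1.5, 1.0, 0.2]),
                          contraction=0.3, radius=2.0))
```

A sphincter muscle would instead contract vertices toward a central point or ring (as around the mouth or eyes); the same falloff idea applies with a different geometry.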
The paper also addresses the challenges of modeling facial expressions, including the complexity of facial muscle interactions and the need for a flexible model that can accommodate different facial types. It discusses the use of polygonal data structures for modeling facial topologies and the importance of maintaining a regular mesh to avoid polygonal intersections and 'facet popping' when the model is articulated.
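The benefit of a shared-vertex polygonal structure can be sketched briefly: because facets index into one common vertex table, displacing a vertex moves every facet that uses it, which helps keep the deformed surface free of cracks. The class below is an assumed illustration of that idea, not the paper's data structure.

```python
# Minimal sketch of a shared-vertex polygonal mesh (assumed representation).
import numpy as np

class FaceMesh:
    def __init__(self, vertices, facets):
        self.vertices = np.asarray(vertices, dtype=float)  # (N, 3) positions
        self.facets = [tuple(f) for f in facets]           # indices into vertices

    def displace(self, index, offset):
        """Move one vertex; every facet sharing it follows automatically."""
        self.vertices[index] += offset

# Two triangles sharing an edge: moving vertex 1 deforms both without a crack.
mesh = FaceMesh([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]],
                [(0, 1, 2), (1, 3, 2)])
mesh.displace(1, np.array([0.0, 0.0, 0.2]))
print(mesh.vertices)
```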
The model is implemented as a parameter-driven program that generates polygonal or vector descriptions of facial expressions. The program uses the parameters to control the placement and contraction of the facial muscles, and the resulting expressions are tested against the FACS Action Units, with the parameters adjusted until the results appear realistic.
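The parameter-driven structure could look roughly like the sketch below: an expression is a table of muscle parameters, and the program applies each entry to the rest-state vertices and emits a vector description (the displacement of every vertex from rest). The function names, parameter layout, and values here are hypothetical, intended only to show the driver idea.

```python
# Hypothetical sketch of a parameter-driven expression generator; all
# names and values are illustrative, not the paper's implementation.
import numpy as np

def pull_toward(vertices, point, contraction, radius):
    """Simple radial pull toward `point`, fading to zero at `radius`."""
    out = vertices.copy()
    for i, v in enumerate(vertices):
        d = np.linalg.norm(point - v)
        if 0.0 < d < radius:
            out[i] = v + contraction * (1.0 - d / radius) * (point - v)
    return out

def vector_description(rest, muscle_params):
    """Apply each (point, contraction, radius) entry; return per-vertex displacements."""
    current = rest.copy()
    for point, contraction, radius in muscle_params:
        current = pull_toward(current, np.asarray(point, dtype=float), contraction, radius)
    return current - rest

# A toy "smile": two lip-corner muscles pulling outward and upward.
lips = np.array([[-0.5, 0.0, 0.0], [0.0, -0.1, 0.0], [0.5, 0.0, 0.0]])
smile = [((-1.0, 0.5, 0.0), 0.4, 2.0), ((1.0, 0.5, 0.0), 0.4, 2.0)]
print(vector_description(lips, smile))
```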
The paper concludes with a discussion of future developments, including the addition of more realistic features such as creasing and buckling of the flesh, the creation of more accurate models of the cranium and mandible, and the development of a task-level system for controlling facial expressions. The model is shown to be effective in simulating a wide range of facial expressions and can be extended to other non-rigid objects.