
Facial expression synthesis is an important part of visual human-computer interaction. To build a highly realistic, adaptive, and automatic real-time facial expression synthesis system, this paper proposes an MPEG-4-based method for generating three-dimensional facial expression animation. The method has three key steps. First, feature points are marked on the three-dimensional face, and their displacements from the neutral expression to each target expression are measured to obtain the animation parameters of these points. Second, the motion factors of the non-feature points of the face mesh are computed from the feature-point animation parameters using the interpolation algorithm proposed in this paper. Finally, the movements of the feature points are simulated along the time axis to produce the facial expression animation. The synthesized expressions look realistic and natural, which confirms the validity of the proposed method.
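The abstract does not specify the interpolation algorithm, but the second step — spreading feature-point displacements to the rest of the mesh — can be sketched with a generic inverse-distance-weighted scheme. This is an illustrative stand-in, not the paper's actual algorithm; the function name, the weighting exponent, and the toy geometry below are all assumptions.

```python
import numpy as np

def interpolate_displacements(feature_pts, feature_disp, mesh_pts,
                              power=2.0, eps=1e-8):
    """Spread feature-point displacements to every mesh vertex using
    inverse-distance weighting (an illustrative substitute for the
    paper's unspecified interpolation algorithm).

    feature_pts:  (k, 3) neutral positions of the marked feature points
    feature_disp: (k, 3) displacements of those points for one expression
    mesh_pts:     (n, 3) neutral positions of all mesh vertices
    returns:      (n, 3) interpolated displacement for every vertex
    """
    # Pairwise distances from each mesh vertex to each feature point: (n, k)
    d = np.linalg.norm(mesh_pts[:, None, :] - feature_pts[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)           # nearer feature points dominate
    w /= w.sum(axis=1, keepdims=True)      # normalize weights per vertex
    return w @ feature_disp                # weighted blend of displacements

# Toy example: two feature points on a line; only the first one moves up.
feature_pts  = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
feature_disp = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 0.0]])
mesh_pts     = np.array([[0.0, 0.0, 0.0],   # coincides with feature point 0
                         [0.5, 0.0, 0.0],   # midpoint between the two
                         [1.0, 0.0, 0.0]])  # coincides with feature point 1
disp = interpolate_displacements(feature_pts, feature_disp, mesh_pts)
```

A vertex that coincides with a feature point effectively inherits that point's displacement (the `eps` term only keeps the division finite), while the midpoint blends the two displacements equally, yielding half the upward motion. Repeating this per keyframe along the time axis gives the third step of the pipeline.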
