Evaluation of Realistic Facial Expression Synthesis in a 3D Facial Animation System Using Motion Capture Technology

Samuel Gandang Gunanto


Every human face has a unique shape and size, and so does every 3D character face model. Facial animation of 3D virtual characters is still mostly done manually, by moving the rig in every frame, so production costs grow with the number of characters involved. The absence of an affordable facial motion transfer system is also one of the reasons why few studios in Indonesia use motion capture technology.

This research evaluates an implementation of facial expression synthesis using motion capture technology, built on radial basis functions (RBF) as the marker-transfer method that drives rig movement in a point cluster system. Testing is done by comparing the resulting facial expressions against FACS theory and by a questionnaire on the synthesis results.
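The RBF marker-transfer step described above can be sketched as follows. This is a minimal illustration only: the Gaussian kernel, the function names, and the 2D toy data are assumptions for demonstration, not the paper's actual pipeline, which operates on 3D facial markers and a point cluster rig.

```python
import numpy as np

def rbf_transfer(source_pts, displacements, target_pts, eps=1.0):
    """Transfer captured marker displacements to rig points via RBF interpolation.

    Solves Phi @ w = displacements for the weights, where
    Phi[i, j] = exp(-(eps * ||source_i - source_j||)^2) is a Gaussian kernel,
    then evaluates the interpolant at each target (rig) point.
    """
    # Pairwise distances between the tracked source markers
    d = np.linalg.norm(source_pts[:, None] - source_pts[None, :], axis=-1)
    phi = np.exp(-(eps * d) ** 2)
    weights = np.linalg.solve(phi, displacements)  # one weight column per axis

    # Distances from each rig point to every source marker, then kernel sum
    dt = np.linalg.norm(target_pts[:, None] - source_pts[None, :], axis=-1)
    return np.exp(-(eps * dt) ** 2) @ weights

# Tiny example: three tracked markers and one rig control point between them
markers = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
motion = np.array([[0.1, 0.0], [0.0, 0.1], [0.1, 0.1]])   # captured offsets
rig_pt = np.array([[0.3, 0.3]])
rig_motion = rbf_transfer(markers, motion, rig_pt)
```

By construction the interpolant reproduces the captured displacement exactly at each source marker, while rig points in between receive a smooth distance-weighted blend of the surrounding markers' motion.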

The experimental results show that, according to FACS theory, the requirements for expression formation are fulfilled with respect to changes in facial features, but the implementation cannot always depict the desired condition perfectly: on average only 35.53% of the synthesized faces were easily recognizable. Therefore, the animator's role in refining micro expressions, or in adding elements of the exaggeration principle during facial animation production, is very important for producing facial expressions that the audience recognizes easily.


Keywords: evaluation, FACS, facial animation

References:

Chenoweth, M. E. (2012). A Local Radial Basis Function Method for the Numerical Solution of Partial Differential Equations. Theses, Dissertations and Capstones, Paper 243.

Dutreve, L., Meyer, A., & Bouakaz, S. (2008). Feature Points Based Facial Animation Retargeting. Proceedings of the ACM Symposium on Virtual Reality Software and Technology (pp. 197-200). Bordeaux, France.

Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System: a technique for the measurement of facial movement. Palo Alto: Consulting Psychologists Press.

Essa, I., Basu, S., Darrell, T., & Pentland, A. (1996). Modeling, tracking and interactive animation of faces and heads using input from video. Proceedings of Computer Animation (pp. 85-94).

Hubbert, S., Gia, Q. L., & Morton, T. M. (2015). Spherical Radial Basis Functions, Theory and Applications. (N. Bellomo, M. Benzi, P. E. Jorgensen, T. Li, R. Melnik, O. Scherzer, . . . P. Zhang, Eds.) Cham: Springer.

Ju, E., & Lee, J. (2008). Expressive Facial Gestures From Motion Capture Data. Eurographics.

Kwon, J.-Y., & Lee, I.-K. (2012). The Squash-and-Stretch Stylization for Character Motions. IEEE Transactions on Visualization and Computer Graphics Vol 18, 488-500.

Li, B., Zhang, Q., Zhou, D., & Wei, X. (2013). Facial Animation Based on Feature Points. TELKOMNIKA, 11(3), 1697-1706.

Lorenzo, M. S., Edge, J. D., King, S. A., & Maddock, S. (2003). Use and Re-use of Facial Motion Capture Data. Vision, Video, and Graphics, 1-8.

Lucey, P., Cohn, J. F., Kanade, T., Saragih, J., & Ambadar, Z. (2010). The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression. IEEE Computer Society Conference on Computer Vision and Pattern Recognition (pp. 94-101). San Francisco, CA: IEEE.

Parke, F. (1972). Computer Generated Animation of Faces. Proceedings of the ACM Annual Conference (pp. 451-457). ACM.

Pighin, F., & Lewis, J. P. (2006). Facial Motion Retargeting. SIGGRAPH.

Song, J., Choi, B., Seol, Y., & Noh, J. (2011). Characteristic facial retargeting. Comp. Anim. Virtual Worlds, 22, 187-194.

Troy, Pranowo, & Gunanto, S. G. (2016). 2D to 3D Space Transformation for Facial Animation Based on Marker Data. The 6th International Annual Engineering Seminar (InAES). Yogyakarta.

Umenhoffer, T., & Tóth, B. (2012). Facial animation retargeting framework using radial basis functions. Sixth Hungarian Conference on Computer Graphics and Geometry. Budapest.

DOI: https://doi.org/10.24821/rekam.v14i2.1747
