The Effect of Emotional Colour on Creating Realistic Expression of Avatar

Copyright © 2012 by the Association for Computing Machinery, Inc. Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail [email protected]. VRCAI 2012, Singapore, December 2 – 4, 2012. © 2012 ACM 978-1-4503-1825-9/12/0012 $15.00

Mohammed Hazim Alkawaz ǂ
Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, Malaysia

Ahmad Hoirul Basori ǂǂ
1 Department of Graphics and Multimedia, Faculty of Computer Science and Information Systems, Universiti Teknologi Malaysia, Johor, Malaysia
2 Department of Informatics, Faculty of Information Technology, Institut Teknologi Sepuluh Nopember, Surabaya, Indonesia

Abstract

Research on facial animation has advanced rapidly, and facial models have become more realistic in terms of 3D facial data, since laser scanning and advanced 3D tools now support the creation of complex facial models. However, these approaches are still lacking in facial expression driven by emotional state. Facial skin colour is one parameter that contributes to the realism of facial expression, because it is closely related to the emotion arising inside the human. This paper presents a new facial animation technique that changes the colour of facial skin based on linear interpolation.
CR Categories: I.3.3 [Computer Graphics]: Three-Dimensional Graphics and Realism—Display Algorithms; I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism—Animation, Color, Fractals

Keywords: facial animation, emotion colour, facial expression, change skin colour

1. Introduction

Facial animation is the main element used to express emotion and personality. Computer facial animation has applications in many areas; for example, realistic virtual humans with various facial expressions are used in the entertainment and edutainment industries. In communication applications, interactive talking faces can improve the interaction between users and machines, and a friendly interface can also attract users. Fred Parke (1974) was the first to show interest in facial animation, and the first to use the power of 3D computer graphics to deliver a high level of control over the actions of a face. Applications at that time included computer-generated actors and a rich research instrument for those studying non-verbal communication and human facial expression. As the online game industry develops rapidly, users demand higher-quality graphics and significant feats of artificial intelligence. These factors have pushed the gaming industry toward sophisticated artificial intelligence and photorealistic graphics. Initially, game characters consisted of few polygons and were supported by the limited computers of the time. Presently, game characters appear more natural, but there are still challenges in expressing their emotions in a detailed way. Several colour-model studies have proposed facial colour models (Buddharaju et al., 2005; Kalra and Magnenat-Thalmann, 1994; Yamada and Watanabe, 2004) based on pulse, skin temperature and real human blood flow.
These techniques convey the colour of the face with an increasing redness (Kyu-Ho and Tae-Yong, 2008).

2. Related Work

Recent developments in digital technology have enabled high-speed processing of large quantities of information. Consequently, applications for long-distance communication, such as conferencing and video telephony, are being constantly developed. A distinctive element of the communication prowess of these systems is the presentation and estimation of emotions using the colour of the face as well as its expression. In plain terms, if the colour of the face changes, the related emotional changes can be fused with facial expression to vary the basic emotions of visual anthropomorphic agents; this may play a key role in communication between computers and humans (Yamada and Watanabe, 2004). Furthermore, other researchers have also enhanced avatar emotional expression with haptic sensation (Basori et al., 2008; Basori et al., 2010).

ǂEmail: [email protected] ǂǂEmail: [email protected]


The facial colour of humans can be channelled to convey virtual effects, emotional estimation, facial imaging, remote health care and individual identification. Regarded as one of "the most peculiar and the most human of all expressions" (Darwin, 1872), blushing is a common topic of psychological study; it has been shown to be an important facial cue that serves vital functions in interpersonal communication (Ekman and W., 1974). Why people blush is still an open question among psychologists. Most people consider blushing in public an uncontrollable response and feel embarrassed when they blush in front of others. Furthermore, blushing is a symptom that makes matters even worse for people who suffer from social phobia (Pan et al., 2008). Pallor, in contrast, occurs through temporary cerebral anaemia and contraction of the capillaries in the face as cerebral blood flow rises. Pallor may happen because of fear, pain or shock: blood flow in the face decreases and is redirected to the brain for relief and recovery. In the literature we surveyed, no measurement of pallor was found. Psychophysiological analysis of pallor is likely to give useful results, because the areas that redden are also the areas most likely to show pallor and its time pattern. However, our survey shows that pallor has received far less attention in psychophysiological analysis than redness. The central component of interest for a computer model is the identification of the main characteristics that reflect this phenomenon. We consider the changes in blood vessels and skin colour that occur when emotions arise (Kalra and Magnenat-Thalmann, 1994). Former studies focus mainly on the geometric features of these alterations, for instance facial surface animation (e.g. skin stretching, wrinkle structures). Similarly, changes in haemoglobin concentration cause changes in skin colour.
Skin colour change can also be caused by histamine reactions or other skin conditions, for example blushing and rashes. Blushing specifically accompanies a number of emotions such as joy, shame and arousal. Despite their ability to convey emotion, the dynamic changes that occur in skin pigmentation are mostly ignored by present skin appearance models (Ersotelos and Dong, 2008; Jimenez et al., 2010). Reasonable behaviour and realistic appearance are essential for simulating communicative behaviour; postures and simulation also reflect emotional behaviour. Several models of emotion exist: the psycho-evolutionary theory developed by Plutchik (Plutchik and Kellerman, 1980) and the Facial Action Coding System (FACS) introduced by Ekman. However, most of these models are only appropriate for muscular expressions. In graphics, too, colour change is a less explored field, and other physiological features such as crying are not usually undertaken. Blushing effects, as shown in (Kalra and Magnenat-Thalmann, 1994), can occur when there is a very intense sensation (Jung et al., 2009). The most frequently affected parts are the cheeks, and there is a relationship between temperature increase and blushing (Jung, et al., 2009; Shearn et al., 1990). Typical blushing starts at 35 s, with the highest intensity after 15 s; the ears blush along with the face. Pallor is caused by a reduction in blood flow (Kalra and Magnenat-Thalmann, 1994), e.g. in fear, and the parts that blush may also become pale. Sweating and weeping (sometimes accompanied by blushing) can be considered vegetative functions controlled by the autonomic nervous system (ANS) (Jung, et al., 2009). Surprise, fear and happiness are accompanied by very light colours, sadness and disgust usually by colours of medium lightness, and anger is mostly associated with dark colours. Colours associated with fear and sadness are strongly desaturated (close to 0, 0), whereas surprise, happiness and anger are strongly related to chromatic colours (Pos and Green-Armytage, 2007). The skin of the face is rich in vessels; the vast quantity of blood vessels and the high level of enzyme activity in the facial skin may reflect its high metabolic activity. There are many kinds of vascular branching in the face. Kalra and Moretti studied the vascular patterns of the skin in different areas of the face. Their region-based approach highlighted that areas like the nostril and forehead have bilateral divisions at sharply acute angles, called "fronds", whereas areas such as the cheeks and jaws branch from one side at nearly right angles, called "candelabra" (Kalra and Magnenat-Thalmann, 1994; Moretti et al., 1959). Some areas show branching types in between the "frond" and the "candelabra". Numerous areas show several forms of capillary flow to the skin. There are many regional varieties, and any one region varies greatly between individuals, especially with age. These differences may explain why certain areas show more external activity than others; for example, redness is seen more on the cheeks, ears and forehead than on the rest of the body (Kalra and Magnenat-Thalmann, 1994); refer to Table 1.

Table 1. Emotion and Change in Face (Jung, et al., 2009)

EMOTION            | FACIAL APPEARANCE VARIATION
Neutral            | No changes, neutral face colour
Joy                | Cheeks become rosy
Enthusiasm/Ecstasy | Cheeks become rosy, tears of happiness
Surprise           | Cheeks become rosy
Disgust            | Cheeks become pale
Down               | Low watery eyes (lacrimation)
Sadness            | Cheeks become blushing, raised lacrimation
Grief              | Cheeks become blushing, blotchy red and intensive lacrimation
Apprehension       | Cheeks become pale
Fear               | The whole face is pale
Panic              | Face becomes pale, sweat on the forehead, low lacrimation
Annoyance          | Cheeks become blushing
Anger              | Cheeks become blushing, blotchy red in the face
Rage               | Cheeks become blushing, blotchy red in the face, the face is red

Overview of visually distinguishable emotional states caused by vegetative functions, used for the parameterization of emotions that result in different facial complexions (Jung, et al., 2009).


Figure 1. The first three images show anger: the cheeks blush and then red blotches appear. The fourth image shows heavy weeping, also with blushing cheeks. In the last three images, after positions and colours have been changed, droplets appear, and effects like perspiration or nose bleeding can also be simulated (Jung, et al., 2009).

Animated facial models are not just less expensive than human performers; they are also more flexible, offering customization of style and appearance. These advantages give users alternatives for substituting computer-generated models for human actors. Several ways are used to obtain 3D or 2D models for animation: a software modelling tool can generate one manually; one can be captured from a 3D clay model with a digitizing pen attached to a mechanical arm that calculates the location of the pen tip; or one can be taken from an actual human by cameras or other scanning technology. The obtained 3D data is then parameterized into a mathematical representation such as splines, implicit surfaces or polygonal models, after which it can be manipulated in the computer (Cerekovic et al., 2007; Cosi et al., 2005).

3. Methodology

This research consists of three main parts: emotion colour, theory of colour, and emotion colour using linear interpolation, as shown in Figure 2.

Figure 2. Research Methodology

3.1 Emotion Colour

Emotions are complex states of mind that include physiological correlates, cognitive factors and social roles. Emotions give individuals the energy to react with a behaviour. Among the types of emotions there are basic emotions, which cannot be broken down further. Basic emotions are regarded as universal: they occur across different personal experiences and different cultures. Because the basic emotions are universal, they can be identified solely through facial expression. The six basic emotions mentioned earlier are happiness, surprise, disgust, anger, sadness and fear. Even though facial expressions are commonly expressed unintentionally, they can be controlled through practice (Pos and Green-Armytage, 2007). Michael Argyle stated that facial expressions reflect social signals and also show one's emotional state. One's emotional state is shown through facial expression, commonly accompanied by colour changes from increasing or decreasing blood flow, which results in either pallor or blushing. John Berendt (Berendt, 2005) gave an example of such colour changes as a significant feature of the prosecutor Felice Casson (p. 79): "It was his tendency, when he was angered, for his face to turn pink, then red, then scarlet, from the top of his forehead to the neckband of his collarless shirt" (Pos and Green-Armytage, 2007). Since a central role of colour is to inform the viewer about the nature of objects, viewers can be presented with a simulation and their responses analysed. Any changes to the objects, with either positive or negative values, are recorded by the observer. If the changes include changes of facial colour, they can be linked to emotion. A connection between emotion, colour and human biology has been demonstrated. These colour changes can be redness (blushing) or whiteness (pallor), but a wider variety of colours can express emotion. Thus, some correspondence rules between emotions and colours can be defined. Pos and Green-Armytage studied whether emotional facial expressions and their colours are universal across cultures, especially among European and Asian people. The study involved two groups of participants, a younger and an older group, and included the selection of a single colour, along with combinations of three colours, for performing the six basic emotions. The experiments were conducted identically for both groups. The results revealed that emotions can be easily identified through colour changes. The study adopted two techniques for analysing the data, drawn from the psychology and design disciplines. The basic emotions (anger, sadness, happiness, surprise, disgust and fear) are related to colour positions as written in CIELAB, one of the two colour systems adopted by the CIE in 1976 as models offering more uniform colour spacing, temperature features, colour space and distinctions of colour triplets. Moreover, colours can be translated into facial expressions and vice versa. By examining successes and failures in communication, designers can use this as a guideline for using single colours and colour combinations (Pos and Green-Armytage, 2007).

3.2 Theory of Colour

The research literature is drawn from different areas. It includes the history and colour theory of Goethe, the descriptions of colour meanings suggested by Claudia Cortes, and the user tests by Naz Kaya. It also includes opinions from Shirley Willett, who used to design websites, and a process example from Yan Xue.

3.2.1. Johann Wolfgang von Goethe

"Colour Theory", a work by Johann Wolfgang von Goethe, makes many remarkable points related to colour meanings, especially in the "Sinnlich-sittliche Wirkung der Farbe". According to Goethe, colours have a minus/negative part and a plus/positive part. The plus part includes the colours yellow, red-yellow (orange) and yellow-red (vermeil). These colours are linked to arousal, liveliness and ambition. The minus colours are represented by blue, blue-red and red-blue; these colours express restlessness, yielding and yearning. Table 2 shows Goethe's colour meanings (Nijdam, 2006).

Table 2. Colour J.W. von Goethe colour summary(Nijdam, 2006).

Emotion    | Negative trait                                  | Positive trait             | Colour
Calm       |                                                 | Neutral, calm              | Green
Restless   | More restless (same as red-blue, more negative) | More active                | Blue-red
Faith      | Charm/grace                                     | Dignity, seriousness       | Red
Joy        | "Unreinen" green, unpleasant                    | Purity, pleasant           | Yellow
Powerful   | Irritating                                      | Energetic                  | Yellow-red
Happiness  |                                                 | Warmth, energetic, passive | Red-yellow
Discomfort | Restless                                        | Active                     | Red-blue
Sadness    | Void, cold                                      | Comfort                    | Blue

3.2.2. Claudia Cortes

Claudia Cortes suggested characteristics of positive and negative traits through her research. She summed up the associations of colours with particular traits and described the relations between the colours. Even though it is difficult to extract colours according to their meanings, this work influenced the emotion meanings and their grid locations, which are a main element of the model. Table 3 presents a summary of the traits; the table shows only the overlapping traits, not all of them (Nijdam, 2006).

Table 3. Claudia Cortes colour extraction(Nijdam, 2006)

Emotion                    | Negative trait         | Positive trait        | Colour
Melancholic, introspective | Sorrow, arrogant       | Passive, leadership   | Purple
Greed, faith               | Sick, greedy           | Neutral, calm         | Green
Sadness, confident         | Depressed              | Traditional, faithful | Blue
Anger, love                | Embarrassed, offensive | Emotional, active     | Red
Happiness/joy, fear        | Cautious               | Energetic, lively     | Yellow
Determination, joy         | Tiring                 | Ambition              | Orange

This is a small part of her site, including only the information used as a proof of concept. Information about symbolism can also be found on her page.

3.2.3. Naz Kaya

Naz Kaya's research shows the difficulty of making statements about colours and how to interpret them. The test results are reported as frequencies. The emotion and colour combinations with the peak overall scores are shown in Tables 4A and 4B (Nijdam, 2006).

Table 4A. Colour summary by Naz Kaya (Nijdam, 2006)

Emotion                                | Colour with Munsell notation
Powerful, fearful, depressed           | Black (n/1)
Peaceful, lonely, empty/void, innocent | White (n/9)
Depressed, confused, bored, sad        | Gray (n/5)
Sick, confused, annoyed                | Blue-green (5bg 7/8)
No emotion, loved                      | Red-purple (10rp 4/12)
Powerful, calm                         | Purple-blue (7.5pb 5/12)

Table 4B. Colour summary by Naz Kaya (Nijdam, 2006)

Emotion                                | Colour with Munsell notation
Annoyed, disgust                       | Green-yellow (2.5gy 8/10)
Tired                                  | Purple (5p 5/10)
No emotion, excited, energetic         | Yellow-red (5yr 7/12)
Loved, anger                           | Red (5r 5/14)
Peaceful, hopeful, comfortable         | Green (2.5g 5/10)
Happy                                  | Yellow (7.5y 9/10)
Calm                                   | Blue (10b 6/10)

It is useful to note that a few of the colour frequencies are very close to each other. It should also be noted that the Munsell colour tree is an industrial standard colour system. The colour tree characterizes colours in a 3D way, by hue, chroma and value; this can be seen in Figure 3, and it may be the most commonly used colour system.

Figure 3. The Munsell colour space (Nijdam, 2006). The Munsell colour system is based on the Newton colour circle.

3.2.4. Colour Wheel Pro

The commercial program Colour Wheel Pro also contributes definitions of colour meanings. A summary of colours with their associated traits and meanings is presented in Table 5 (Nijdam, 2006).


It should be noted that the summary contains overlapping traits whose exact meanings are doubtful; these overlapping traits commonly depend on context. The table includes the most appropriate words based on the meanings of the colours.

Table 5. Colour Wheel Pro colour summary(Nijdam, 2006).

Emotion                              | Negative trait           | Positive trait      | Colour
Power                                | Death                    | Elegance            | Black
Sadness                              | Frustration              | Romantic, nostalgic | Purple
(Not given)                          | Power                    | Safety, purity      | White
Aggressive/anger, intense, emotional | Offensive                | Courage, passion    | Red
Joy, happiness                       | Jealousy, sickness       | Freshness           | Yellow
Joy, happiness                       | Distrust, domination     | Desire, wisdom      | Orange
Greed                                | Sickness, disorder, envy | Growth, good health | Green
Trust                                | Depression               | Understanding       | Blue

3.2.5. Shirley Willett

The following table presents colours and their characteristics from Shirley Willett's point of view. The colours and their properties act as the standard for the relationship between basic emotions and colours. Willett's model, presented in Figure 4, overlaps with the data found in Claudia Cortes' work but differs from it by reducing the list to the basics.

Figure 4. Colour codification of emotions by Shirley Willett (Nijdam, 2006)

Figure 4 shows several layers: the external layer shows positive traits, the second layer presents six other emotions, the third layer holds the negative traits, and at the centre is the depression spot, which represents the combination of all negative traits (Nijdam, 2006). The same information is also presented in Table 6 (Nijdam, 2006).

Table 6. Summary of colours and the associated traits by Shirley Willett (Nijdam, 2006).

3.2.6. Yan Xue

In his thesis on the Philips iCat, Yan Xue devotes a few words to the utility of colour support. The iCat uses blue, red and green LEDs, placed in its feet and ears. Yan Xue developed a colour model using Russell's circumplex model as a basis, as shown in Figure 5 (Nijdam, 2006).

Figure 5. Yan Xue colour distribution (Nijdam, 2006)

3.2.7. Comparing the Different Models

The information collected in the literature review is distilled so that it can serve as a proof of concept. The sheer number of colour meanings makes them difficult to handle by default. The research by Naz Kaya is the most important, because it delivers the meanings that colours can imply. Claudia Cortes' work supports it, since it constrains almost all of the results to the basic colours. Goethe's theory, however, is difficult to apply, because the truthfulness of its information is hard to verify. Shirley Willett's model, by contrast, is a solid model that answers what a colour can be. Finally, the model by Yan Xue presents a simple way of interpreting the three basic colours, RGB. One item is surprising: although "surprise" is a basic emotion (as Ekman says), it is not clearly associated with any colour (Nijdam, 2006).

Emotion   | Negative trait | Positive trait | Colour
Power     | Impotence      | Leadership     | Purple
Greed     | Hoarding       | Satisfaction   | Green
Confusion | Racing         | Clarity        | Blue
Anger     | Rage           | Enthusiasm     | Red
Fear      | Panic          | Awareness      | Yellow
Shame     | Disgrace       | Pride          | Orange


3.3. Emotion Colour Using Linear Interpolation

3.3.1. Linear Interpolation

Linear interpolation is the most fundamental form of interpolation: it estimates a property (in this case, colour) between two or more known points.

Figure 6. The colour of two points: the red point is A, the green point is B, and Y is the colour at a position between them (website, 2009).

To calculate Y, the linear interpolation formula is used:

Y = A + (B - A) * l / L   (1)

where L is the distance between A and B, and l is the distance from A to the position of Y.

Figure 7. The linear interpolation formula (website, 2009).
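As a minimal sketch of this formula (written in Python rather than the paper's C#; the function names are ours, not from the paper):

```python
def lerp(a, b, l, L):
    """Linear interpolation: Y = A + (B - A) * l / L,
    where L is the distance between A and B and l is the
    distance from A to the point being estimated."""
    return a + (b - a) * l / L

def lerp_colour(colour_a, colour_b, l, L):
    """Apply the formula channel-wise to two colour tuples."""
    return tuple(round(lerp(ca, cb, l, L)) for ca, cb in zip(colour_a, colour_b))

# Halfway between a pure red point A and a pure green point B (cf. Figure 6):
print(lerp_colour((255, 0, 0), (0, 255, 0), 1, 2))  # -> (128, 128, 0)
```

With l = L/2 this reduces to the midpoint of A and B, which is the special case used later in the worked example.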

The information in Tables 7, 8 and 9 is acquired from previous studies.

Table 7. Yvonne Jung, Christine Weber, Jens Keil, and Tobias Franke (Jung, et al., 2009).

No | R   | G   | B   | C  | M   | Y   | Emotion
1  | 248 | 219 | 50  | 4  | 9   | 90  | Joy
2  | 166 | 169 | 174 | 37 | 28  | 26  | Sadness
3  | 133 | 72  | 156 | 56 | 85  | 0   | Fear
4  | 177 | 31  | 37  | 21 | 100 | 98  | Anger

Table 8. Kyu-Ho Park and Tae-Yong Kim (Kyu-Ho and Tae-Yong, 2008).

No | R  | G   | B   | C  | M  | Y   | Emotion
1  | 0  | 255 | 0   | 63 | 0  | 100 | Joy
4  | 0  | 0   | 255 | 88 | 77 | 0   | Sadness
5  | 54 | 57  | 72  | 77 | 70 | 50  | Fear
6  | 98 | 51  | 24  | 39 | 75 | 92  | Anger

Table 9. Shirley Willett, colour codification of emotions (Ahmad Hoirul Basori and Nadzaari Saari, 2012).

No | R   | G   | B  | C  | M  | Y   | Emotion
1  | 0   | 255 | 57 | 62 | 0  | 100 | Joy
2  | 255 | 180 | 0  | 0  | 33 | 100 | Sadness
3  | 255 | 255 | 0  | 6  | 0  | 97  | Fear
4  | 255 | 0   | 57 | 0  | 76 | 99  | Anger

Table 10. Summary of the three values from Tables 7, 8 and 9 for each colour channel.

No | R          | G           | B         | C       | M         | Y          | Emotion
1  | 0,0,248    | 155,219,255 | 0,50,57   | 4,62,63 | 0,0,9     | 90,100,100 | Joy
2  | 0,166,255  | 0,169,180   | 0,174,255 | 0,37,88 | 28,33,77  | 0,26,100   | Sadness
3  | 54,133,255 | 57,72,255   | 0,72,156  | 6,56,77 | 0,70,85   | 0,50,97    | Fear
4  | 98,177,255 | 0,31,51     | 24,37,57  | 0,21,39 | 75,76,100 | 93,98,99   | Anger

Based on the information in Table 10, each of Cyan, Magenta and Yellow has three values per emotion. We apply linear interpolation based on formula (1) above, choosing the smallest value and the biggest one, as shown in Figure 8.

Figure 8. The smallest and the biggest value of the colour.


Notes: the black point is the maximum value of C, M and Y; the blue point is the minimum value; the green point is the average of the maximum and minimum. L is the distance between the maximum and the minimum, and l is the distance between the minimum and the average.

For example, take the first emotion (Joy) in Table 10. The maximum value is (63, 9, 100) and the minimum value is (4, 0, 90):

lc = |(63 + 4)/2 - 4| = 29.5 ≈ 30
lm = |(9 + 0)/2 - 0| = 4.5 ≈ 5
ly = |(100 + 90)/2 - 90| = 5

and

Lc = 63 - 4 = 59
Lm = 9 - 0 = 9
Ly = 100 - 90 = 10

Final colour Cyan = 4 + 30(63 - 4)/59 = 34
Final colour Magenta = 0 + 5(9 - 0)/9 = 5
Final colour Yellow = 90 + 5(100 - 90)/10 = 95

The final colour for Joy is (34, 5, 95), as presented in Table 11.

Table 11. The colour from the proposed approach
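The worked example above can be reproduced with a short script (a sketch in Python rather than the paper's C#/XNA implementation; the function names and the half-up rounding are our assumptions, chosen to match the rounded values in the example):

```python
def blend_channel(values):
    """Blend one CMY channel from its three source values (Tables 7-9):
    take the minimum and maximum, set L = max - min and
    l = (max + min)/2 - min, then interpolate min + l*(max - min)/L."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return lo
    L = hi - lo                  # e.g. Lc = 63 - 4 = 59 for Joy's cyan
    l = (hi + lo) / 2 - lo       # e.g. lc = 29.5, rounded to 30 in the text
    y = lo + l * (hi - lo) / L   # formula (1); algebraically the midpoint of lo, hi
    return int(y + 0.5)          # round half up, matching the paper (4.5 -> 5)

def blend_emotion_colour(c_vals, m_vals, y_vals):
    """Return the final (C, M, Y) colour for one emotion."""
    return tuple(blend_channel(v) for v in (c_vals, m_vals, y_vals))

# Joy, using the C, M and Y triples from Table 10:
print(blend_emotion_colour((4, 62, 63), (0, 0, 9), (90, 100, 100)))  # -> (34, 5, 95)
```

Note that with l = (max + min)/2 - min, formula (1) reduces to the midpoint of the smallest and largest source values, up to rounding.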

4. Implementation

We apply these linear interpolation equations to the face using the C# programming language, XNA, and the Facial Action Coding System (FACS) algorithm.

4.1 Facial Action Coding System (FACS)

FACS is a system that measures and describes facial behaviours by understanding every facial muscle. It was developed by Paul Ekman and Wallace Friesen in 1976, and the earliest FACS manual was published in 1978. It is a recognized standard that shows how each facial muscle changes the facial appearance. It is derived from an analysis of facial anatomy, describing the conduct of the human face's muscles, including the movements of the tongue and jaw. By exploring facial anatomy, we can conclude that changes in facial expression are caused by facial actions. FACS starts from facial actions in order to understand facial behaviours. Unlike other muscle-based techniques, FACS studies the actions caused by the muscles instead of studying the muscles directly. Facial expressions are complicated: no expression is made by only one muscle, and the same muscle can take part in various expressions. The configuration is very complex, so studying each muscle individually would not give a clear understanding of this area (Liu, 2009). Facial action units (AUs) are defined according to these actions, and each AU can involve numerous facial muscles. When an expression is made, the muscles involved cannot be recognized easily; conversely, a single muscle's action can also be divided into two or more AUs to describe fairly independent actions of different muscle parts. FACS divides the human face into 46 action units. Every unit embodies an individual muscle action, or a group of muscles that characterize a single facial position. The principle is that every AU is the smallest unit and cannot be reduced into smaller action units. By accurately sorting the various AUs of the face, FACS is able to mimic all facial muscle movements; different combinations of action units produce different facial expressions.
For example, joining AU1 (Inner Brow Raiser), AU4 (Brow Lowerer), AU15 (Lip Corner Depressor) and AU23 (Lip Tightener) generates a sad expression. Originally, FACS was not made to help generate facial animation; it was aimed at describing and scoring facial movements. But because of its widespread use and its recognized vocabulary of facial actions, it has been broadly applied to facial animation in recent decades. FACS helps animators specify and build realistic facial expressions by detailing all the possible facial animation units the human face can make, with descriptions for measuring head and eye positions (Liu, 2009).

The weak side of FACS is that it only accounts for obvious facial changes and neglects the subtle changes that may be hard to distinguish. It also discounts movements underneath the skin that are not noticeable but can affect facial expressions. FACS is very effective in distinguishing the actions of parts of the face, but not the face as a whole. In particular, it does not account for all of the visible, distinguishable, reliable actions of the lower parts of the face, which makes it unable to meet lip-sync requirements. As lip synchronization has become very important and has attracted great attention from researchers, FACS has shown its weakness in this field because it lacks muscle control on the lower parts of the face. Future facial animation techniques should be capable of synchronizing voice with speech animation easily; this limitation reduces FACS's ability to become the dominant technique in the industry.

In general, FACS accounts for most expressions and facial movements through facial action coding. It offers very dependable face codes for the upper face areas, including the forehead, eyelids and eyebrows, but it does not give sufficient face codes for the lower parts of the face. Still, FACS overcomes the limits of interpolation and has been a milestone among facial animation techniques.
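The combination of AUs into an expression can be illustrated with a small hypothetical sketch (not the authors' implementation): each AU contributes a displacement, and an expression sums the weighted contributions. For brevity each AU is reduced here to a single made-up vertex offset.

```python
# Hypothetical AU -> displacement table (toy one-vertex example; real systems
# store a displacement per mesh vertex for each action unit).
au_offsets = {
    1:  (0.0, 0.4, 0.0),   # AU1  Inner Brow Raiser
    4:  (0.0, -0.3, 0.0),  # AU4  Brow Lowerer
    15: (0.0, -0.5, 0.0),  # AU15 Lip Corner Depressor
    23: (0.1, 0.0, 0.0),   # AU23 Lip Tightener
}

def combine_aus(aus, weights=None):
    """Sum weighted AU displacements to form one expression offset."""
    weights = weights or {}
    total = [0.0, 0.0, 0.0]
    for au in aus:
        w = weights.get(au, 1.0)   # unlisted AUs apply at full strength
        for i, d in enumerate(au_offsets[au]):
            total[i] += w * d
    return tuple(total)

# Sadness combines AU1, AU4, AU15 and AU23, with AU15 slightly damped.
offset = combine_aus([1, 4, 15, 23], {15: 0.8})
```

Varying the per-AU weights over time animates the expression onset, just as varying `t` animates the colour interpolation.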
It has certainly encouraged other similar techniques and is extensively used together with muscle or simulated-muscle-based approaches.

The emotion colour values (RGB and CMY) used for each expression are:

No  R    G    B    C   M   Y   Emotion
1   181  202  62   34  5   95  Joy
2   156  126  123  44  53  50  Sadness
3   159  142  130  42  43  49  Fear
4   202  70   47   20  88  96  Anger

Table 12. Sample single facial action units (Liu, 2009).

AU  FACS Name             AU  FACS Name            AU  FACS Name
1   Inner Brow Raiser     2   Outer Brow Raiser    4   Brow Lowerer
5   Upper Lid Raiser      6   Cheek Raiser         9   Nose Wrinkler
10  Upper Lip Raiser      12  Lip Corner Puller    14  Dimpler
15  Lip Corner Depressor  16  Lower Lip Depressor  17  Chin Raiser
20  Lip Stretcher         23  Lip Tightener        26  Jaw Drop

Table 13. Example sets of action units for basic expressions (Liu, 2009).

Expression  Involved Action Units
Anger       AU 2, 4, 9, 10, 17, 20, 26
Fear        AU 1, 2, 4, 5, 15, 20, 26
Happiness   AU 1, 6, 12, 14
Sadness     AU 1, 4, 15, 23
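The mapping in Tables 12 and 13, combined with the emotion-colour table, can be sketched as a small lookup structure (illustrative names, not the authors' code; the Joy colour is used for happiness, following the colour table):

```python
# Expression -> (action units from Table 13, target RGB emotion colour).
EXPRESSIONS = {
    "anger":     ({2, 4, 9, 10, 17, 20, 26}, (202, 70, 47)),
    "fear":      ({1, 2, 4, 5, 15, 20, 26},  (159, 142, 130)),
    "happiness": ({1, 6, 12, 14},            (181, 202, 62)),
    "sadness":   ({1, 4, 15, 23},            (156, 126, 123)),
}

def pose_for(expression):
    """Return the AU set and the emotion colour driving a facial pose."""
    aus, colour = EXPRESSIONS[expression]
    return sorted(aus), colour

print(pose_for("sadness"))  # -> ([1, 4, 15, 23], (156, 126, 123))
```

Given such a lookup, rendering an expression reduces to activating the listed AUs and interpolating the skin colour toward the returned RGB target.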

Based on the Facial Action Coding System information above, the expression and facial colour become as follows for each emotion. For anger, the FACS expression takes AU 2, 4, 9, 10, 17, 20 and 26; with the emotion colour, the face becomes as shown in Figure 9.

Figure 9. The colour and expression of the face when the person feels anger.

For fear, the FACS expression takes AU 1, 2, 4, 5, 15, 20 and 26; with the emotion colour, the face becomes as shown in Figure 10.

Figure 10. The colour and expression of the face when the person feels fear.

For happiness, the FACS expression takes AU 1, 6, 12 and 14; with the emotion colour, the face becomes as shown in Figure 11.

Figure 11. The colour and expression of the face when the person feels happiness.

For sadness, the FACS expression takes AU 1, 4, 15 and 23; with the emotion colour, the face becomes as shown in Figure 12.


Figure 12. The colour and expression of the face when the person feels sadness.

5. Conclusions

In this paper, we have explained the diverse and often overlapping categories of facial animation, covered emotion colour, and provided a new technique for facial animation that changes the colour of the facial skin based on linear interpolation. The proposed approach is able to give new features to facial expression and to enhance its realism in expressing emotion. We believe this finding will bring great benefit to the improvement of virtual human research, virtual reality and game development.

Acknowledgements

This research is supported by the Ministry of Science and Technology (MOSTI) in collaboration with the Research Management Center (RMC), Universiti Teknologi Malaysia (UTM). This paper is financially supported by E-Science Grant Vot. No. R.J130000.7928.4S030.

References

BASORI, A. H., BADE, A., SUNAR, M. S., DAMAN, D. AND SAARI, N. (2012). An Integration Framework for Haptic Feedback to Improve Facial Expression on Virtual Human. International Journal of Innovative Computing, Information and Control.

BASORI, A. H., BADE, A., SUNAR, M. S., DAMAN, D. AND SAARI, N. (2010). E-Facetic: the integration of multimodal emotion expression for avatar through facial expression, acoustic and haptic. Proceedings of the 9th ACM SIGGRAPH Conference on Virtual-Reality Continuum and its Applications in Industry, Seoul, South Korea. ACM.

BASORI, A. H., DAMAN, D., BADE, A., SUNAR, M. S. AND SAARI, N. (2008). The feasibility of human haptic emotion as a feature to enhance interactivity and immersiveness on virtual reality game. Proceedings of the 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, Singapore. ACM.

BERENDT, J. (2005). The City of Falling Angels. London.

BUDDHARAJU, P., PAVLIDIS, I. T. AND TSIAMYRTZIS, P. (2005). Physiology-based face recognition. Proceedings of the IEEE Conference on Advanced Video and Signal Based Surveillance (AVSS 2005), 15-16 Sept. 2005, 354-359.

CEREKOVIC, A., HUNG-HSUAN, H., ZORIC, G., TARASENKO, K., LEVACIC, V., PANDZIC, I. S., ET AL. (2007). Towards an Embodied Conversational Agent Talking in Croatian. Proceedings of the 9th International Conference on Telecommunications (ConTel 2007), 13-15 June 2007, 41-48.

COSI, P., DRIOLI, C., TESSER, F. AND TISATO, G. (2005). INTERFACE toolkit: a new tool for building IVAs. In P. Themis, G. Jonathan, A. Ruth, B. Daniel, O. Patrick & R. Thomas (Eds.), Lecture Notes in Computer Science (pp. 75-87). Springer-Verlag.

DARWIN, C. (1872). The Expression of the Emotions in Man and Animals. London.

EKMAN, P. AND FRIESEN, W. V. (1974). Detecting deception from the body or face. 29, 288-298.

EKMAN, P. AND FRIESEN, W. V. (1978). Facial Action Coding System: A technique for the measurement of facial movement. Consulting Psychologists Press.

ERSOTELOS, N. AND DONG, F. (2008). Building highly realistic facial modeling and animation: a survey. The Visual Computer, 24(1), 13-30. doi: 10.1007/s00371-007-0175-y

JIMENEZ, J., SCULLY, T., BARBOSA, N., DONNER, C., ALVAREZ, X., VIEIRA, T., ET AL. (2010). A practical appearance model for dynamic facial color. ACM Transactions on Graphics, 29(6), 1-10. doi: 10.1145/1882261.1866167

JUNG, Y., WEBER, C., KEIL, J. AND FRANKE, T. (2009). Real-Time Rendering of Skin Changes Caused by Emotions. Proceedings of the 9th International Conference on Intelligent Virtual Agents, Amsterdam, The Netherlands.

KALRA, P. AND MAGNENAT-THALMANN, N. (1994). Modeling of vascular expressions in facial animation. Proceedings of Computer Animation '94, 25-28 May 1994, 50-58.

KYU-HO, P. AND TAE-YONG, K. (2008). Facial color adaptive technique based on the theory of emotion-color association and analysis of animation. Proceedings of the IEEE 10th Workshop on Multimedia Signal Processing, 8-10 Oct. 2008, 861-866.

LIU, C. (2009). An analysis of the current and future state of 3D facial animation techniques and systems. Master of Science thesis (B.A., Communication University of China), Canada.

MORETTI, G., ELLIS, R. A. AND MESCON, H. (1959). Vascular Patterns in the Skin of the Face. The Journal of Investigative Dermatology, 33(3), 103-112.

NIJDAM, N. A. (2006). Mapping emotion to color. University of Twente, the Netherlands.

PAN, X., GILLIES, M. AND SLATER, M. (2008). The Impact of Avatar Blushing on the Duration of Interaction between a Real and Virtual Person. Proceedings of Presence 2008: The 11th Annual International Workshop on Presence.

PLUTCHIK, R. AND KELLERMAN, H. (1980). Emotion: Theory, Research, and Experience. New York: Academic Press.

POS, O. D. AND GREEN-ARMYTAGE, P. (2007). Facial Expressions, Colours and Basic Emotions. http://www.colour-journal.org/2007/1/2/.

SHEARN, D., BERGMAN, E., HILL, K., ABEL, A. AND HINDS, L. (1990). Facial Coloration and Temperature Responses in Blushing. Psychophysiology, 27(6), 687-693. doi: 10.1111/j.1469-8986.1990.tb03194.x

YAMADA, T. AND WATANABE, T. (2004). Effects of facial color on virtual facial image synthesis for dynamic facial color and expression under laughing emotion. Proceedings of the 13th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN 2004), 20-22 Sept. 2004, 341-346.

TECH-ALGORITHM WEBSITE (2009). Linear interpolation. http://tech-algorithm.com/articles/linear-interpolation/