L2 Outputs1 - Heriot
Transcription
Affect and Personality in Interaction with Ubiquitous Systems
– speech, language, gesture, facial expressions, music, colour
Professor Ruth Aylett
Vision Interactive Systems & Graphical Environments, MACS, Heriot-Watt University
www.macs.hw.ac.uk/~ruth

Summary of programme
– Introduction and overview (today)
– Affective outputs
– Affective/Personality models and action-selection approaches
– Affective inputs
– Applications: Embodied Conversational Characters, Intelligent Virtual Agents, human-robot interaction
– Evaluation approaches

Today's topics
– Describing emotion
– Music
– Colour
– Shape and form

Displaying emotion
Emotions can be shown via:
– Acoustic and visual behaviours: facial expression, voice, gesture, posture
– Behaviour expressivity: voice and body movement quality
– Music
– Colour
Reasons to display emotional state:
– Create affective awareness
– Create a social relationship
– Engage the user in communication
(Thanks to Catherine Pelachaud!)
But how do we know what to output?
– We need some systematic description of emotion.

Defining types of affective states (Scherer et al., Univ. Geneva)
Types of affect are distinguished by design features: intensity, duration, synchronization, event focus, appraisal elicitation, rapidity of change, behaviour impact.
– Emotions: angry, sad, joyful, fearful, ashamed, proud, elated, desperate
– Moods: cheerful, gloomy, irritable, listless, depressed, buoyant
– Interpersonal stances: distant, cold, warm, supportive, contemptuous
– Preferences/Attitudes: liking, loving, hating, valuing, desiring
– Affect dispositions: nervous, anxious, reckless, morose, hostile, insecure

Scherer's descriptive framework
An empirical subset of emotion terms suitable for describing emotions in human-machine interaction (Scherer et al., Univ. Geneva).
[Figure: the Geneva wheel of emotion terms – words such as hostile, enraged, envious, triumphant, enthusiastic, joyous, content, relaxed, hopeful, melancholic, ashamed, anxious – arranged on crossed axes of positive vs. negative valence, high vs. low power/control, and conducive vs. obstructive appraisal.]

Russell's system: Circumplex Model of Affect (Russell, 1980)
Two components:
(1) pleasure–displeasure (VALENCE)
(2) arousal–sleep (AROUSAL)
[Figure: emotion terms such as AROUSED, EXCITED, DELIGHTED, HAPPY, PLEASED, CONTENT, RELAXED, CALM, SLEEPY, TIRED, BORED, GLOOMY, SAD, DEPRESSED, MISERABLE, FRUSTRATED, ANGRY, AFRAID, ALARMED, TENSE arranged in a circle around the valence (positive–negative) and arousal (active–passive) axes.]
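As a minimal sketch of how the circumplex description can be used computationally, an affective state can be stored as a (valence, arousal) pair and mapped back to the nearest labelled term. The coordinates below are illustrative placements, not Russell's measured values:

```python
# Minimal sketch of a valence/arousal (circumplex) representation.
# The coordinates are illustrative placements, not Russell's measured values.
import math

# (valence, arousal), both in [-1, 1]
CIRCUMPLEX = {
    "excited":   ( 0.6,  0.8),
    "happy":     ( 0.8,  0.4),
    "content":   ( 0.7, -0.3),
    "calm":      ( 0.4, -0.7),
    "tired":     (-0.2, -0.8),
    "depressed": (-0.7, -0.5),
    "sad":       (-0.8, -0.2),
    "afraid":    (-0.7,  0.6),
    "angry":     (-0.6,  0.8),
}

def nearest_label(valence: float, arousal: float) -> str:
    """Return the labelled term closest to a (valence, arousal) point."""
    return min(CIRCUMPLEX, key=lambda term: math.dist(CIRCUMPLEX[term], (valence, arousal)))

if __name__ == "__main__":
    print(nearest_label(0.7, 0.5))   # -> "happy"
```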
Preliminary list of 55 terms, from HUMAINE summer school 2004, Belfast:
stress, annoyance, boredom, panic, impatience, disapproval, hot anger, anxiety, disappointment, fear, satisfaction, sadness, surprise, shock, amusement, worry, excitement, pleasure, cold anger, interest, effervescent happiness, nervousness, approval, embarrassment, distraction, disagreeableness, disgust, despair, indifference, neutrality, hurt, friendliness, weariness, relief, confidence, contentment, shame, contempt, affection, sympathy, relaxation, mockery, pride, resentment, calm, guilt, jealousy, determination, serenity, coldness, cruelty, hopeful, wariness, greed, admiration

Affective Music (Bresin, KTH Sweden)
– Expressive cues: mapping between expressive acoustic cues and emotions
– Simulation of emotions in music performance
– Visualization of musical expression: colours, facial expressions

Expressive cues: mapping between acoustic cues and emotions (from Juslin, 2001)
The five emotions sit in an activity/valence space: happiness is high activity with positive valence, anger and fear high activity with negative valence, tenderness low activity with positive valence, and sadness low activity with negative valence.

SADNESS: slow mean tempo (Ga95); legato articulation (Ju97a); small articulation variability (Ju99); low sound level (Ju00); dull timbre (Ju00); large timing variations (Ga96); soft duration contrasts (Ga96); slow tone attacks (Ko76); flat micro-intonation (Ba97); slow vibrato (Ko00); final ritardando (Ga96)

HAPPINESS: fast mean tempo (Ga95); small tempo variability (Ju99); staccato articulation (Ju99); large articulation variability (Ju99); high sound level (Ju00); little sound level variability (Ju99); bright timbre (Ga96); fast tone attacks (Ko76); small timing variations (Ju/La00); sharp duration contrasts (Ga96); rising micro-intonation (Ra96)

TENDERNESS: slow mean tempo (Ga96); slow tone attacks (Ga96); low sound level (Ga96); small sound level variability (Ga96); legato articulation (Ga96); soft timbre (Ga96); large timing variations (Ga96); accents on stable notes (Li99); soft duration contrasts (Ga96); final ritardando (Ga96)

FEAR: staccato articulation (Ju97a); very low sound level (Ju00); large sound level variability (Ju99); fast mean tempo (Ju99); large tempo variability (Ju99); large timing variations (Ga96); soft spectrum (Ju00); sharp micro-intonation (Oh96b); fast, shallow, irregular vibrato (Ko00)

ANGER: high sound level (Ju00); sharp timbre (Ju00); spectral noise (Ga96); fast mean tempo (Ju97a); small tempo variability (Ju99); staccato articulation (Ju99); abrupt tone attacks (Ko76); sharp duration contrasts (Ga96); accents on unstable notes (Li99); large vibrato extent (Oh96b); no ritardando (Ga96)

Lens model (R. Bresin)
The lens model quantifies the expressive communication between performer and listener: the performer encodes an intention into expressive cues in the performance, and the listener decodes those cues into a judgment.
[Figure: lens model diagram with cue-utilization correlations (r, performer side and listener side) for tempo, loudness and timbre, and a cue-profile matching of .92.]

Example: SADNESS (accuracy .87) – analysis vs. synthesis in Director Musices
– Tempo: slow → tone IOI is lengthened by 30%
– Sound level: moderate or low → sound level is decreased by 6 dB
– Articulation: legato → tone duration = tone IOI; Duration Contrast Rule (k = -2)
– Time deviations: moderate → Phrase Arch Rule applied at sub-phrase level (k = 1.5) and at phrase level (k = 1.5)
– Final ritardando: yes → obtained from the Phrase Arch Rule
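A minimal sketch of how the cue table above could drive recognition: each emotion keeps a qualitative cue profile taken from the table, and an observed performance is matched to the profile with the largest overlap. The matching-by-overlap scheme and the reduced cue set are illustrative simplifications, not Juslin's or Bresin's method:

```python
# Toy matcher over the qualitative cue-to-emotion table above (condensed from the
# slide's summary of Juslin, 2001). Matching by overlap is an illustrative simplification.

CUE_PROFILES = {
    "sadness":    {"tempo": "slow", "level": "low",      "articulation": "legato",   "timbre": "dull"},
    "happiness":  {"tempo": "fast", "level": "high",     "articulation": "staccato", "timbre": "bright"},
    "tenderness": {"tempo": "slow", "level": "low",      "articulation": "legato",   "timbre": "soft"},
    "fear":       {"tempo": "fast", "level": "very low", "articulation": "staccato", "timbre": "soft"},
    "anger":      {"tempo": "fast", "level": "high",     "articulation": "staccato", "timbre": "sharp"},
}

def best_match(observed: dict) -> str:
    """Return the emotion whose cue profile overlaps most with the observed cues."""
    def score(emotion: str) -> int:
        profile = CUE_PROFILES[emotion]
        return sum(1 for cue, value in observed.items() if profile.get(cue) == value)
    return max(CUE_PROFILES, key=score)

print(best_match({"tempo": "slow", "level": "low", "articulation": "legato", "timbre": "dull"}))
# -> "sadness"
```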
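On the synthesis side, the SADNESS column above can be read as a small score transformation. The sketch below applies three of the listed settings (tone IOI +30%, sound level -6 dB, legato with tone duration = tone IOI) to a toy note list; the Note structure is an assumption for this sketch, and the phrase-level rules (Phrase Arch, Duration Contrast) are omitted:

```python
# Toy application of three of the Director Musices "sadness" settings listed above:
# tone IOI lengthened by 30%, sound level decreased by 6 dB, legato (tone duration = tone IOI).
# The Note structure is an assumption; the phrase-level rules are not modelled here.
from dataclasses import dataclass, replace
from typing import List

@dataclass(frozen=True)
class Note:
    pitch: int          # MIDI pitch
    ioi_ms: float       # inter-onset interval to the next note
    duration_ms: float  # sounding duration of the tone
    level_db: float     # sound level relative to the nominal score level

def make_sad(notes: List[Note]) -> List[Note]:
    out = []
    for n in notes:
        ioi = n.ioi_ms * 1.30                      # tone IOI lengthened by 30%
        out.append(replace(n,
                           ioi_ms=ioi,
                           duration_ms=ioi,        # legato: tone duration = tone IOI
                           level_db=n.level_db - 6))  # sound level decreased by 6 dB
    return out

melody = [Note(60, 500, 400, 0.0), Note(62, 500, 400, 0.0), Note(64, 1000, 800, 0.0)]
for n in make_sad(melody):
    print(n)
```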
MOODIES – Better Polyphonic Ringtones (Bresin, KTH; www.notesenses.com)
Example ringtones rendered with different expressive intentions:
– Coldplay, "Talk": Original, Classy, Happy, Romantic, Aggressive
– La Linea: Original, Classy, Happy, Romantic, Aggressive
– Hayfa: Original, Classy, Happy, Romantic, Aggressive

Colour, Movement, Shape

Visualization of Musical Expression – Colour and Emotion (Bresin et al., KTH)
• Perceptual study: link musical performances to colours
• Performances with different emotional intentions
• Set of colour nuances varying in hue, brightness and saturation
• Result of the perceptual study – hue:
– Happiness: yellow
– Fear: blue
– Sadness: violet & blue
– Anger: red
– Love: blue & violet
• Brightness – observed tendency:
– Minor tonality: low brightness (dark colours)
– Major tonality: high brightness (light colours)

Tools for real-time visual feedback on expressive performance (from Bresin)
Mapping between acoustic cues and emotions:
– ExpressiBall: mapping of emotions to colours
– GretaMusic: mapping of emotions to facial expressions
– music emotion → facial expression
– music volume → spatial and power
– music tempo → temporal and overall activation
– music articulation → fluidity

The ExpressiBall (Bresin, Juslin, KTH)
[Figure: a ball whose position and appearance follow the performance – X = tempo (slow–fast), Y = sound level (soft–loud), Z = attack velocity & spectrum energy; colour = emotion; shape = articulation (legato–staccato). Examples: Sad = slow, soft, legato, slow attack, low energy; Angry = fast, loud, staccato, fast attack, high energy.]

GretaMusic (Bresin, KTH – Mancini, Pelachaud, U Paris8)
[Figure: facial expressions of the Greta agent driven by the same acoustic cues.]
(A toy sketch combining the colour and ExpressiBall mappings appears at the end of this transcription.)

Mutual Interaction
• Interactive virtual dancer:
– dances together with the user to the beat of the music
– adapts its performance to whatever the human user is doing
– beat detection aligns the dance with the music tempo
– the agent's movements are chosen from a database of motion-captured movements

Affective Diary (Höök et al., SICS)
• Diary: express inner thoughts and record experiences of past events
• Affective diary:
– capture emotional experience over time via mobile phone
– replay the experience
– reflect on the experience
User quote (from Höök): "[pointing at the first slightly red character] And then I become like this, here I am kind of, I am kind of both happy and sad in some way and something like that. I like him and then it is so sad that we see each other so little. And then I cannot really show it."

eMoto – Expressing emotions in a digital world (Sundström, Ståhl, Höök, SICS-KTH)
– eMoto: a mobile messaging service for sending and receiving affective messages
– Uses the affective gestures of users to convey the emotional content of their messages
– Example: input is movement detection through the pen; output is colours, shapes and animations on the mobile (e.g. bored, excited, happiness)

I-Shadows (Martinho, Paiva, GAIPS)
– Helps children build the virtual and real world of Interactive Shadows
– Creates a learning environment where children can build logical narratives on the fly

AINI – Anticipatory Believability (Paiva et al., GAIPS)
– Agent with an autonomous anticipatory mechanism
– Study of the relation between anticipation and emotion
– Prediction of the agent's next sensor value; the mismatch between sensation and anticipation is interpreted to direct both the focus of attention and the expression of emotions (sketched at the end of this transcription)

QUESTIONS?
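Toy sketch referenced above, combining the hue-emotion result with the ExpressiBall axis mapping (X = tempo, Y = sound level, Z = attack velocity & spectrum energy). Only the emotion-hue pairs and the axis assignments come from the slides; the RGB values and the normalisation ranges are illustrative assumptions:

```python
# Sketch combining the hue-emotion result (Bresin et al.) with the ExpressiBall axes:
# X = tempo, Y = sound level, Z = attack velocity & spectrum energy.
# RGB triples and normalisation ranges below are illustrative assumptions.

EMOTION_HUE = {          # from the perceptual study summarised above
    "happiness": "yellow",
    "fear": "blue",
    "sadness": "violet-blue",
    "anger": "red",
    "love": "blue-violet",
}

RGB = {                  # illustrative colour values for the hue names
    "yellow": (255, 220, 0),
    "blue": (40, 90, 230),
    "violet-blue": (110, 70, 200),
    "red": (220, 40, 40),
    "blue-violet": (90, 60, 220),
}

def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))

def expressiball_state(emotion: str, tempo_bpm: float, level_db: float, attack_energy: float):
    """Return (x, y, z, rgb) for one analysis frame of a performance."""
    x = clamp01((tempo_bpm - 40) / 160)   # slow..fast mapped to 0..1 (assumed range)
    y = clamp01((level_db + 20) / 40)     # soft..loud mapped to 0..1 (assumed range)
    z = clamp01(attack_energy)            # attack velocity & spectrum energy, assumed already 0..1
    rgb = RGB[EMOTION_HUE.get(emotion, "blue")]
    return x, y, z, rgb

print(expressiball_state("sadness", tempo_bpm=65, level_db=-12, attack_energy=0.2))
```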
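Sketch of the AINI anticipation loop mentioned above: predict the next sensor value, then use the mismatch between sensation and anticipation to direct attention and the expressed emotion. The exponential-smoothing predictor and the thresholds are assumptions; the slide does not specify the prediction mechanism:

```python
# Minimal sketch of an anticipation-mismatch loop in the spirit of the AINI slide.
# The smoothing predictor and the thresholds are assumptions for illustration.

class AnticipatoryAgent:
    def __init__(self, alpha: float = 0.5):
        self.alpha = alpha        # smoothing factor of the toy predictor
        self.expected = None      # anticipated next sensor value

    def step(self, sensed: float):
        """Compare the sensed value with the anticipation and update the prediction."""
        mismatch = 0.0 if self.expected is None else abs(sensed - self.expected)
        # The mismatch directs both the focus of attention and the expressed emotion.
        attend = mismatch > 0.2
        emotion = "surprise" if mismatch > 0.5 else ("interest" if attend else "neutral")
        # Update the anticipation for the next step (exponential smoothing).
        self.expected = sensed if self.expected is None else (
            self.alpha * sensed + (1 - self.alpha) * self.expected)
        return mismatch, attend, emotion

agent = AnticipatoryAgent()
for value in [0.1, 0.1, 0.9, 0.2]:
    print(agent.step(value))
```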