Jonathan Gratch from USC
He started the talk with an example of an emotionally intelligent machine used for therapy of PTSD patients. The machine takes the form of a human avatar with a human voice and facial expressions while speaking. It performs facial expression recognition, semantic analysis of what the patient says, affect detection in speech, and affect detection in posture and gestures. Based on what it receives, the machine produces appropriate responses and facial expressions.
He also pointed to http://acii2017.org/Home and https://society-for-affective-science.org/reg/ as two places to check out.
He categorized emotion AI into three major groups:
1. Emotion and Cognition: What are the cognitive antecedents and consequences of emotion? (Appraisal Theory)
- Can a machine predict and simulate these effects? The answer is yes. Using Appraisal Theory, machines can figure out what emotions to put on display. For example, if a robot steps on your foot, it can learn not to smile and to show guilt instead.
2. Emotion and Social Cognition: How do people react to another’s emotions? How do people react to an emotional machine? People react because they use reverse appraisal, and their reaction can be observed in their future decision making.
- For example, if a robot steps on your foot and smiles at you, you will probably interpret the smile as the robot being malicious. Then you may decide to get rid of the robot the next time it steps on your foot.
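The reverse-appraisal step above can be sketched as follows (a toy illustration of mine, not code from the talk): the human reads the machine's displayed emotion after an outcome, infers the machine's goals, and factors that into a future decision.

```python
# Toy sketch of reverse appraisal: infer the machine's intent from the
# emotion it displays after an outcome, then decide what to do with it.

def infer_intent(outcome_bad_for_me, displayed_emotion):
    if outcome_bad_for_me and displayed_emotion == "smile":
        return "malicious"   # it appraised my loss as desirable
    if outcome_bad_for_me and displayed_emotion == "guilt":
        return "accident"    # it appraised its own act as blameworthy
    return "unclear"

def keep_robot(intent):
    # future decision making: get rid of a robot judged malicious
    return intent != "malicious"
```

A robot that smiles after stepping on your foot gets `infer_intent(True, "smile") == "malicious"`, so `keep_robot` returns `False`.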
- Gratch first studied the Prisoner’s Dilemma game played with a machine. He found the following mapping from outcomes to expressions:
|           | Cooperate | Defect  |
|-----------|-----------|---------|
| Cooperate | SMILE     | GUILT   |
| Defect    | ANGER     | SADNESS |
Then he realized that if the machine swaps the SMILE and GUILT expressions, it can mislead the human: it can smile as if it were cooperating when in fact it is defecting. The conclusion is that emotion can be a tool for influence, so the sender can display strategic emotion expressions.
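The honest mapping and the swapped strategic variant can be sketched as below (my illustration, not code from the talk; I assume the table's rows are the human's action and its columns the machine's):

```python
# Honest emotion display for each Prisoner's Dilemma outcome,
# indexed as (human_action, machine_action).
HONEST = {
    ("cooperate", "cooperate"): "smile",
    ("cooperate", "defect"):    "guilt",
    ("defect",    "cooperate"): "anger",
    ("defect",    "defect"):    "sadness",
}

def display(human_action, machine_action, strategic=False):
    emotion = HONEST[(human_action, machine_action)]
    if strategic:
        # swap SMILE and GUILT: look cooperative while defecting
        emotion = {"smile": "guilt", "guilt": "smile"}.get(emotion, emotion)
    return emotion
```

With the swap, `display("cooperate", "defect", strategic=True)` returns `"smile"`: the machine defects against a cooperator yet signals cooperation.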
3. Emotion and Persuasion: Can machines use emotion against us (to shape human actions)?
- In other words, can machines manipulate us by exhibiting strategic emotions?
- The answer is yes.
- Now, going beyond machine-human interaction, can affective computing help us understand human decision making?
- How do machines or people determine what fake emotions to show? What is the algorithm?
- Look at human negotiations and identify which strategies they use.
- Gratch studied The Misrepresentation Game: identifying a set of false emotions such that 1) self-interest is maximized, but 2) the opponent feels they received a fair deal. The problem can be formulated in two stages: 1) first learn what the opponent wants, 2) avoid revealing what you want. In other words:
- calculate how to maximize the actual reward
- find the best false preference
- while appearing to be fair
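The three steps above can be sketched as a small brute-force search (a toy of my own making, not Gratch's actual model): the agent picks *stated* preferences that maximize its true payoff while the deal still looks fair to an opponent who only sees the stated preferences.

```python
from itertools import product

TRUE = {"price": 3, "warranty": 1}   # what the agent really values
OPP  = {"price": 1, "warranty": 3}   # opponent's values (assumed known)

def allocate(stated):
    """Each issue goes to whoever claims to value it more."""
    return {k: ("me" if stated[k] >= OPP[k] else "opp") for k in stated}

def looks_fair(stated, alloc):
    # The opponent judges fairness by the *stated* values: the agent
    # should not appear to take more than half of its stated pie.
    my_stated = sum(stated[k] for k, who in alloc.items() if who == "me")
    return my_stated <= sum(stated.values()) / 2

best = None
for vals in product([1, 2, 3], repeat=2):
    stated = dict(zip(TRUE, vals))          # candidate false preference
    alloc = allocate(stated)
    payoff = sum(TRUE[k] for k, who in alloc.items() if who == "me")
    if looks_fair(stated, alloc) and (best is None or payoff > best[0]):
        best = (payoff, stated)             # maximize the actual reward
```

In this toy run the agent ends up *understating* how much it values price (`best[1]["price"] == 1` versus a true value of 3), still wins the price issue, and appears fair to the opponent.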
- How do machines or people recognize fake emotions? How do they calibrate their mind-reading?
- The strategy described above appears to be widely used in human negotiations.
- Now what should we do with these findings?
- Should we build deceptive agents?
- they earn more money for their users
- human lawyers engage in these kinds of deceptions
- deception can enhance joint value in some contexts
- pro-social liars
Appraisal Theory: Emotion reflects person-environment relationship.
The environment together with our mental state drives appraisal, which brings an emotional response to the surface. Then, depending on the resulting action tendency, we perform either problem-focused coping or emotion-focused coping.
For example, if I want to become a genius physicist (mental state) and I fail my physics class (environmental state), then the desirability, expectedness, controllability, and causal attribution aspects of these states lead me to experience sadness, the physiological responses that come with it, and consequently action tendencies. The action tendencies are either problem-focused coping or emotion-focused coping. An example of problem-focused coping is getting a tutor and working harder for the next physics test. An example of emotion-focused coping is giving up on the dream of becoming a physicist and joining a music band.
Luckily, the desirability, expectedness, controllability, and causal attribution can all be modeled.
Other Work
- Psychology of technology
- Do people treat machines like people?
- What are the advantages of NON-anthropomorphism
- Connection with theories of dehumanization
- Technology for Mental Health
- Technology for mental health screening – this is interesting! Creating games and models that predict the emotion that should come to the surface; when it doesn’t, this can flag mental illness.
- Technologies to establish rapport and elicit honest responding.
- Human Agent Negotiation
- Machine negotiation processes
- teaching negotiation skills
- Ethics of affective computing
- e.g., should machines lie w/ their emotions?
- what about pro-social lies?