Wednesday, 7 November 2012

Höök HCI (2008)


Knowing, Communicating, and Experiencing through Body and Emotion
Kristina Höök (2008)
IEEE Transactions on Learning Technologies
OCTOBER-DECEMBER 2008 (Vol. 1, No. 4) pp. 248-259

Published by the IEEE Computer Society

Three trends
·      New wearable technologies
·      Third wave HCI
·      New approach in learning research – emotion and cognition are interrelated.

Second wave of HCI
P248 ‘To deal with the complexities of collaboration, sociologists and ethnographers were consulted, providing richer descriptions of what people do when they work together.’

Third wave of HCI: ‘a movement that aims to design for experiences involving users emotionally, bodily, and providing for aesthetic experiences.’

P248 ‘The goal of this new movement is to try and design for experiential values rather than efficiency, for entertainment and fun rather than work. This has brought a whole new dimension to the field… HCI researchers now have to deal with highly elusive, subjective, and holistic qualities of interaction—qualities that are hard to design for, but also hard to validate through traditional measurements. How can you, for example, measure the tenderness of a touch?’

Three examples of third wave HCI that are ‘nonreductionist’ – they do not try to measure emotion and then respond with a technological intervention:

(i) eMoto
Backgrounds for text messages
The user writes the text message and then chooses which expression to have in the background from a big palette of expressions mapped on a circle. The expressions are designed to convey emotional content along two axes: arousal and valence. For example, aggressive expressions have high arousal and negative valence and are portrayed as sharp, edgy shapes, in strong red colors, with quick, sharp animated movements. Calm expressions have low arousal and positive valence which is portrayed as slow, billowing movements of big, connected shapes in calm blue-green colors. To move around in the circle, the user has to perform a set of gestures using the stylus pen (that comes with some mobile phones) which we had extended with sensors that could pick up on pressure and shaking movements.
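A minimal sketch of the arousal–valence mapping described above, assuming a simple parametric rule (eMoto itself used a palette of hand-designed expressions; the function, parameter names, and thresholds below are my own illustration, not the authors' code):

```python
import colorsys

def expression_params(arousal, valence):
    """Map an (arousal, valence) point, each in [-1, 1], to rough colour,
    shape, and animation parameters in the spirit of eMoto's circle.
    Purely illustrative: a toy formula standing in for designed expressions."""
    # Hue: negative valence towards red, positive valence towards blue-green
    hue = 0.0 if valence < 0 else 0.5
    # Colour strength grows with arousal (stronger colours when aroused)
    sat = 0.4 + 0.6 * (arousal + 1) / 2
    r, g, b = colorsys.hsv_to_rgb(hue, sat, 0.9)
    return {
        "colour_rgb": (round(r, 2), round(g, 2), round(b, 2)),
        # High arousal + negative valence -> sharp, edgy shapes
        "shape": "sharp" if (arousal > 0 and valence < 0) else "billowing",
        # Animation speed scales with arousal: quick vs. slow movements
        "animation_speed": 0.2 + 0.8 * (arousal + 1) / 2,
    }

# e.g. an aggressive expression: high arousal, negative valence
print(expression_params(arousal=0.9, valence=-0.8))
```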

Studies of eMoto showed that the circle was not used in a simplistic one-emotion-one-expression manner, mapping emotions directly to what you are experiencing at the time of sending an emoto [ 50 ]. Instead, the graphical expressions were appropriated and used innovatively to convey mixed emotions, empathy, irony, expectations of future experiences, surrounding environment (expressing the darkness of the night), and, in general, a mixture of their total embodied experiences of life and, in particular, their friendship. The "language" of colors, shapes, and animations juxtapositioned against the text of the message was open-ended enough for our users to understand them and express themselves and their personality with them. There was enough expressivity in the colors, shapes, and animations to convey meaning, but at the same time, their interpretation was open enough to allow our participants to convey a whole range of messages. We look upon the colors, shapes, and animations as an open "surface" that users may ascribe meaning to.

(ii) Affector
Affector is a distorted video window connecting the neighboring offices of two friends (and colleagues). A camera located under the video screen captures video as well as "filter" information (Sengers et al.).

‘While the designers originally intended for this to communicate the emotional moods of the two participants to one another, it turned out that what was needed and what they ended up designing throughout the two-year process was to communicate something else. It became a tool for companionable awareness of the other in an aesthetically pleasing and creative way. It was not a simple identification of the partner's emotional mood, but a complex reading of what was going on in the other person's office, highlighting bodily movements, figuring out how this related to what they already knew about each other's work life, and interpreting this.’

‘The distortions of the video became the "surface" that was open enough to invite creative use, and allowed the two participants to put meaning to the expressions based perhaps not only on the visual expression, but also on all the other knowledge they had of each other's work life. Pressing deadlines, late night work, getting papers accepted, or knowledge of each other's private life was mixed into their interpretation and meaning-making processes in using …’
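A toy sketch of the "open surface" idea behind Affector (not the actual system, which derived "filter" information from the camera; the function name and the pixelation rule here are assumptions): a scalar signal drives how strongly a video frame is distorted, and the distortion is simply shown, never classified.

```python
import numpy as np

def pixelate(frame, filter_level):
    """Distort a video frame by pixelating it more heavily as
    filter_level (0 = clear .. 1 = heavily distorted) increases.
    Toy illustration of an 'open surface': the signal is rendered,
    not turned into an emotion label."""
    h, w = frame.shape[:2]
    # Block size grows with the filter level
    block = max(1, int(1 + filter_level * 24))
    small = frame[::block, ::block]                    # downsample
    return np.repeat(np.repeat(small, block, axis=0), block, axis=1)[:h, :w]

# e.g. a grey test frame, moderately distorted
frame = np.full((120, 160), 128, dtype=np.uint8)
distorted = pixelate(frame, filter_level=0.5)
print(distorted.shape)
```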

(iii) Affective Diary: A Personal Logging System

‘As a person starts her day, she puts on the body sensor armband. During the day, the system collects time-stamped sensor data picking up movement and arousal. At the same time, the system logs various activities on the mobile phone: text messages sent and received, photographs taken, and the presence of Bluetooth in other devices nearby. Once the person is back at home, she can transfer the logged data into her Affective Diary. The collected sensor data as shown in Fig. 4 is presented as somewhat abstract, ambiguously shaped, and colored characters placed along a timeline’.
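A minimal sketch of the kind of time-stamped record and ambiguous presentation the quote describes (the field names, thresholds, and the posture/colour mapping are my assumptions, not the Affective Diary implementation):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class DiaryEntry:
    """One time-stamped sample in the spirit of Affective Diary:
    body-sensor readings plus whatever the phone logged around that time."""
    timestamp: datetime
    movement: float          # e.g. accelerometer-derived activity level
    arousal: float           # e.g. skin-conductance-derived arousal
    sms: List[str] = field(default_factory=list)        # messages sent/received
    photos: List[str] = field(default_factory=list)     # photo file names
    bluetooth_nearby: List[str] = field(default_factory=list)  # device IDs seen

def body_character(entry: DiaryEntry) -> dict:
    """Map a sample to an ambiguous 'character' on the timeline:
    posture follows movement, colour follows arousal (toy mapping)."""
    return {
        "time": entry.timestamp.isoformat(timespec="minutes"),
        "posture": "upright" if entry.movement > 0.5 else "slumped",
        "colour": "red" if entry.arousal > 0.5 else "blue",
    }

entry = DiaryEntry(datetime(2012, 11, 7, 9, 30), movement=0.7, arousal=0.2,
                   sms=["Running late!"])
print(body_character(entry))
```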

e.g., for Ulrica (one of the participants), her reflections using the diary provided an explanation of why people sometimes misunderstood her and her emotional reactions. Further, it led her to conclude that she should let more of her inner feelings be expressed in the moment. In short, Ulrica used the diary to reflect on her past actions and, as a consequence, to decide to change some of her behaviors; a process of reflection, learning, and change appeared to result from using the diary.

Themes and lessons learned
All three examples make use of sensor technologies as a means to capture something other than what we normally express through written text.

‘None of the systems try to represent these emotion processes inside the system or diagnose users' emotions based on their facial expressions or some other human emotion expression. Instead, they build upon the users' own capabilities as meaning-making, intelligent, active coconstructors of meaning, emotional processes, and bodily and social practices. In that sense, they are nonreductionist.’

‘An important lesson from these designs is that they have all left space, or "inscribable surfaces," open for users to fill with content [ 21 ]. If users recognize themselves or others through the activities they perform at the interface—if they look familiar to the user through the social or bodily practice they convey—they can learn how to appropriate these open surfaces. The activities of others need to be visible and what can be expressed users should be allowed to shape over time.’

Emotion in HCI: Three Design Approaches
(i) Affective Computing
‘The most discussed and widespread approach in the design of affective computing applications is to construct an individual cognitive model of affect from first principles and implement it in a system that attempts to recognize users' emotional states through measuring the signs and signals we emit in face, body, voice, skin, or what we say related to the emotional processes going on inside. Emotions, or affect, are seen as identifiable states. Based on the recognized emotional state of the user, the aim is to achieve an as life-like or human-like interaction as possible, seamlessly adapting to the user's emotional state and influencing it through the use of various affective expressions. This model has its limitations, both in its requirement for simplification of human emotion in order to model it, and its difficult approach into how to infer the end-users' emotional states through interpreting our signs and signals. This said, it still provides for a very interesting way of exploring intelligence, both in machines and in people.’ (See the sketch after (iii) for a toy contrast with the interactional approach.)
(ii) Hedonistic Usability
(iii) The Interactional Approach (the approach adopted for the above three examples)
‘An interactional approach to design tries to avoid reducing human experience to a set of measurements or inferences made by the system to interpret users' emotional states. While the interaction of the system should not be awkward, the actual experiences sought might not only be positive ones. eMoto may allow you to express negative feelings about others. Affector may communicate your negative mood. Affective Diary might make negative patterns in your own behavior painfully visible to you. An interactional approach is interested in the full range of human experience possible in the world’
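A toy sketch of the contrast between (i) and (iii) (my own illustration; none of these function names, signals, or thresholds come from the paper): an affective-computing loop classifies a bodily signal into an emotion label and adapts to it, whereas an interactional, "open surface" design just renders the signal ambiguously and leaves interpretation to the user.

```python
def affective_computing_step(skin_conductance):
    """Classify-and-adapt loop: infer a discrete emotional state from a
    signal (a toy threshold standing in for a real recognition model),
    then pick a system response for that state."""
    state = "aroused" if skin_conductance > 0.6 else "calm"
    response = {"aroused": "play soothing music", "calm": "do nothing"}[state]
    return state, response

def interactional_step(skin_conductance):
    """'Open surface': render the raw signal as an ambiguous visual
    property and let the user do the interpreting; no emotion label."""
    return {"animation_speed": skin_conductance, "colour_warmth": skin_conductance}

print(affective_computing_step(0.8))   # ('aroused', 'play soothing music')
print(interactional_step(0.8))         # {'animation_speed': 0.8, 'colour_warmth': 0.8}
```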

Meltzoff paper at Stellar



Meltzoff, A.N., Kuhl, P.K., Movellan, J., and Sejnowski., T.J. (2009)
Foundations for a new science of learning
Science, 325, 284-288

p284 'Human learning and cultural evolution are supported by a paradoxical adaptation. We are born immature. During the first year of life, the brain of an infant is teeming with structural activity', with sensory processes developing before higher activity.

'Three principles are emerging from cross-disciplinary work in psychology, neuroscience, machine learning, and education, contributing to a new science of learning' and, in particular, are useful for explaining language and social understanding.
1.    Learning is computational, implicit
2.    Learning is social, implicit
3.    Learning is supported by brain circuits linking perception and action
1. Learning is computational
'Infants and young children possess powerful computational skills that allow them to automatically infer structural models of their environment from the statistical patterns they experience', e.g., 'before they are three, children use frequency distributions to learn which phonetic units distinguish words in their native language'. p285 'Statistical regularities and covariations in the world thus provide a richer source of information than previously thought', and the learning around these regularities is implicit. 'Learning from probabilistic input provides an alternative to Skinnerian reinforcement learning and Chomskian nativist accounts' of learning.
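A minimal sketch of what "using frequency distributions to learn phonetic units" could mean computationally (my illustration, not the authors' method; the library, the voice-onset-time values, and the model-selection rule are assumptions): token frequencies along an acoustic continuum support two categories when the distribution is bimodal, and only one when it is unimodal.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Voice-onset-time tokens (ms) a learner might hear: a 'language' with two
# phonetic categories (bimodal) vs. one with a single category (unimodal).
bimodal = np.concatenate([rng.normal(15, 5, 500), rng.normal(65, 5, 500)])
unimodal = rng.normal(40, 12, 1000)

def preferred_number_of_categories(tokens, max_k=3):
    """Pick the number of categories (mixture components) that best
    explains the token frequency distribution, by BIC."""
    X = tokens.reshape(-1, 1)
    bics = [GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
            for k in range(1, max_k + 1)]
    return int(np.argmin(bics)) + 1

print(preferred_number_of_categories(bimodal))   # typically 2
print(preferred_number_of_categories(unimodal))  # typically 1
```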
2. Learning is social
p285 'Children do not compute statistics indiscriminately. Social cues highlight what and when to learn.' Young infants 'more readily learn and enact an event when it is produced by a person than by an inanimate device. Machine learning studies show that systematically increasing a robot's social-like behaviours and contingent responsivity elevates young children's willingness to connect with it and learn from it'.
3. Learning is supported by brain circuits linking perception and action
'Human social and language learning are supported by neural-cognitive systems that link the actions of self and other.' The brain areas responsible for initiating an action and for perceiving it overlap. 'Social learning, imitation, and sensorimotor experience may initially generate, as well as modify and refine, shared neural circuitry for perception and action'. KRO: to what extent, and what is the nature of, 'the close coupling and attunement between self and other, which is the hallmark of seamless social communication and interaction'?

Social learning and understanding
Three social skills are foundational
1.    Imitation
2.    Shared attention
3.    Empathy and social emotions
 Imitation
'Learning by observing and imitating experts in the culture is a powerful social learning mechanism.' 'Imitation is faster than individual discovery and safer than trial-and-error learning.' 'Children can use third-person information (observation of others) to create first-person knowledge. This is an accelerator for learning: Instead of having to work out causal relationships themselves, children can learn from watching experts.' 'Imitative learning is valuable because the behavioural actions of others "like me" serve as a proxy for one's own.' 'Children do not slavishly duplicate what they see but reenact a person's goals and intentions', i.e., 'they produce the goal that the adult was striving to achieve, not the unsuccessful attempts. Children choose whom, when, and what to imitate and seamlessly mix imitation and self-discovery to solve novel problems.' Attempts in robotics to emulate infant imitation include direct (input–action) and, more recently, goal-based approaches.
 Shared attention
'Social learning is facilitated when people share attention. Shared attention to the same object or event provides a common ground for communication and teaching. An early component of shared attention is gaze following.' There is experimental evidence to show that 'we project our own experience onto others'. P286 'The ability to interpret the behaviour and experience of others by using oneself as a model is a highly effective learning strategy that may be unique to humans… It would be useful if this could be exploited in machine learning'.
Empathy and social emotions
'The capacity to feel and regulate emotions is critical.' 'In humans, many affective processes are uniquely social.' Children will even help and comfort a social robot that was crying (Tanaka, Cicourel, & Movellan, 2007). 'Brain imaging studies in adults show an overlap in the neural systems activated when people receive a painful stimulus themselves or perceive that another person is in pain' (Hein & Singer, 2008). These neural reactions are modulated by cultural experience, training, and perceived similarity between self and other (Hein & Singer, 2008).

Language Learning – as shedding light on the interaction between computational learning, social facilitation of learning, and shared neural circuitry for perception and production.
Evidence shows that developing infants pick up the statistical regularities of a language, leading to neural commitment. 'However, experiments also show that the computations involved in language learning are "gated" by social processes' (Kuhl, 2007). In foreign-language learning experiments, social interaction strongly influenced infants' statistical learning. Infants exposed to a foreign language at 9 months learn rapidly, but only when experiencing the new language during social interchanges with other humans. 'Temporal contingencies may be critical'.
Idea of neural commitment

A similar pattern holds in birds: 'passerine birds learn conspecific song by listening to and imitating adult birds'; 'In birds, as in humans, a social context enhances vocal learning'.






M & D (2003)


Andrew N. Meltzoff and Jean Decety (2003)
What imitation tells us about social cognition: a rapprochement between developmental psychology and cognitive neuroscience
Phil. Trans. R. Soc. Lond. B , 358, 491-500

P491 ‘Through imitating others, the human young come to understand that others not only share behavioural states, but are ‘like me’ in deeper ways as well’.

P494 ‘Imitation seems to be intrinsically coupled with empathy for others, broadly construed’

‘The holy grail for cognitive and neuroscience theories of imitation is to elucidate the mechanism by which infants connect the felt but unseen movement of the self with the seen but unfelt movement of the other’.

Developmental psychology approach and evidence

Imitation as innate? Newborns imitate

P492: imitation is ‘part of innate endowment of humans’; imitation in the newborn has been shown in 14 independent labs. P493: ‘There is an innate link between the perception and production of human acts’.
Infants of 12–21 days could imitate four gestures: ‘infants confused neither actions nor body parts’; also p492: ‘It is as if young infants isolate what part of their body to move before how to move it’, i.e., there is ‘organ identification’.
‘infants can store a model and imitate from memory … which requires more than a simple visual-motor resonance’

Self-other relations. Awareness of self
Knowing you are being imitated requires being aware of self.
p494: ‘The infancy work shows that young babies correct their imitative behaviour, which suggests active comparison between self and other’ (Meltzoff and Moore, 1997).
‘a listener often shows interpersonal connectedness with a speaker by adopting the postural configuration of the speaker’

Can infants recognise when another acts ‘like me’?
What is the emotional value of the experience?

Exp:
C1. One adult imitated the baby 
C2. Another adult imitated the previous baby (therefore each adult acted like a perfect baby)
Results: in C1 compared with C2, the baby smiled more, looked at the adult for longer, and directed testing behaviour at the adult.

Older infants and sharedness: p494-495 ‘the relationship is being abstractly considered’ … ‘the abstract notion that the other is doing the same as me’

Goals and intentions
(KRO: sharing attention etc. hinges on being able to determine intention, and hence why realness might be important)

P495 ‘In the mature adult notion, persons have internal mental states – such as beliefs, goals, and intentions – that predict and explain human actions’

P493 ‘humans do not simply resonate, however. Our goals affect how we process stimuli in the world’, i.e., intention determines the pattern of neural representation.

Exp:
C1: show infants an unsuccessful act – at 18 months they ‘can infer the unseen goals implied by unsuccessful attempts’ … ‘they choose to imitate what we meant to do rather than what we mistakenly did do’

Human v. non-human
This was repeated using a mechanical device that traced the same spatio-temporal path: ‘infants did not attribute goals or intentions in this case’.

P497 ‘This developmental research shows that infants distinguished between what the adult meant to do and what he actually did. They ascribed goals to human acts; indeed, they inferred the goal even when it was not attained. The differentiation between behaviour versus goals and intentions lies at the core of our mentalizing, and it underlies our moral judgements.’

Neuroscience approach and evidence

P491 ‘monkeys do not imitate’, but they do have mirror units. Whether mirror units are innately present or the result of associative learning is not known (as of 2003).

Kinds of questions to ask, p495: ‘What is the neural basis for distinguishing the self’s imitation of the other from the other’s imitation of the self?’ – the situation in the physical world is the same: there are two bodies in correspondence with one another.

Missing: p491 ‘how a neural mirror system begets theory of mind’

Experiments and experimental manipulations
1.  Observing the actions of others

2. Future action v future recognition
·       Remembering for future action
·       Remember for future recognition

3. Animate v. inanimate
·       Observe real person
·       Observe inanimate

4. Self – other
4a. Imagining actions
·       Own
·       Others
4b. Doing actions
·       Imitate others' actions
·       See others imitate own actions

5. Means and goals (LEGO block exp)
·       Goal-achieved condition only
·       Means only
·       Whole action
·       & control conditions

Brain regions and the conditions that activated them (numbers refer to the experiments above):
Ventral premotor – (mirror units, monkeys)
STS – (mirror units, monkeys)
Premotor (somatotopic) – observing actions of others (1); remember for future action (Decety work) (2)
Left premotor – observe an achieved goal (5)
Parietal (right and left) – observing actions of others (1); remember for future action (Decety) (2)
Left parietal – imagining actions (own), i.e. self (4a); intention (human actions only, not inanimate, from exp comparing animate and inanimate) (3); imitate others' actions (4b)
Right parietal – imagining others' actions (4b); see others imitate own actions (4b)
Posterior cingulate – imagining others' actions (4a)
Frontopolar cortex – imagining others' actions (4a)
Medial prefrontal cortex – imitate others' actions (4b); see others imitate own actions (4b); differentiated the means-only from the goal-only condition of Exp 5. NB: from other research this area plays a critical role in inferencing, i.e. one needs to be able to observe the means in order to make inferences.
Right dorsolateral prefrontal – both means and goals condition (Exp 5)
Left somatosensory cortex – imagining actions (own), i.e. self (4a)
Left middle temporal gyrus – observing actions of others (1)
Left inferior frontal gyrus – observing actions of others (1)
SMA – remember for future action (Decety) (2)
Middle frontal gyrus – remember for future action (Decety) (2)
Parahippocampal gyrus (temporal lobe) – remember for future recognition (Decety) (2)
Right superior temporal gyrus – intention (human actions only, not inanimate, from exp comparing animate and inanimate) (3); visual analysis of others' actions (4)
Left superior temporal gyrus – visual analysis of others' actions in relation to actions performed by the self (4)
Cerebellum – both means and goals condition (Exp 5)

intention determines pattern of neural activity

p493 ‘these results support the notion of shared representations of self and other. The results also suggest a crucial role of the inferior parietal cortex in distinguishing the perspective of self from other’, and of the medial prefrontal cortex in inferring the actions of others, i.e. one needs to see the means of a task as well as the goals. This is consistent, including in terms of source location, with the work on theory of mind.

Theoretical speculation from combining developmental psychology and neuroscience: from imitation to social cognition.

Theoretical speculation

Proposed: a three-step developmental approach

(1)  Innate equipment. Newborns can recognise equivalences between perceived and executed acts. This is the starting state.
(2)  Constructing first person experience. Through everyday experience infants map the relation between their own bodily acts and their mental experiences. For example, there is an intimate relation between ‘striving to achieve a goal’ and the concomitant facial expression and effortful bodily acts.  Infants experience their own inner feelings and outward facial expressions and construct a detailed bidirectional map linking mental experience and behaviour.
(3)  Inferences about the experience of others. When infants see others acting ‘like me’ they project that others have the same mental experience that is mapped onto those behavioural states in the self.

Neuroscience – what is common and what is distinct between self and other at a neural level?

Temporal and frontal
There is neural activation in the posterior part of the temporal cortex and the medial prefrontal cortex whatever the imitation task, but only when the acts are human. Note that the medial prefrontal is also involved in mentalizing.

Parietal
There is differential activity, right versus left, in the inferior parietal lobe:

Left: p498 ‘left inferior parietal lobe computes sensory-motor associations necessary to imitate’ (consistent with the literature on apraxia).

Right (self–other): p498 ‘is involved in recognising or detecting that actions performed by others are similar to those initiated by the self and determining the locus of agency for matching bodily acts’. This implies that we have a body scheme, an idea consistent with neuropsychological evidence (disorders of bodily representation).

Summary
P498 ‘in light of our neuroimaging experiments, we suggest that the right inferior parietal lobule plays a key role in the uniquely human capacity to identify with others and appreciate the subjective states of conspecifics as both similar and differentiated from one’s own… In other words, the adult human framework is not simply one of resonance. We are able to recognise that everyone does not share our own desires, emotions, intentions and beliefs. To become a sophisticated mentaliser one needs to analyse both the similarities and differences between one’s own states and those of others. That is what makes us human’.

Diana 2012 book: on imitation
P163 ‘Learning through practice is different because it goes beyond the realm of language and representation. In terms of human evolution, learning through experience long predates learning through language. Learning through language and communication is, of course, a vastly more efficient way of passing on accumulated knowledge and skills, so the teaching professions from earliest times naturally made use of ‘teaching through telling’. Learning through practice in the form of learning through imitation has always been part of human, and indeed some animal, society; and learning through apprenticeship, where the imitation is accompanied by communication, is inevitably more efficient. When you learn flint-knapping and find you broke off too large a piece of slate, it saves time to have someone tell you to hit a different angle when you might have thought you were hitting too hard’
KRO: learning through imitation is different from learning through practice (which may mean learning through repeated association) or learning through doing, where actions and consequences can be related one to the other, i.e. embodied in some way.

Margie on Meltzoff, p148-149 (Chapter 7)
‘In Chapter 4 I noted Daniel Stern’s (2004, p. 76) claim that our nervous systems are designed to be ‘captured by the nervous systems of others’ as we observe their gestures, facial expressions, their rising and dampening affect and then model, intuit and re-run their intentions and psychological states. In recent years, a much clearer sense has emerged of the design features involved, how affective inter-subjectivity becomes established in infancy, and then shaped and ‘personalised’ within the relational patterns and interactional routines of childhood’ (Fonagy et al., 2004).

Based on Meltzoff: ‘In other words, what seems to be present at birth is an embryonic capacity to represent one’s own body and the other’s body and coordinate the two together’.