University of Sussex study: next-generation technologies can stimulate different areas of the hand to convey feelings of happiness, sadness, excitement or fear

Human emotion can be transferred by technology that stimulates different parts of the hand without making physical contact with the body, a University of Sussex-led study has shown.

BRIGHTON, 21-4-2015 — /EuropaWire/ — Sussex scientist Dr Marianna Obrist, Lecturer at the Department of Informatics, has pinpointed how next-generation technologies can stimulate different areas of the hand to convey feelings of, for example, happiness, sadness, excitement or fear.

For example, short, sharp bursts of air to the area around the thumb, index finger and middle part of the palm generate excitement, whereas sad feelings are created by slow and moderate stimulation of the outer palm and the area around the ‘pinky’ finger.
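As a rough sketch of how such a mapping could be written down in software (a minimal illustration, assuming hypothetical parameter names, values and units; this is not the Ultrahaptics control interface), the reported associations might be encoded like this in Python:

# Hypothetical encoding of the reported emotion-to-hand-region mapping.
# Region names follow the description above; the rate_hz values are assumed.
EMOTION_PATTERNS = {
    "excitement": {
        "regions": ["thumb", "index finger", "middle of the palm"],
        "burst": "short, sharp",
        "rate_hz": 4.0,      # assumed: rapid repetition of air bursts
    },
    "sadness": {
        "regions": ["outer palm", "little finger"],
        "burst": "slow, moderate",
        "rate_hz": 0.5,      # assumed: slow, gentle repetition
    },
}

def describe(emotion):
    p = EMOTION_PATTERNS[emotion]
    return "%s: %s stimulation of %s" % (emotion, p["burst"], ", ".join(p["regions"]))

for e in EMOTION_PATTERNS:
    print(describe(e))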

The findings, which will be presented tomorrow (Tuesday 21 April) at the CHI 2015 conference in South Korea, offer “huge potential” for new innovations in human communication, according to Dr Obrist.

Dr Obrist said: “Imagine a couple that has just had a fight before going to work. While she is in a meeting she receives a gentle sensation transmitted through her bracelet on the right part of her hand moving into the middle of the palm. That sensation comforts her and indicates that her partner is not angry anymore.

“These sensations were generated in our experiment using the Ultrahaptics system.

“A similar technology could be used between parent and baby, or to enrich audio-visual communication in long-distance relationships.

“It also has huge potential for ‘one-to-many’ communication – for example, dancers at a club could raise their hands to receive haptic stimulation that enhances feelings of excitement and stability.”

Using the Ultrahaptics system – which creates sensations of touch through the air, stimulating different parts of the hand without contact – one group of participants in the study was asked to create patterns describing the emotions evoked by five separate images: calm scenery with trees, white-water rafting, a graveyard, a car on fire, and a wall clock. The participants could manipulate the position, direction, frequency, intensity and duration of the stimulations.
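As an illustration of the design space those five properties define (a minimal sketch; the field names, units and example values are assumptions rather than the actual Ultrahaptics control interface):

# Sketch of a stimulation pattern with the five adjustable properties.
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Stimulation:
    position: Tuple[float, float]   # where on the palm, e.g. normalised (x, y)
    direction: Tuple[float, float]  # direction the sensation moves across the hand
    frequency_hz: float             # how often the stimulation repeats
    intensity: float                # relative strength, 0.0 to 1.0
    duration_s: float               # how long the pattern lasts

# Hypothetical "excitement" pattern near the thumb and index finger
excited = Stimulation(position=(0.3, 0.7), direction=(1.0, 0.0),
                      frequency_hz=4.0, intensity=0.9, duration_s=2.0)
print(excited)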

A second group then selected the stimulations created by the first group that they felt best described the emotions evoked by the images. They chose the best two for each image, making a total of 10.

Finally, a third group experienced all 10 selected stimulations while viewing each image in turn and rated how well each stimulation described the emotion evoked by each image.

The third group gave significantly higher ratings to stimulations when they were presented together with the image they were intended for, showing that the emotional meaning had been successfully communicated from the first group to the third.
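One way to picture that comparison (a simplified, made-up illustration, not the statistical analysis reported in the paper) is to average the ratings each stimulation received on its intended image against those it received on the other images:

# Made-up ratings: (stimulation, image) -> mean rating from the third group.
ratings = {
    ("calm-1", "calm"): 5.1,       ("calm-1", "rafting"): 2.3,
    ("rafting-1", "rafting"): 4.8, ("rafting-1", "calm"): 1.9,
}

intended = [r for (stim, img), r in ratings.items() if stim.startswith(img)]
other    = [r for (stim, img), r in ratings.items() if not stim.startswith(img)]

print("mean rating on the intended image:", sum(intended) / len(intended))
print("mean rating on the other images:  ", sum(other) / len(other))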

Now Dr Obrist has been awarded £1 million by the European Research Council for a five-year project to expand the research into taste and smell, as well as touch.

The SenseX project will aim to provide a multisensory framework for inventors and innovators to design richer technological experiences.

Dr Obrist said: “Relatively soon, we may be able to realise truly compelling and multi-faceted media experiences, such as 9-dimensional TV, or computer games that evoke emotions through taste.

“Longer term, we will be exploring how multi-sensory experiences can benefit people with sensory impairments, including those that are widely neglected in Human-Computer Interaction research, such as a taste disorder.”

Catherine Bearder, Liberal Democrat MEP for south-east England, said: “I am thrilled Dr Obrist has been awarded this EU funding for her incredible research into such a ground-breaking side of science.

“This is an example of the EU investing in those research projects it sees as having great potential to change our lives.”

You can watch a short video about the research on YouTube (channel link in the notes below).

Notes to editors

University of Sussex press office contacts: James Hakner and Jacqui Bealing – press@sussex.ac.uk, 01273 678888.

‘Emotions Mediated Through Mid-Air Haptics’ by Obrist, M., Subramanian, S., Gatti, E., Long, B., and Carter, T. (2015) appears in the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’15) and will be presented at 16:30 KST (08:30 BST) on Tuesday 21 April 2015.

A copy of the paper can be downloaded at http://www.multisensory.info/wp-content/uploads/2014/08/paper1195.pdf.

More information about the research is available at: http://www.multisensory.info/touch/

The ACM CHI Conference on Human Factors in Computing Systems, the premier international conference of Human-Computer Interaction, takes place from Saturday 18 April to Thursday 23 April 2015 in Seoul, South Korea.

The research paper ‘Emotions Mediated Through Mid-Air Haptics’ has been selected to receive a SIGCHI Best of CHI Honourable Mention Award at CHI 2015.

About Sussex Computer Human Interaction
The Sussex Computer Human Interaction Lab (SCHI Lab) is based in the School of Engineering and Informatics at the University of Sussex. Dr Marianna Obrist, who leads the SCHI Lab, is particularly interested in the exploration of touch, taste, and smell as future interaction modalities for interactive technologies. Dr Obrist is bringing together an interdisciplinary and creative team of computer scientists, psychologists, designers, and engineers, supported by the recently started SenseX project, funded by the European Research Council (ERC Starting Grant Agreement 638605).

YouTube channel: https://www.youtube.com/channel/UCerl0z9OqDKPyggey4uuWoQ

Twitter: @obristmarianna

###

University of Sussex researchers used a system called Ultrahaptics to pinpoint areas of the hand that could be stimulated to evoke different emotions.
