Written by:
Paul Holland
09 Jun 2021

Imagine having a wellbeing coach available 24/7 who can reliably improve your mood. You never feel judged, the coach is always on top form, and sessions are tailored to your personality to keep you mentally well.

Dr Hatice Gunes, Fellow in Computer Science, hopes to make this vision a reality – in machine form. A Reader in Affective Intelligence and Robotics at the University of Cambridge’s Department of Computer Science and Technology, she wants to create emotionally intelligent robots that can help make us more resilient to life’s challenges.

“The need for mental wellbeing support just keeps growing,” says Gunes. “I have worked with many people experiencing mental health challenges, and the COVID-19 pandemic has only escalated the problem.”

Mindfulness training is a wellbeing approach that can reduce anxiety, and help people regulate emotions and become more resilient to stress. In essence it is about paying attention to the present moment: where we are and what we’re doing, in a non-judgemental way. In doing so it helps put space between ourselves and our reactions.

Gunes has practised mindfulness herself for over a decade and says it has been of great benefit in both her personal and professional life. But she says a lack of trained teachers and training programmes, and difficulties with establishing a guided, regular practice make mindfulness challenging and inaccessible for many people. Her solution is to develop interactive, human-like robot coaches.

This is not a fix for depression and other mental illnesses. Gunes sees robots as a way to make wellbeing practices such as mindfulness more accessible to healthy adults, to prevent mental health deteriorating to the point where medical help is needed.

“Humanoid robots have so much potential to help us stay mentally well – I really believe they can improve quality of life,” says Gunes, “and they’re so much better than an app. Robots have a physical presence – people tend to find them more engaging, which has more impact on their behaviour.”

“I want to work out how a robot can engage a person in a wellbeing practice such as mindfulness – how to express empathy, provide feedback, and instruct and demonstrate in a personalised way,” she says. It’s not just a technological challenge: a robotic coach also has to be accepted as a realistic option.

Gathering opinions

Gunes is taking an iterative approach to achieving her vision, beginning with two projects to identify specific challenges so they can be addressed from the early stages of the robot’s design. Using a focus group and interviews, her team explored people’s attitudes to being coached by a robot, and what they would expect from it. Everyone in the group had experience of some form of wellbeing practice, such as mindfulness or life coaching.

“Some people were more open to the idea of a robot coach than others, but many said they would give it a try if there was evidence to show the sessions were effective,” says Minja Axelsson, a PhD student working on the project.

Axelsson asked the discussion group what the robot should look like: should it be a machine without a face? Or if it has a face, would an animal or a human face be better? The group were shown videos, made by external developers, of different types of robot to help them understand the capabilities of each.

These included Pepper, ‘the emotional humanoid robot built to benefit mankind’; MiRo, ‘the empathetic dog-bot who could be your granny’s next carer’; and Jibo, ‘the first family robot’, which was the most machine-like of all.

“We found there wasn’t any clear preference on what the robot should look like,” says Axelsson. “But what was consistent was that participants expected its form to match its function. So, if the robot is going to offer reminders and interact through empathetic gestures, then a dog-like robot could work. But if it is going to hold conversations and offer advanced interventions like solution-focused coaching, it should be more humanoid.”

The group’s biggest concern about using a robot wellbeing coach was privacy. “There were also worries about the cost, and the ability to operate such a complex technology,” says Axelsson. “All these things will be taken into account as the robot is developed.” Three human wellbeing coaches were also interviewed separately to gather deeper insights and suggestions.

As good as a human?

A second project investigated how effective mindfulness sessions with a robot could be. Study participants attended coaching sessions with Pepper the robot, whose humanoid form earned it likeability points from the start.

Pepper has not yet been developed as a mindfulness coach: in these sessions it was actually being tele-operated by a human coach in the room next door. Through a virtual reality headset the human could see and hear things from the robot’s perspective, speak through the robot’s mouth, and make movements that were projected onto the robot’s moving parts using real-time pose recognition.
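The core of that setup – real-time pose recognition translating the coach’s movements into robot joint commands – can be sketched very roughly. This is purely illustrative: the function names, the 2D keypoint format and the joint limits below are assumptions, not the team’s actual implementation.

```python
import math

def arm_angle(shoulder, elbow):
    """Angle of the upper arm relative to hanging straight down, in degrees.

    Keypoints are (x, y) in image coordinates, with y growing downwards,
    as a pose-recognition model might report them.
    """
    dx = elbow[0] - shoulder[0]
    dy = elbow[1] - shoulder[1]
    return math.degrees(math.atan2(dx, dy))

def clamp_to_joint_limits(angle, lo=-120.0, hi=120.0):
    """Physical robots have joint limits; out-of-range commands are clipped."""
    return max(lo, min(hi, angle))

# Mock pose frames standing in for the live pose-recognition stream
frames = [
    {"shoulder": (0.50, 0.30), "elbow": (0.50, 0.45)},  # arm hanging down
    {"shoulder": (0.50, 0.30), "elbow": (0.65, 0.30)},  # arm raised sideways
]

for frame in frames:
    cmd = clamp_to_joint_limits(arm_angle(frame["shoulder"], frame["elbow"]))
    print(round(cmd, 1))  # joint command that would be sent to the robot
```

In the real system this mapping would run continuously, alongside the audio and video channels relayed through the VR headset.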

“The session participants knew that Pepper was not working autonomously, but they still got the impression that they were interacting with a robot,” says Dr Indu Bodala, a post-doctoral researcher in the team, “so we can see how they perceive the experience compared to interacting with a human coach.”

Over five weeks, one group of participants had weekly mindfulness sessions with the robot, and another group was coached weekly by a human. The team found interesting differences between the two.

“We asked people about their state of mind before and after each session. Both human and robot coaches helped the participants feel more relaxed and calm in every session. But there was more variability in how pronounced this effect was with the human coach.”

The researchers think that Pepper may have provided more consistent sessions because there was no additional, unintended communication taking place.

Group mindfulness session with Pepper the robot, teleoperated by a human coach in another room

“Human-to-human interactions are influenced by non-verbal behaviours,” says Gunes. “Say one day the coach is feeling anxious or tired, for example. This might be communicated through unconscious signals from the face, voice and body that the participant picks up on. With robots these things can be minimised. Robot behaviours can be programmed to be stable, predictable and reproducible.”

Participants in the study were also asked about their attitude towards the robot after each coaching session – to see whether results were consistent over time and not influenced by an initial novelty effect. In fact, their perceptions all improved over time, which the team thinks could be due to participants becoming accustomed to the robot’s capabilities. Interestingly, participants’ personalities affected how enjoyable they rated the sessions.

“Those scoring higher for conscientiousness expected the robot to move more naturally than it currently does. Those with higher neuroticism didn’t enjoy the sessions as much,” says Bodala. “These results show us that design of the robot coach requires personalisation – we have to match the coaching to aspects of people’s personality for it to be most effective.”

Refining the machine

Gunes’ next project is working to give the robot coach some level of autonomy, and she stresses that iterative design remains crucial to getting it right. She also aims to integrate data on facial and bodily expressions into the machine-learning models to support adaptive, more realistic two-way interactions. This relies on recording participants’ faces during coaching sessions – something that mask-wearing during the pandemic has stalled.

There is still a lot of work to be done in transferring the abilities of an experienced wellbeing coach to the robot so that it can work autonomously. And as many participants pointed out: a robot cannot replace a human in providing wellbeing solutions. But what it could do is fill the gaps where a human coach can’t be.

“There are places where wellbeing robots could be very useful,” says Gunes, “especially within organisations – I can imagine having a dedicated room at work where people can go for a private wellbeing coaching session whenever they need it.”

Gunes says the robots could be continually improved through their ‘collective intelligence’: data from multiple robots interacting with multiple people could be combined to help provide more customised services for different cultures or target groups.

“I know it’s far-fetched,” she says, “but this is what I’m working towards. I hope, one day, wellbeing robots will become another assistive technology that can be prescribed just like a pair of glasses.”

This story was originally published by the University of Cambridge

Main image: Paolo Negri on Flickr

This research is funded through a five-year Fellowship from the Engineering and Physical Sciences Research Council (EPSRC).

Research enquiries: Dr Hatice Gunes

Media enquiries: Jacqueline Garget