A Sheffield Hallam University academic is researching intelligent social robots, which it’s hoped can have a range of positive impacts on patient health and wellbeing in the future.
HTN recently attended the HETT Show 2021, where Alessandro Di Nuovo, Professor of Machine Intelligence at the South Yorkshire university, gave a presentation on his research and findings.
The session, entitled ‘Intelligent Social Robots to promote independent lives’, provided real-life examples and case studies on how this form of AI could one day assist users with everyday activities and in clinical settings.
Three examples from the professor’s own work – robotic assistance for the elderly, robotic assistance for young children with autism, and robots acting as cognitive assessors – illustrated how “intelligent robots can be a suitable social partner, to assist and promote health and wellbeing”.
The researcher ran through four categories of intelligent social robots, from telepresence and robotic pets to the more advanced service robots and humanoids, which have more autonomy and intelligence but less ‘reliability’.
Professor Di Nuovo said: “Service robots…they have more complexity like having robotic arms…can include a very wide range of platforms and provide several different services to support people’s independence at home.
“For instance, they can facilitate the access to other technology in the house and support everyday life tasks like taking the temperature, heart rate, blood pressure, remind to take medicines or water, or support walking inside or outside and other physical activities.
“Finally, we have the last class, the most advanced, which is the humanoids…they can run advanced artificial intelligence algorithms, give the maximum level of autonomy in their activities. They can simulate complex social behaviour [and] provide advanced services very similar to those that are provided by human beings.
“They can also engage users through a variety of multi-modal interfaces, including more natural ways of communication, like speech, touch and gestures.”
However, the humanoid robots’ higher level of service and autonomy also brings higher expectations, so the professor went on to highlight the technology’s current limitations: “Performance is limited by the available technology, for instance battery or speech recognition…would inevitably impact in the real world.”
“We are making significant progress in the labs…but their complexity requires more multidisciplinary research and experimentation before they [become] fully operational in the real world.”
But the professor did provide three fascinating case studies from his collaborative research on how the robots can be, or may one day be, used to support independent living.
The first study – the Robot-ERA project – sought to discover the potential impact of social robots on the care of elderly people. As the professor explained: “[The study] created and tested 11 advanced robotic services, integrated in intelligent environments…[to] between them provide independent living solutions.”
Testing took place across four European countries – Italy, Sweden, the UK and Germany – and included over 100 elderly participants. The services provided by the robots had been identified by potential users and included carrying and manipulating objects, cleaning, disposing of garbage, surveillance, support with walking, and laundry.
Participants were invited to ‘living labs’ – replicated flats – to interact with the automated, robot-provided services. Measured on the System Usability Scale, the results indicated superior usability, placing the system in the top 10% of technologies assessed with the scale, and over 80% of participants reportedly expressed ‘strong agreement to adopt the system’.
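As a rough illustration of how such usability scores are derived (the study’s actual questionnaire responses are not reported here), the standard System Usability Scale converts ten five-point ratings into a 0–100 score, with scores around 80 and above commonly cited as sitting in the top tier of systems:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5 ratings.

    Odd-numbered items are positively worded, so (rating - 1) contributes;
    even-numbered items are negatively worded, so (5 - rating) contributes.
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5


# Illustrative ratings only, not data from the Robot-ERA trial:
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 4, 2]))  # -> 90.0
```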
The professor has also investigated social robots as ‘cognitive assessors’ – to “autonomously administer and evaluate a test for assessing the cognitive level of the user”.
In this project – Cognitive Assessment through Human-Robot Interaction (CATHI) – the data collected by the robots can be standardised, ensuring objectivity and removing assessor bias. It can then be stored securely in the cloud for clinicians to access and review.
As a “proof of concept of the idea”, the research team implemented and adapted a popular cognitive test to the multi-modal interfaces of a robot. The user was able to listen to the robot’s instructions and interact via a touch screen, sensors and microphones, while the system calculated a score to be used for screening and continuous assessment.
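How such a robot-administered test might fit together can be sketched briefly; the helper functions below (say_instruction, wait_for_touch_answer) are hypothetical placeholders standing in for a robot’s speech and touch-screen interfaces, not the CATHI project’s actual implementation:

```python
from dataclasses import dataclass


@dataclass
class TestItem:
    prompt: str          # instruction the robot reads aloud
    correct_answer: str  # expected selection on the touch screen


def administer_test(items, say_instruction, wait_for_touch_answer):
    """Read each item aloud, collect the touch-screen answer, return a total score."""
    score = 0
    for item in items:
        say_instruction(item.prompt)      # robot speaks the instruction (e.g. via its speech synthesiser)
        answer = wait_for_touch_answer()  # user responds on the touch screen
        if answer == item.correct_answer:
            score += 1
    return score  # a raw score like this could feed screening or continuous assessment
```

In a real deployment, the raw score would be standardised and uploaded to secure cloud storage for clinicians to review, as described above.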
The third project – H2020 CARER-AID – looked at the ability of robots to support children with disabilities and assist carers. This EU-funded research focused specifically on the assessment and therapy of children with autism.
Professor Di Nuovo explained: “Small humanoid robots proved to be effective social partners for the children…that because of their condition find [it] easier to interact with robots.”
Researchers designed robotic-led activities to train seven children to imitate movements and gestures and to detect emotions, over a four-week course of daily therapy.
“At the beginning, none of the children were able to imitate the movements or understand the emotions,” the professor said, “but then, after the robotic-led training they also demonstrated to have acquired new skills and to be able to generalise and apply them to interaction with other human beings.”
They also observed that, without further training, the children still retained these skills three months later.
Currently, this project is studying the use of a robotic assistant to “reduce anxiety and improve compliance in children with autism” at the Sheffield Children’s Hospital. The robots are utilised in potentially ‘scary’ or ‘stressful’ situations, such as before invasive treatment like having blood samples taken.
In a preliminary study, the team received “enthusiastic comments” from parents, who said the robots took the emotion out of the situation and improved compliance.
Find out more about the professor’s work and research via Sheffield Hallam University.