From hospitals to the home and the classroom, Prof Belpaeme sees a bright future in which robots can help young children learn and stay healthy, and can provide care support.
For several years now, the work of Prof. Belpaeme and his co-workers has focused on the therapeutic potential of child-robot interaction. Work that started in research laboratories around Europe is now being taken up by commercial organisations keen to establish a “first to market” presence in these developing application areas, which potentially offer large commercial benefits alongside significant medical ones.
Projects like ALIZ-E (www.aliz-e.org), which focused on children with diabetes, have required multi-national and multi-disciplinary teams to bring numerous skills to the project. Belpaeme’s specific research interests lie in human-robot interaction, but expertise in artificial perception, language processing and psychology was also brought into the project, which started in 2010 and came to a close during 2014. An excellent video providing an overview of the project can be found here: https://vimeo.com/111655200
ALIZ-E has provided significant insights into how robots may be used to assist and help child patients who have diabetes. Now, in late 2014, a new EC-funded multi-disciplinary project, DREAM (www.dream2020.eu), is taking some of the lessons from the ALIZ-E work and building on them to study how robots can enhance therapeutic interventions with children who have Autism Spectrum Disorder (ASD).
Active8 Robots talked to Prof Belpaeme at the University of Plymouth about the use of the NAO robot in teaching within the university, and then went on to discuss some of the findings of the ALIZ-E project and the plans for the four-year DREAM project.
“We typically have two types of student at the University of Plymouth: those who like to build robots, and those who like to program them. For the latter we give them a NAO robot – it’s robust, it’s very easy to get to grips with, and it has a very low entry point. In a matter of minutes it’s possible to create a simple program and get some results.
NAOqi allows the student to access and program the robot via a number of languages, including Python and C++. In a recent project we also used YARP to hook up robots and computers in a large network, so that large data volumes can be handled for image processing. Although NAO has a camera, it doesn’t have the necessary processing power internally, so the processing happens elsewhere and the results are sent back to NAO.
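The offloading pattern described here – camera data leaves the robot, heavy processing runs on a networked machine, and only a compact result comes back – can be sketched with plain sockets. This is a hypothetical, simplified illustration, not NAOqi or YARP code; the toy “brightness” analysis and all names are invented for the example.

```python
# Sketch of off-board processing: a (simulated) robot sends a camera
# frame to a more powerful networked machine, which does the heavy
# analysis and returns a compact result. Illustrative only; a real
# NAO setup would use NAOqi/YARP rather than raw sockets.
import socket
import threading

def processing_server(host="127.0.0.1", port=0):
    """Off-board worker: accept one frame, send back a small result."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))      # port 0 = let the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            frame = conn.recv(65536)                       # the "image"
            brightness = sum(frame) // max(len(frame), 1)  # toy analysis
            conn.sendall(str(brightness).encode())         # compact result
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()    # (host, chosen port) for the robot side

def robot_side(server_addr, frame: bytes) -> int:
    """Robot side: stream the frame out, wait for the result."""
    with socket.create_connection(server_addr) as s:
        s.sendall(frame)
        s.shutdown(socket.SHUT_WR)   # signal end of frame
        return int(s.recv(1024).decode())

addr = processing_server()
result = robot_side(addr, bytes([10, 20, 30, 40]))
print(result)  # average "pixel" value, computed off-board
```

The point of the split is the one the professor makes: the robot only captures and transmits, while the compute-heavy step runs on hardware the robot doesn’t have.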
One of the nice things about NAO is that it can cover the principles of programming with secondary school kids, all the way through to very large academic research projects, and at a price that is incredibly affordable.”
The ALIZ-E Project
“The ALIZ-E project was a multi-group, multi-site European project, and a requirement was to have common systems across all collaborators to allow the easy sharing of information, results and approaches. We looked around for which robot to use, and there really wasn’t a good alternative to the NAO robot. It was robust, a European product, and there wasn’t anything out there that really competed with it – and there still isn’t. Even if we had to make the choice today, it would still be the NAO robot for our work.
We’re using NAO in medical research applications; it has CE certification, which is necessary for use in these environments and in commercial applications, and most other robots don’t have that.”
When you consider that this research work involves using the robot, ideally unsupervised, with a young child, you can appreciate the care and concern that has to go into making such a selection. Hospitals and other care centres were initially very cautious and conservative regarding the use of these small humanoid robots within their treatment areas.
“Five years ago it was incredibly challenging to get a hospital to co-operate with us. They were concerned by the unknown, they imagined large science-fiction robots and they were worried. We had to lobby for around two years before we eventually broke through their concerns and they agreed as long as the robot was one metre away from the child and an adult therapist was always present. Of course, that kind of gets in the way of doing the child-robot interaction research, but you have to start somewhere”.
After a couple of years, trust built up between the research groups and the clinicians so that, eventually, the robot and child were allowed to be in the room together and the child was allowed to touch the robot.
“From there it just took off. As soon as one hospital was seen to have these robots, you get other hospitals phoning you asking if you need another site for your research work. In the end we had eleven hospitals around Europe that we were working with, and we just couldn’t handle any more.”
The project focused on hundreds of children between seven and eleven years old and studied them whilst they played with and learned from the NAO robot. All the time the children were increasing their own understanding of diabetes and how they could best manage their condition. The presence of the robot in the paediatric ward gave the children increased confidence: they had a ‘friend’ supporting them, and hospital visits were no longer something to be anxious about.
The work has led to over 160 scientific papers being published and a number of insights into how children relate to social robots. For example, children are forgiving of mistakes made by the robot, and they form stronger relationships with it and retain more of what the robot has taught them. It has been found that children learn 50% more when educational material is delivered by the NAO robot than when the same material is presented on a screen, such as a TV or tablet.
“There will be a follow-up project to ALIZ-E, called PAL, which is going to look at extending the work in diabetes care and education. One of the things we couldn’t do with ALIZ-E was let the child take the robot home, as the robots are still too expensive. So, typically, the child will meet the robot in hospital, have a few sessions and learn from the robot. The robot will learn about the child’s needs and personality, and this will be transferred to an app that the child can use at home to continue the relationship and the learning.”
The DREAM project
The DREAM project (Development of a Robot-Enhanced therapy for children with AutisM spectrum disorders) started in early 2014 with the remit to explore the use of robots in delivering therapeutic interventions with a greater degree of autonomy for the robot, reducing the current requirement for a therapist to be present throughout. This will require the robot to make decisions based upon observing the movements and intentions of the child, rather than these decisions being taken by the therapist. The robot will also be able to function as a diagnostic tool and collect clinical data during these therapeutic interventions.
“DREAM is using the same underlying principles that we developed in ALIZ-E and applying them to children with ASD. We’ve known for years that children respond positively to robots, especially autistic children, for reasons we don’t fully understand yet. There are some theories out there, one is that the robot is repeatable – it always does the same thing (unlike humans) – another is that robots do not judge, you are allowed to make mistakes and the robot will not judge you.
What surprised me is how large the ASD spectrum is. The children are all different and sit on this very wide spectrum. However, there are so many heart-warming stories out there of how children with ASD meet a robot and respond in a way that their carers and parents have never seen before. The child will explain what the robot is doing; they respond well to robots and want to involve others in their robot interactions. So using a robot is really good, as it provides a focus of attention for a child with ASD.”
Much of the software developed during ALIZ-E to help NAO interact with the child has been ported across to DREAM, so the project is building on a good foundation. There are two areas that Prof. Belpaeme wishes to explore in this new project: using NAO as a therapeutic aid, and using it as a diagnostic tool.
“One use is as a therapeutic aid in developing ‘joint attention’. If you consider eye contact and other social skills, can we rehearse those with a robot? It’s unlikely that an ASD child will pick up these skills naturally, but they can learn to understand and rehearse them. Also, people have to learn ‘social distance’: what is too close, and what is too far away, when interacting with someone? These are social norms we learn and practise, and you can achieve this with a robot as well.
The other area is diagnostic. Right now, diagnosis of ASD requires the administration of a specific test, which unfortunately involves subjective interpretation and judgement by medical staff. This means there can be a lot of variability in the results of the test. So, for example, there is a “tick-box” for ‘Child looks at me when I call their name’. I’ve seen an ASD child playing intently with his toys, totally engrossed, and when the therapist calls the child’s name, they do not look up – so a box gets ticked.
There is a hope that we may be able to help here with NAO. We are keen to explore this as the robot could potentially provide a more objective, even quantifiable, assessment of ASD.”
In conclusion, we asked Prof Belpaeme to speculate about other possible research areas and how the future might look as the use and uptake of robots in these everyday situations increases.
“We’ve learned that children learn 50% more if the same educational content is provided by a robot rather than a TV or tablet. We don’t really know why this is. But this is powerful information.
So, imagine a classroom with 30 or more students but only one teacher. Some of the students will be faster, some will be slower, but the teacher has to work to the average ability of the class. There is not enough time to focus on those children who need additional attention. Now, imagine this classroom has a couple of robots within it. The robots recognise you as a student, see that you are having difficulties, and a robot approaches: “Hi, are you having trouble, do you want some help? Last time you didn’t do too well with your seven times table, shall we do this together?” We’ve trialled this in schools in Plymouth – and it works – and that’s what I can see happening in the future: your friendly robot classroom assistant that would help you when you are stuck.
That’s an area I would like to spend some time on….”
PDF Download – Professor Tony Belpaeme – University of Plymouth