Overlooking Lake Geneva, a lab at the Swiss university EPFL is exploring ways to make robots more effective tools for teaching. Postdoctoral researcher Wafa Johal explains why this is so important.

Today robots are present in schools more than ever before. Unlike other digital technologies, such as tablet computers, they make physical and kinesthetic interaction with the digital realm possible.

Robots as a teaching tool

When robots are used in teaching, the focus is usually on programming or the principles of robotics – for instance, learners need to understand that programming a robot means mapping sensor input (e.g. a distance measurement) to motor output (e.g. turning left). In some cases students assemble robots from toolkits, in keeping with the “Maker Movement.”
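The sensor-to-motor principle described above can be sketched in a few lines of code. This is an illustrative stand-in, not a real robot API: the function name and the speed values are hypothetical, but the structure – a reading comes in, wheel speeds go out – is exactly what learners are asked to grasp.

```python
# Hypothetical sketch of the sensor-to-motor mapping learners meet when
# programming an educational robot: a distance reading comes in, a pair of
# wheel speeds goes out. All names and values here are illustrative.

def obstacle_avoidance_step(distance_cm, threshold_cm=10.0):
    """Return (left_speed, right_speed) for one control step."""
    if distance_cm < threshold_cm:
        # Obstacle ahead: spin left in place.
        return (-50, 50)
    # Path clear: drive straight.
    return (50, 50)
```

Calling `obstacle_avoidance_step(5.0)` returns `(-50, 50)` (turn), while `obstacle_avoidance_step(30.0)` returns `(50, 50)` (drive straight) – the whole behavior is just this mapping, applied over and over.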

But there are also pre-built robots such as Thymio, a small robot developed at EPFL by Francesco Mondada and his team. Their current popularity – Thymios are being sold at a rate of 2,000 per month – reflects a new mission for the education sector: helping people develop computational thinking skills.

Convinced that robots can do even more to promote education, our team at EPFL’s CHILI lab has developed a new palm-sized robot called Cellulo, which is designed to be manipulated by the user’s hands rather than programmed. Its innovative design allows the user to push the robot against its own movement without damaging it. The motion of the robot provides haptic feedback to the users, i.e. they feel forces through their hands. In a recent experiment, haptic feedback enabled learners to experience simulated wind across a map of Europe.

A new release of Cellulo features a set of 10 robots that can be collectively manipulated by a learner, introducing the notion of tangible swarm robotics and collective intelligence. When the learner rotates one Cellulo on a table, the other nine on that table rotate in the same direction. To understand how this works, take a look at this video.
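The tangible-swarm behavior described above can be sketched as a simple broadcast: the rotation applied by hand to one robot is relayed to every other robot in the group. This is a minimal illustration of the idea, not the actual Cellulo firmware; the class and function names are invented for the example.

```python
import math

# Illustrative sketch (not the actual Cellulo software) of the "tangible
# swarm" behavior: when the user rotates one robot by hand, the same
# rotation is broadcast to every other robot in the group.

class SwarmRobot:
    def __init__(self, name):
        self.name = name
        self.heading = 0.0  # orientation in radians

    def apply_rotation(self, delta):
        self.heading = (self.heading + delta) % (2 * math.pi)

def rotate_by_hand(robots, leader, delta):
    """The user physically turns `leader`; the others mirror the rotation."""
    for robot in robots:
        robot.apply_rotation(delta)

# Ten robots on a table; turning one makes all ten face the same way.
swarm = [SwarmRobot(f"cellulo-{i}") for i in range(10)]
rotate_by_hand(swarm, swarm[0], math.pi / 2)
```

After the call, every robot in `swarm` has the same heading – which is what makes the collective behavior feel like a single, distributed object in the learner’s hands.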

Humanoids for tutoring

Another type of robot, the humanoid (a robot designed to resemble the human body), is becoming ever cheaper and smarter. These robots are not intended to be manipulated or programmed by students; instead, the idea is to engage with them in many kinds of interactions.

Most young people find these humanoids cute and intriguing, but parents, teachers and journalists raise concerns about their use in educational contexts: “Are you going to replace teachers with robots?” is an often-heard question. Our favorite answer: “Yes, of course, and we will also replace kids with robots, and then all of the problems of education will be solved!”

But let’s be serious. Children don’t learn because of their interactions with a robot, but because they are engaging in certain activities that trigger cognitive processes; those processes, in turn, produce learning.

“Children don’t learn because of their interactions with a robot, but because they are engaging in certain activities that trigger cognitive processes; those processes, in turn, produce learning.”

There is a big difference between effortless interaction with a robot and a productive learning activity: While current research on human-robot interaction focuses on eliminating misunderstandings, education experts prefer to design robots in a way that allows for such misunderstandings and even disagreements. Indeed, misunderstandings and conflicts force the child to argue, reformulate and explain, and these activities are known to enhance learning.

Purposely misbehaving robots

Let’s illustrate this approach with two examples using Nao, a small commercial humanoid. In the Co-Reader activity, Nao reads aloud from a storybook, following the words with its fingers – and it sometimes makes a mistake. The child has to interrupt the robot and point out mistakes.

In the Co-Writer activity, children are asked to train the robot to take a writing test; this principle is called “learning by teaching.” In reality, the robot is not capable of practicing writing. It only appears to write on a tablet computer; the child then takes the tablet and corrects the “writing” the robot has produced.

This activity is designed for children around the end of their first year of elementary school. By that time most of them have learned the basics of writing, but some may still have weaknesses in this area – which can be a problem as they continue their schooling. Children who could normally concentrate for no more than 10 minutes, according to their therapists, would try again and again to teach the robot, persisting for up to 50 minutes. This is called the “protégé effect.”

“While current research on human-robot interaction focuses on eliminating misunderstandings, education experts prefer to design robots in a way that allows for such misunderstandings and even disagreements.”

The letters or words in this activity should be chosen with the child’s needs in mind. We adapted standardized tests for dysgraphia, i.e. an inability to write coherently, to the digital realm. By combining a high-quality tablet computer with machine-learning algorithms, we were able to arrive at a very accurate diagnosis in only a few seconds (instead of the three hours it takes to grade a paper-based test).
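The pipeline described above – a tablet records pen samples, simple features are computed from the strokes, and a model compares them to typical values – can be sketched as follows. The features and thresholds here are illustrative stand-ins chosen for the example, not the published diagnostic model.

```python
import statistics

# Hedged sketch of the idea behind tablet-based handwriting screening: each
# pen sample is an (x, y, pressure, timestamp) tuple, and simple stroke
# features (speed, speed jitter, pressure variance) are computed for a
# downstream classifier. The feature set here is illustrative only.

def stroke_features(samples):
    """samples: list of (x, y, pressure, t) tuples from one pen stroke."""
    speeds = []
    for (x0, y0, _, t0), (x1, y1, _, t1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            # Euclidean distance between samples, divided by elapsed time.
            speeds.append(((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt)
    pressures = [p for _, _, p, _ in samples]
    return {
        "mean_speed": statistics.mean(speeds),
        "speed_jitter": statistics.pstdev(speeds),
        "pressure_var": statistics.pvariance(pressures),
    }
```

Because the tablet captures hundreds of such samples per second, features like these can be computed the moment the child stops writing – which is what makes a seconds-long diagnosis possible, compared with hours of manual grading.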

Many concerns about educational robotics stem from an assumption that robots will be used to do the work of teachers. However, it’s not about replacing teachers. Rather than confining ourselves to applications that duplicate what teachers do, our lab explores out-of-the-box ideas, implements them and then tests them in the schools surrounding Lake Geneva.

These are just a few of the projects we are engaged in at EPFL, a Swiss university that emphasizes research and development in the areas of learning technologies and computational thinking. The activities go way beyond educational robotics. EPFL recently opened the Swiss EdTech Collider, a collaborative space that houses more than 30 start-ups active in all types of technologies for digital education.

Footnotes

The co-author of this blog post, Pierre Dillenbourg, is a professor of learning technologies at the Swiss Federal Institute of Technology Lausanne (EPFL). Formerly an elementary school teacher, he studied education research before completing a PhD in AI. His research projects focus on computer-supported collaborative learning, educational robotics, learning analytics, eye tracking and MOOCs, among other topics.

4 comments

  1. This activity is designed for children at the end of the first year of primary school. By then, most of them have learned the alphabet, but some still have weaknesses in this area – which becomes a problem as they continue their schooling.

    1. Yes, indeed the sooner the better. We actually work on two levels of the handwriting process, the educational aspect and a more medical/remediation level. For educational aspects, our learning activities usually target 5 y.o. children, as they know the letters but still need to practice to reach the quality and speed required for taking notes in class [1,2]. On the medical aspect, we work on the diagnosis of handwriting difficulties, which is usually done in occupational therapy centers, where they treat children from 6 to 14 y.o. [3]. Handwriting takes about 10 years to learn, and difficulties can occur at every stage of development.

      [1] Asselborn, T. L. C., Güneysu Özgür, A., Mrini, K., Yadollahi, E., Özgür, A., Johal, W., & Dillenbourg, P. (2018, June). Bringing letters to life: handwriting with haptic-enabled tangible robots. In Proceedings of the 17th ACM Conference on Interaction Design and Children (No. CONF, pp. 219-230). ACM.

      [2] Johal, W., Jacq, A., Paiva, A., & Dillenbourg, P. (2016, August). Child-robot spatial arrangement in a learning by teaching activity. In Robot and Human Interactive Communication (RO-MAN), 2016 25th IEEE International Symposium on (pp. 533-538). IEEE.

      [3] Asselborn, T., Gargot, T., Kidziński, Ł., Johal, W., Cohen, D., Jolly, C., & Dillenbourg, P. (2018). Automated human-level diagnosis of dysgraphia using a consumer tablet. npj Digital Medicine, 1(1), 42.

  2. There are some very interesting thoughts in this article. Can you back them up with any scientific sources, such as papers or journal articles?

    Kind regards
    Daniel

