Ten years ago, I wrote an article for Cosmos magazine titled ‘Why robots will not replace classroom teachers any time soon.’ The article identified two significant problems. One was that robots were not capable of effective two-way communication with children. The other was that robots could not help children overcome a difficulty. Simply re-posing questions that a child is getting wrong does not provide insight into how to think differently, to make ‘the penny drop,’ so to speak.

Communication with computers has improved significantly over the past ten years. The advent of Generative Artificial Intelligence in systems like ChatGPT gives an impression of effective communication. However, an important deficiency remains: such systems are not emotional. The purpose of this short article is to explain the claim that robots (or systems using current versions of artificial intelligence) are not emotional.

What exactly is a robot?

Essentially, a robot is a mechanical device that can be programmed to follow a set of instructions. It has a processing unit, sensors to perceive its environment, and motors and actuators to move its limbs or wheels. It may speak, make other sounds, or flash lights and colours in response to its environment, exactly as instructed. A minimal sketch of this sense-process-act loop follows.
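To make the definition concrete, here is a minimal Python sketch of the sense-process-act loop such a device runs. Every name in it (read_sensors, decide, act) is a hypothetical placeholder rather than any particular robot’s API.

```python
# A minimal, purely illustrative sense-process-act loop.
# All function names are hypothetical placeholders, not a real robot API.

import time


def read_sensors():
    """Return raw readings from the robot's sensors (distance, light, sound)."""
    return {"distance_cm": 120, "light_level": 0.6}


def decide(readings):
    """Map sensor readings to an action, exactly as programmed."""
    if readings["distance_cm"] < 20:
        return "stop"
    return "drive_forward"


def act(action):
    """Drive motors, play sounds, or flash lights according to the chosen action."""
    print(f"Executing: {action}")


if __name__ == "__main__":
    for _ in range(3):  # a few cycles of the control loop
        act(decide(read_sensors()))
        time.sleep(0.1)
```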

What exactly is an emotion?

The nature of emotions is debated by psychologists. The prevailing view used to be that there is a universal set of emotions experienced by people worldwide, such as happiness and sadness, and that these can be detected from facial expressions and body language. A more recent view is the Theory of Constructed Emotion, in which people learn to interpret bodily sensations and associate emotion concepts with them. In simple form, the theory claims that “In every waking moment, your brain uses past experience, organized as concepts, to guide your actions and give your sensations meaning. When the concepts involved are emotion concepts, your brain constructs instances of emotion.” That is not what robots do.

Now consider a companion robot with vision and sound capabilities, an increasing number of which are available as products. Such a robot could be programmed to laugh when it sees a face painted on a toy. I contend that the robot laughing on recognising the face on the toy would not be an emotional reaction. The laughing is of no benefit to the robot in terms of settling it down, as was the case for my daughter. Holding up a toy in front of a robot would not be a way for it to deal with discomfort in the world around it. Nor would the robot remember it as a comforting experience.
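As a purely hypothetical illustration of that point, the robot’s ‘laugh’ could amount to nothing more than a rule like the following Python sketch; detect_face and play_sound are assumed placeholder functions, not any real product’s API.

```python
# Hypothetical sketch of the 'laughing' companion robot described above.
# detect_face and play_sound are placeholder functions, not a real robot API.

def detect_face(camera_frame):
    """Stand-in for a computer-vision face detector."""
    return "face" in camera_frame


def play_sound(filename):
    """Stand-in for the robot's speaker output."""
    print(f"Playing sound: {filename}")


def on_new_camera_frame(camera_frame):
    # The entire 'emotional' behaviour is this one rule, chosen by the programmer.
    if detect_face(camera_frame):
        play_sound("laugh.wav")
    # Nothing else happens: no bodily sensation, no relief, and no memory
    # of the encounter as a comforting experience.


on_new_camera_frame("a toy with a painted face")
```

To the robot, the rule above is indistinguishable from flashing a light when a button is pressed.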

It is important to appreciate that robots do not have bodily sensations as people do. I do not believe it is helpful to describe robots in emotional terms. Any identification of emotion concepts is made by the programmer or designer, not by the robot. To relate this to the Theory of Constructed Emotion: the robot can use an emotion word, but it is not having the same emotional experience as a human.

Incidentally, robots do elicit emotions in people. However, that is because of people’s inherent tendency to anthropomorphise objects and treat them as social beings. Even robots that do not resemble humans, such as the Roomba, ElliQ, the Starship delivery robot, and AIBO, can bring about strong emotional reactions in humans. For example, in 2017 an incident was reported in Estonia where a drunk man attacked a Starship delivery robot, kicking it several times. I have personally watched, many times, the delight when people see a Nao robot performing a dance to ‘Gangnam Style’.

While I am not a teacher, I believe that being able to relate to children’s emotions is an important skill for teachers. Robots are incapable of directly experiencing emotions. A programmed comment that a robot is sorry is not the same as feeling sorry, and people are usually able to discern the difference. I still stand by the title of my article from ten years ago: robots will not replace teachers in the classroom any time soon.

About The Author

Leon Sterling

Professor in Software Engineering, Computing and Information Systems, The University of Melbourne

Professor Leon Sterling is a career academic with a distinguished record. After completing a PhD at the Australian National University, he worked for 15 years at universities in the UK, Israel and the United States.

He is an academic based in Melbourne, Australia with a 40+ year career working across the fields of artificial intelligence, ICT and design.

His current research focuses on incorporating emotions in technology development, where motivational models are an essential element.