Social robots are increasingly used in fields such as healthcare, education, and retail.

However, a visible disconnect remains between humans and these robots, primarily because the machines lack emotional intelligence.

To address this issue, researchers are working on a project that could change the way humans interact with robotic companions.

By combining artificial intelligence (AI) and wearable technology, they aim to create a robotic dog that can understand and respond to human emotions. 


Making robotic dogs 'alive'

Assistant Professor Kasthuri Jayarajah of the New Jersey Institute of Technology (NJIT) and co-principal investigator Shelly Levy-Tzedek of Ben-Gurion University of the Negev (BGU) are leading this research.

They aim to develop a robotic dog that adapts its behaviour and interactions to the user’s personality and emotional state.

Jayarajah’s goal is to develop a socially assistive model of her Unitree Go2 robotic dog.

“The overarching project goal is to make the dog come ‘alive’ by adapting wearable-based sensing devices that can detect physiological and emotional stimuli inherent to one’s personality and traits, such as introversion, or transient states, including pain and comfort levels,” the press release stated.

A video by the researchers shows the robot dog following gestures from its owner, who wears a special headband, creating the illusion of mind control.

Researchers aim to develop robotic dogs that can sense and respond to human emotions. Photo: Ying Wu College of Computing.

Overcoming challenges

The research has significant implications for vulnerable populations, such as the elderly and individuals undergoing therapy. Robotic dogs could offer companionship, emotional support, and therapeutic benefits, potentially revolutionising mental health care and addressing the issue of loneliness.

“While the concept of socially assistive robots is exciting, long-term sustained use is a challenge due to cost and scale,” the release noted, quoting Jayarajah.

To overcome this challenge, the team is investigating the use of wearable devices such as smartwatches and earphones to collect data on user attributes. These devices can monitor various physiological and behavioural signals, providing insights into a person’s emotional state.

“The project aims to combine such multimodal wearable sensors with traditional robot sensors (e.g., visual and audio) to objectively and passively track user attributes,” the researchers highlighted.

By integrating this wearable data with input from traditional robot sensors, the robotic dog can build a comprehensive understanding of its user. This will allow it to tailor its behaviour in real time, providing personalised interactions that meet the individual’s specific needs.
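To make the idea of multimodal fusion concrete, the sketch below shows one simple way such a system *could* work. It is purely illustrative: the signal names, weights, and behaviour labels are assumptions for the example, not details of the NJIT–BGU project, which has not published its fusion method.

```python
from dataclasses import dataclass

@dataclass
class WearableReading:
    """Hypothetical signals from a smartwatch or earphones."""
    heart_rate_bpm: float   # resting ~60, elevated ~120+
    motion_level: float     # 0.0 (still) to 1.0 (very active)

@dataclass
class RobotReading:
    """Hypothetical signals from the robot's own microphone/camera."""
    voice_pitch_norm: float  # 0.0 (low/calm) to 1.0 (high/stressed)
    face_visible: bool

def estimate_arousal(w: WearableReading, r: RobotReading) -> float:
    """Blend wearable and robot-side signals into a 0-1 arousal score.

    The weights here are arbitrary placeholders; a real system would
    learn them from labelled data.
    """
    hr_score = min(max((w.heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)
    return 0.5 * hr_score + 0.3 * w.motion_level + 0.2 * r.voice_pitch_norm

def choose_behaviour(arousal: float) -> str:
    """Map the fused score to a coarse robot behaviour mode."""
    if arousal > 0.7:
        return "calm"    # slow, soothing movements for a stressed user
    if arousal < 0.3:
        return "engage"  # playful gestures to invite interaction
    return "follow"      # neutral companion mode
```

For instance, a high heart rate combined with agitated speech would push the score above the "calm" threshold, while a relaxed, quiet user would trigger the playful "engage" mode. The point is only the pipeline shape: passive wearable signals and on-robot sensing feed one fused estimate, which drives behaviour selection.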

Early stages, yet promising results

This project, sponsored by the Institute for Future Technologies (IFT), a collaboration between NJIT and BGU, is still in its early stages.

Nonetheless, Jayarajah’s initial research on robotic dogs that understand and respond to gestural cues has already garnered attention. Her findings will be presented at the International Conference on Intelligent Robots and Systems (IROS) later this year.

However, the researchers acknowledge the technical challenges, such as the limited processing power and battery life of current robotic dogs.

“Robots like the Unitree Go2 are not yet up for big AI tasks. They have limited processing power compared to big GPU clusters, not a lot of memory and limited battery life,” said Jayarajah. 

Despite this, they are confident that their work will pave the way for a new generation of robotic companions that can connect with humans on a deeper level.

The fusion of AI and wearable technology holds immense potential to transform how we interact with robotic companions. Robotic dogs that can truly understand and respond to our emotions could open up new possibilities for enhancing human wellbeing and social connection.