AI-Enhanced Interspecies Communication

While we humans have a hard time communicating among ourselves, we find it even more difficult to communicate with individuals of other species. The main interaction between species consists of eating members of other species, or being eaten. We humans consume huge amounts of pig, cow, and chicken meat, drink milk, and eat eggs, bread, rice, corn, and other plants. But when we are not eating plants and animals, we also like to talk to them. Some people even talk to their houseplants. And owners of dogs, cats, and horses of course talk to their pets all the time, maybe even considering them their soulmates. But do the plants and animals really understand what we are trying to say to them? And even more importantly, can we understand what a dog, horse, cat, or a mimosa or basil plant is trying to say to us?

Before we can start talking to others, we need to listen to what they have to say, and try to make sense of their output. Computers and artificial intelligence have made huge progress over the last twenty years in helping us both listen and talk to each other. Today’s New York Times gives a great overview of using computers to read the brain, explaining how computers can tell what we are thinking by tracking the activation of combinations of brain cells indicative of certain words. In the best current implementations, computers can recognize up to 250 different words with 90 percent accuracy by looking at our neurons. Computers have also “spoken” to genetically engineered mice, telling them when to drink water by switching on their neurons, “playing them like a piano”. Google Translate and DeepL have become language geniuses, translating on the fly from English to Chinese: I can talk to my phone in English in a hotel in Beijing, and the phone repeats my sentence to the hotel receptionist in Chinese.
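To make the decoding idea concrete, here is a minimal sketch of the kind of step such brain-reading systems perform: a classifier maps a vector of neural activation features to one word from a fixed vocabulary. Everything below, including the four-word vocabulary, the number of neurons, and the simulated activation patterns, is a hypothetical placeholder, not the actual method behind the systems described in the article.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical setup: each recording is a vector of activation levels
# for a set of recorded neurons; each label is the word being thought.
rng = np.random.default_rng(0)
vocabulary = ["water", "food", "walk", "play"]  # stand-in for ~250 words
n_neurons = 64

# Simulated training data: one cluster of activation patterns per word.
X = np.vstack([rng.normal(loc=i, scale=0.5, size=(50, n_neurons))
               for i in range(len(vocabulary))])
y = np.repeat(np.arange(len(vocabulary)), 50)

# A simple linear classifier stands in for whatever decoder
# the real brain-reading systems use.
decoder = LogisticRegression(max_iter=1000).fit(X, y)

# Decode a new activation pattern into a word.
new_pattern = rng.normal(loc=2, scale=0.5, size=(1, n_neurons))
print(vocabulary[decoder.predict(new_pattern)[0]])  # likely "walk"
```

Real decoders work with far noisier signals and much larger vocabularies, but the core pattern, mapping activation vectors to word labels, is the same.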

What if we could do the same when talking to a horse, a dog, or a mimosa, with the computer telling the animal or plant what we would like to say, and then the animal or plant talking back to us?

For the last two decades, our research group has been studying how humans communicate online and face to face. Along the way we have built many tools for happier, more creative, and more productive collaboration among humans, leveraging computers and AI to read the "honest signals" and emotions behind what a human really wants to say, beyond the literal meaning of the words.
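As a rough illustration of what "honest signals" means in practice, the sketch below computes a few simple how-you-speak features from an audio energy envelope rather than from the words themselves. The envelope, threshold, and feature names are hypothetical stand-ins, not our group's actual feature set.

```python
import numpy as np

# Hypothetical audio envelope of a conversation: one RMS energy
# value per second over a ten-minute recording.
rng = np.random.default_rng(4)
energy = np.abs(rng.normal(0.3, 0.2, size=600))

# "Honest signal" style features: how you speak, not what you say.
activity = energy.mean()                    # overall speaking energy
variability = energy.std() / energy.mean()  # how animated the speaker is
speaking_ratio = (energy > 0.25).mean()     # fraction of time above threshold

print(f"activity={activity:.2f} variability={variability:.2f} "
      f"speaking_ratio={speaking_ratio:.2f}")
```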

In the last few years we have been applying these algorithms and technologies to interspecies communication. We have used body-sensing technology to better understand communication with horses, and facial emotion recognition to understand the emotions of dogs and horses.
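One way to make the body-sensing idea tangible: if a wearable sensor records both the rider's and the horse's heart rate, the correlation between the two streams gives a crude synchrony score. The sketch below simulates such streams; the sampling rate, the shared slow rhythm, and all numbers are hypothetical, chosen only to illustrate the computation, not results from our studies.

```python
import numpy as np

# Hypothetical body-sensor streams: heart-rate samples (beats per minute)
# from a rider and a horse over the same five-minute session, at 1 Hz.
rng = np.random.default_rng(2)
t = np.arange(300)
shared_rhythm = 5 * np.sin(2 * np.pi * t / 60)  # common slow oscillation
rider_hr = 70 + shared_rhythm + rng.normal(0, 1.5, 300)
horse_hr = 40 + shared_rhythm + rng.normal(0, 1.5, 300)

# Pearson correlation as a crude synchrony score between the two bodies:
# values near 1 suggest the rider's and horse's rhythms move together.
sync = np.corrcoef(rider_hr, horse_hr)[0, 1]
print(f"rider-horse synchrony: {sync:.2f}")
```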

In our most recent projects we are trying to listen to plants. We have put “brain sensors” on mimosas, as they talk back to the outside world quite visibly: when their leaves are touched, they fold them. We found that they seem to sense the electrostatic discharge of human bodies and the rhythm of body movement near them. When we put our sensors on other plants such as basil, they show the same response. We also recorded the leaf movement of the “dancing plant” (Codariocalyx motorius) in response to human voice and music. Using automatic image recognition, we found that its leaf movement differed in reaction to male and female voices talking nearby.
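To give a feel for the image-recognition step, here is a minimal sketch of quantifying leaf movement from video by frame differencing, then comparing two playback conditions. The simulated clips, frame sizes, and noise levels are arbitrary placeholders; in particular, which condition produces more movement here is made up for illustration, since our finding was only that the responses differed.

```python
import numpy as np

def movement_score(frames):
    """Mean absolute pixel change between consecutive frames:
    a simple proxy for how much the leaves moved during a clip."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean()

# Hypothetical stand-ins for grayscale video clips
# (frames x height x width), one per playback condition.
rng = np.random.default_rng(3)
still = rng.integers(0, 255, size=(120, 64, 64))
clip_male = still + rng.normal(0, 2, size=still.shape)    # arbitrary motion
clip_female = still + rng.normal(0, 6, size=still.shape)  # arbitrary motion

print("male voice:  ", round(movement_score(clip_male), 1))
print("female voice:", round(movement_score(clip_female), 1))
```

A real pipeline would first detect and track the leaves before measuring motion, but the frame-to-frame difference is the basic quantity behind such measurements.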

As this is a brand-new emerging area of research, we are at the very beginning. Imagine how wonderful it would be not just to talk to your dog, cat, horse, or houseplant, but to have it talk back in understandable words, engaging in a real dialog.

Much more research is needed. If you are interested in collaborating with us, we would love to hear from you.


