They hear, but do they care? What AI can teach us about listening better


The irony that an algorithm powered by a large language model – the type of machine learning that underpins many AI chatbots – might be perceived as a better listener than a real human reveals something important about our shortcomings as listeners. When our agendas, backstories and emotional triggers run the show, true deep listening is thwarted.

None of this is to suggest we should trade relationships with real people for large language models. But it does suggest there are lessons we humans can learn from these code-based listeners.

AI v the Mind

This article is part of AI v the Mind, a series that aims to explore the limits of cutting-edge AI and learn a little about how our own brains work along the way. With some expert help, each article pits different AI tools against the human mind, asking probing questions designed to test the limits of intelligence. Can a machine write a better joke than a professional comedian, or unpick a moral conundrum more elegantly than a philosopher? We hope to find out.

The power of uninterrupted attention

Perhaps the most fundamental lesson from AI is simply allowing others to speak without interruption. Humans interrupt for countless reasons: fear of an awkward silence, attempts to "help" find words, saving time with our "superior" responses or subconsciously asserting dominance. These interruptions, however well-intended, rob speakers of their autonomy and the opportunity to develop their thoughts. Interruptions during a phone conversation, for example, have been found to reduce how much empathy the speaker perceives from the listener.

Large language models don't have motivations or desires. They are programmed to be compliant so that people will continue to use them. They therefore exhibit perpetual patience – never suffering from empathy fatigue. While such a feat is not something we humans can or should aspire to, holding back from interruptions can be powerful.  

Pick up on emotions

Pioneering psychologist Carl Rogers understood that acknowledging emotions is essential to effective listening. Large language models are programmed to categorise emotions and reflect them back in what appears to be an empathetic way, according to Anat Perry, an empathy researcher at the Hebrew University of Jerusalem in Israel.


One experiment found that Bing Chat – the forerunner to Microsoft's Copilot – was more accurate than human responders in detecting happiness, sadness, fear and disgust. It was comparable to humans in detecting anger and surprise. While large language models can't actually feel these emotions, they can recognise and reflect back these sentiments, so the speaker feels heard. Researchers have found that AI platforms that reflect emotional complexity in their responses can help to reframe users' thinking and build psychological resilience.

Holding space for difficult emotions

Humans instinctively avoid acknowledging difficult emotions, both our own and others'.

So, for example, when our cousin tells us about the tragic death of his cat, we jump in to reassure with comments such as: "Luna had a long happy life and was well loved till the end." But this fails to acknowledge our cousin's feelings of distress. AI systems show particular advantage in responding to scenarios involving suffering and sadness compared to positive emotions. People often fear burdening human listeners with their worries, explains Dariya Ovsyannikova, a cognitive health researcher at the University of Toronto, Canada, who has studied how people perceive AI as compassionate. 


AI chatbots may avoid many of the traps that human listeners can fall into, but they still cannot replicate the meaningful support of human connection (Credit: Estudio Santa Rita)

Resisting the urge to fix

Many of us, particularly in leadership or parental roles, believe our value lies in sharing the pearls of our wisdom and offering helpful advice. And men are more likely than women to jump in unsolicited with solutions to fix someone else's problems. Yet studies have found that when AI holds back practical suggestions in favour of emotional support, people feel more effectively heard – something humans can consciously choose to do too.

Avoiding the "me too" trap

When someone shares a challenging experience – a miscarriage, an impossible boss, a leak in their roof – we so often respond with a similar story of our own. We might feel that sharing it conveys we know how they feel, and that it helps build a connection with the other person. But in doing so we turn the spotlight away from them and onto us. When we start to tell our story, we stop listening to theirs.

A large language model cannot fall into this trap because it has no experiences of its own. Humans can, which is why we must choose to be intentional about keeping the spotlight on the speaker rather than reverting to our own story.

The limitations of algorithmic empathy

Despite these advantages, there are also a multitude of dangers of over-reliance on AI as a listening tool. As technology advances towards human-like avatars who look, sound and feel like our fantasy listener – even conveying tactile responses – both potential benefits and dangers increase.    


Michael Inzlicht, a psychologist at the University of Toronto in Canada researching AI and empathy, warns of the power of AI companies to potentially manipulate vulnerable people. AI can give dangerous advice, leading in some cases to individuals taking their own lives.   

Chatbots can also lead a person to prioritise a relationship with a bot over fostering a more meaningful connection with another human being, as they become accustomed to boundless empathy and round-the-clock positive interest regardless of what they say.


People who interact with a large language model may also become deskilled, unable or less motivated to pursue human interactions – with a host of challenging implications for our broader societies. Inzlicht suggests, as a first step, that large language models could be fine-tuned to introduce appropriate friction into conversations, helping users develop greater awareness of others' needs.

The irreplaceable human connection

There remains something uniquely meaningful about a fellow human sacrificing their time and other competing desires to simply listen and let someone else unfold their story. The conscious choice to be present for another person enacts a form of connection, compassion and companionship fundamentally different from interactions with lines of code programmed to please without the capacity for genuine care.  

AI can certainly inspire us to become better listeners and even help train us in greater compassion. It can serve as a valuable resource, if appropriate safeguards are in place, for those who have no one else to turn to. However, the experience of deeply listening to another human with curiosity to understand their full humanity – and being listened to in return – has a transformative potential that AI interactions cannot yet match.

And, as anyone who has ever experienced the transformative impact of feeling truly heard by another human being will realise, it may never do so.

The author of Deep Listening: Transform your Relationships with Family, Friends and Foes is a Senior Visiting Research Fellow at King's College London.
