Future Tech

Digital relationships: can your voice assistant truly be your friend?

Tan KW
Publish date: Sat, 22 May 2021, 10:42 AM

Voice assistants have become part of everyday life for many of us. They answer questions, help to get information quickly, play music on demand and remind us of appointments.

With friendly voices and pre-programmed answers to funny or philosophical questions, you could almost get the impression that you are cultivating a kind of relationship.

But only almost.

When interacting with technology, people like to anthropomorphise objects to explain processes they otherwise don't understand, says Esther Goernemann from the Vienna University of Economics and Business.

She says participants in studies report that Alexa is "insulting", "cheeky" or "charming", or even "a little family member who sits at the breakfast table in the morning". The tendency to anthropomorphise is particularly pronounced among children.

But Goernemann believes there is also a social motive for humanising objects - and this is where it gets interesting in relation to the pandemic. "We try to compensate for a lack of social bonding with other people," she says. If you are lonely you are more inclined to form social bonds with objects.

Make a phone call!

In general, though, you shouldn't worry if you notice you're talking to a digital assistant a lot, says Professor Arvid Kappas of Jacobs University in Bremen, Germany.

"We know that solitary confinement is one of the worst things you can put people through," he explains, but cautions that you're better off talking on the phone with real people.

The fact that children, in particular, can perceive voice assistants as real beings does not surprise Professor Kappas. "We don't worry when children talk to their teddy bear for a long time and think that the teddy bear has a soul."

The latest generation of voice assistants can understand language much better than was previously the case. Nevertheless, we are still relatively far away from being able to have a deep conversation with an assistant.

Esther Goernemann shares this view, but believes that this could soon change due to technical progress in the field of artificial intelligence (AI). "We now have AI that can formulate amazingly good texts and is surprisingly creative and versatile," she says.

"Such a good language model is an essential component for a voice assistant can build a social connection with."

Limitations and opportunities

Voice assistants are not empathetic soul mates - they're just another medium for communication and speeding things up, says Professor Andreas Dengel, Director of the German Research Centre for Artificial Intelligence (DFKI).

"Humans also need negative conversations in order to feel empathy," he says. "Interpersonal communication is more multi-layered and multi-dimensional than a conversation with a voice assistant."

Professor Dengel warns against allowing children to play with voice assistants too much, as this could have a negative effect on their ability to communicate.

"Communication consists not only of speech, but also involves many non-verbal forms of communication, such as facial expressions and gestures or the reflection of the other person,” he says, “and you just don't learn that with these devices."

While many limitations remain, Professor Kappas believes voice assistants can be very liberating for older people, for example by reminding people of appointments or to take medication.

"A natural voice interface is much more suitable for older people who may not be able to type or look at a screen very well," he says. You can also simply ask the voice assistant to call someone, without searching for a number and typing.

Surveillance and advertising

Voice assistants always carry the risk of surveillance, says Esther Goernemann. "I see it as problematic that we reveal more personal information when we establish a social relationship with our voice assistant. This happens involuntarily and we may be unaware of it."

Manufacturers have already filed patents for systems designed to pick out advertising-relevant keywords from voice inputs. Over a long period of time, companies would be able to determine, for example, which advertising is likely to work, and when.

Advertising could then be adapted to situations so individually that people would not even notice that their own behaviour was being manipulated, Goernemann warns.

"As long as tech giants screen us down to the smallest detail and this process remains as opaque as it is now there is a danger that we will behave the way the manufacturer wants us to and we won't even notice."

 - dpa
