Future Tech

This AI voice assistant is capable of showing ‘empathy’

Tan KW
Publish date: Tue, 02 Apr 2024, 04:15 PM

American startup Hume is launching a voice interface with empathic capabilities, powered by its own language model. This form of conversational artificial intelligence generates vocal responses adapted to the mood of the person it is talking to, with the aim of offering a more natural experience than any voice assistant to date.

Hume's product is the Empathic Voice Interface (EVI), an AI chatbot endowed with emotional intelligence that analyses vocal tones to understand when users have finished speaking and optimises its responses for user satisfaction.

Rather than the unnatural, mechanical-sounding responses for which ChatGPT's voice interface has been criticised, Hume promises a genuinely immersive conversation that takes into account both what you say and how you say it.

Each time you speak to EVI, the interface analyses your voice in terms of determination, interest, concentration, boredom, calm and satisfaction, and responds with the appropriate tone. The result is the impression of actually talking to someone. Curious users can try it out now: demo.hume.ai.

Already available for testing, the technology is due to launch fully in April 2024, at which point developers will be able to integrate it into their own projects. Potential uses are easy to imagine: this type of empathic voice interface could, for example, be deployed in customer service and call centers of all kinds.

Imagine an after-sales service capable of answering you attentively and empathically. The technology could also prove useful in the field of mental health, for conversing with patients, for example.
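For readers wondering what such an integration might look like in practice, here is a minimal, purely hypothetical Python sketch. It is not Hume's actual API; it simply illustrates the pattern the article describes, scoring a caller's speech along the six emotional dimensions mentioned above and choosing a matching tone for the reply. All of the names (EmotionScores, choose_tone, TONE_BY_EMOTION) are invented for illustration.

# Hypothetical sketch only: these names are illustrative and are NOT Hume's actual API.
# It models the general idea described in the article: score a user's speech along
# emotional dimensions, then pick a matching tone for the spoken reply.

from dataclasses import dataclass

# The six dimensions mentioned in the article.
DIMENSIONS = ("determination", "interest", "concentration",
              "boredom", "calm", "satisfaction")

@dataclass
class EmotionScores:
    """Scores in [0, 1] for each dimension, as an emotion model might return them."""
    determination: float = 0.0
    interest: float = 0.0
    concentration: float = 0.0
    boredom: float = 0.0
    calm: float = 0.0
    satisfaction: float = 0.0

    def dominant(self) -> str:
        # Return the dimension with the highest score.
        return max(DIMENSIONS, key=lambda d: getattr(self, d))

# A simple, invented mapping from the dominant emotion to a response tone.
TONE_BY_EMOTION = {
    "determination": "encouraging",
    "interest": "enthusiastic",
    "concentration": "concise",
    "boredom": "lively",
    "calm": "warm",
    "satisfaction": "upbeat",
}

def choose_tone(scores: EmotionScores) -> str:
    """Pick a reply tone that matches the caller's dominant emotion."""
    return TONE_BY_EMOTION[scores.dominant()]

if __name__ == "__main__":
    # Example: a caller who sounds bored and unsatisfied gets a livelier reply.
    caller = EmotionScores(boredom=0.7, satisfaction=0.2, calm=0.4)
    print(f"Dominant emotion: {caller.dominant()}, reply tone: {choose_tone(caller)}")

In a real integration, the emotion scores would come from the provider's speech-analysis model rather than being hard-coded, and the chosen tone would steer the generated voice response.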

 - AFP Relaxnews
