
Conversation practice with emotion: Meet Hume.ai

If the socials are anything to go by, so many of us language learners are already using AI platforms for conversation practice – whether text-typed, or spoken with speech-enabled platforms like ChatGPT.

Conversational interaction is something that LLMs – large language models – were created for. In fact, language learning and teaching seem like an uncannily good fit for AI. It’s almost like it was made for us.

But there’s one thing that’s been missing up to now – emotional awareness. In everyday conversation with other humans, we use a range of cues to gauge our speaking partner’s attitude, intentions and general mood. AI – even when using speech recognition and text-to-speech – is flat by comparison. It can only simulate true conversational interplay.

A new AI platform is set to change all that. Hume.ai has empathy built in. It uses vocal cues to determine the probable mindset of the speaker for each utterance. For each input, it selects a set of human emotions and weights them independently (so the weights needn't sum to 100%). For instance, it might decide that what you said was 60% curious, 40% anxious and 20% proud. Then, mirroring that, it replies with an appropriate intonation and inflection.
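To picture how that weighting might work under the hood, here's a minimal sketch in Python. This is purely illustrative and not Hume.ai's actual API; the function and data names are made up for the example:

```python
# Hypothetical sketch of per-utterance emotion weighting.
# The names and structure here are illustrative only, NOT Hume.ai's real API.

def top_emotions(scores, n=2):
    """Return the n highest-weighted emotions for an utterance."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Each emotion gets an independent weight, so they needn't sum to 1.
utterance_scores = {"curiosity": 0.60, "anxiety": 0.40, "pride": 0.20}

dominant = top_emotions(utterance_scores)
print(dominant)  # [('curiosity', 0.6), ('anxiety', 0.4)]

# A reply generator could then condition its tone of voice on these
# weights - for example, sounding reassuring when anxiety scores high.
```

The key design point the example tries to capture is that emotions are scored as independent dimensions rather than a single label, which is what lets the system mirror mixed feelings like "curious but a little anxious".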

The platform already supports over 50 languages. You can try out a demo in English here, and prepare to be impressed – its guesses can be mind-bogglingly spot-on. Although it’s chiefly for developer access right now, the potential usefulness to language learning is so clear that we should hopefully see the engine popping up in language platforms in the near future!
