Artificial intelligence keeps pushing the boundaries of what seems possible. Some of its most recent breakthroughs, such as turning text prompts into images with DALL·E and into full videos with Sora, are striking. Not everyone is focused on content generation, however; some companies are using the technology to build AI companions. Pi, Inflection AI’s “intelligent and kind” chatbot, is one example. Now Hume, a New York-based research lab and technology startup, has introduced what it calls the ‘first conversational AI with emotional intelligence’.
What is Hume’s AI speech interface?
To put things in perspective, contemporary AI apps follow instructions. Imagine if they could also grasp the user’s emotions and the meaning behind their words. That is what Hume’s new AI speech interface aims to do. The startup says it is developing ‘empathic AI that will serve human well-being’ via an application programming interface (API) capable of interpreting emotional expression and responding empathically. For the uninitiated, an API can be thought of as a channel that connects one piece of software with another.
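For readers who have never used one, the snippet below is a purely illustrative sketch of that idea: one program asking another for data through an API. The URL and response fields are invented for illustration and have nothing to do with Hume’s actual service.

```python
import requests  # common Python HTTP client

# Hypothetical example: the URL and fields below are made up to show the
# general pattern of one program talking to another through an API.
response = requests.get("https://api.example.com/weather", params={"city": "New York"})
data = response.json()          # the other program replies with structured data
print(data["temperature"])      # our program can now use that reply
```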
Hume refers to its breakthrough as the Empathic Voice Interface (EVI)
According to its website, Hume’s conversational AI is not a stand-alone program; it is designed to power other applications. The company calls the technology the Empathic Voice Interface (EVI): an API driven by its own empathic large language model (eLLM), which reportedly understands and emulates tone of voice and word emphasis to improve human-AI conversations.
Arguably the most notable aspect of EVI is this capacity for integration. Hume claims the technology could transform a range of fields by bringing more human-like interaction to the applications built on top of it.
Potential applications include AI assistants that interact in a more human-like manner, customer support agents that offer more natural and relatable service, and therapy tools that can recognize a wide spectrum of human emotions and thoughts. Hume also provides a short guide for anyone who wants to access and experiment with the technology.
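For developers curious about what such an integration might look like in practice, here is a rough sketch. The WebSocket URL, authentication method, and message fields are placeholders invented for illustration, not Hume’s documented EVI API; the company’s own guide should be consulted for the real details.

```python
# Minimal sketch of wiring an empathic voice API into an application.
# Everything endpoint- and format-related here is an assumption made
# for illustration only.
import asyncio
import base64
import json

import websockets  # pip install websockets


async def chat(audio_chunk: bytes, api_key: str) -> None:
    # Hypothetical WebSocket endpoint and auth scheme.
    url = f"wss://api.example-voice.ai/v0/chat?api_key={api_key}"
    async with websockets.connect(url) as ws:
        # Send one chunk of microphone audio, base64-encoded inside JSON.
        await ws.send(json.dumps({
            "type": "audio_input",
            "data": base64.b64encode(audio_chunk).decode(),
        }))
        # Hypothetical reply: detected emotion scores plus the text of the
        # spoken response to play back to the user.
        reply = json.loads(await ws.recv())
        print(reply.get("emotions"), reply.get("text"))


# Example usage (with real microphone bytes and a real key):
# asyncio.run(chat(microphone_bytes, "YOUR_API_KEY"))
```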
How to use Hume AI?
Trying Hume’s demo is straightforward. Go to its website, hume.ai, grant microphone access, and start talking. Users can speak freely, and the AI answers while displaying its reading of the speaker’s tone and emotional state on an on-screen bar.
As with any chatbot, avoid revealing personal or sensitive information during the conversation. In our own trials, the AI showed a sense of humor and matched the user’s tone, making for an engaging exchange.
The company says its mission is to ensure that AI is designed to serve human needs and emotional well-being. It is named after David Hume, the 18th-century Scottish philosopher and historian. “Hume claims that emotions influence choice and well-being. At Hume AI, we view this as a guiding concept for ethical AI: to fulfill our desires, algorithms should be influenced by our emotions,” reads the company’s bio.