"Empathic" AI has arrived: will you empathize with it?

2024-04-09

When we think of emotional artificial intelligence (AI), our impressions largely come from science fiction, where emotional AI is portrayed either as a lonely being seeking the same love humans do, or as a coldly calculating electronic brain. Now Hume AI, a startup whose CEO is a former DeepMind researcher, has released an empathic voice interface (EVI), touted as the "first conversational AI with emotional intelligence," which can detect 53 different emotions in users. It is a technological leap, and Hume AI has raised $50 million in Series B financing; but with it comes a mix of hope and unease.

Is emotion the watershed of AI?

In 1997, IBM's Deep Blue defeated the world chess champion. Its raw computing power and pure brute-force search shocked humanity and changed the direction of AI development. Brute-force methods were later carried into machine learning, easily winning games with seemingly countless possibilities. Today, a constant stream of news about AI challenging humans in one field after another can give the illusion that computers match humans in cognitive ability. But a gap remains between the two. As machine learning and natural language expert Greg Holland reminds people, the human brain can solve problems it has never seen before, while machine learning systems are still designed for specific problems.

In fact, when gauging how far AI has come, it is necessary to distinguish strong AI from weak AI. "Strong" refers to the belief that a computer is not merely a tool but can genuinely think: such a machine would have perception and self-awareness. "Weak" refers to the belief that machines capable of true reasoning and problem-solving cannot be built, and that AI is ultimately a tool without autonomous consciousness. For a long time, mainstream research has revolved around the "tool" view and achieved remarkable results.
Even so, many people now see whether a system has perception or emotion as a watershed in AI's development.

Does AI need to understand emotions?

Is understanding and applying emotions the next important question for AI? Hume AI's answer is yes. The company's stated goal is to build an emotionally intelligent model that better serves humanity. In its view, emotional intelligence includes the ability to infer intentions and preferences from behavior, which is exactly the core goal an AI interface tries to achieve: infer what people want, then deliver it. In that sense, emotional intelligence is the most important capability of an AI interface.

What sets the Hume AI chatbot apart from its predecessors is its focus on understanding human emotion and responding appropriately. It not only understands text but also uses a voice dialogue interface to listen to vocal features such as intonation, pitch, and pauses to deepen its understanding. That understanding can be quite subtle, covering not just "big emotions" such as happiness, sadness, anger, and fear, but also finer, multidimensional "small emotions" such as admiration, adoration, craving, sarcasm, and shame. Hume AI lists a total of 53 emotion types on its website.
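Hume AI does not publish its model internals, so as a purely illustrative sketch (not Hume's actual method), the idea of reading prosodic cues from a voice signal can be toy-modeled: zero-crossing rate stands in as a crude proxy for pitch, low-energy frames stand in for pauses, and hand-written rules map those two numbers to a coarse tone label. The function names, thresholds, and labels here are all hypothetical.

```python
import math

def extract_features(samples, sample_rate, frame_ms=20):
    """Split a mono waveform into frames and compute two crude prosodic
    features: mean zero-crossing rate (a rough proxy for pitch) and the
    fraction of near-silent frames (a rough proxy for pauses)."""
    frame_len = max(1, sample_rate * frame_ms // 1000)
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len + 1, frame_len)]
    zcrs, silent = [], 0
    for frame in frames:
        energy = sum(x * x for x in frame) / len(frame)
        if energy < 1e-4:          # near-silent frame counts as a pause
            silent += 1
            continue
        crossings = sum(1 for a, b in zip(frame, frame[1:]) if a * b < 0)
        zcrs.append(crossings / len(frame))
    mean_zcr = sum(zcrs) / len(zcrs) if zcrs else 0.0
    pause_ratio = silent / len(frames) if frames else 0.0
    return mean_zcr, pause_ratio

def guess_tone(mean_zcr, pause_ratio):
    """Toy rule-based mapping from prosody to a coarse tone label
    (thresholds are arbitrary, chosen only for illustration)."""
    if pause_ratio > 0.5:
        return "hesitant"           # mostly silence: halting delivery
    return "excited" if mean_zcr > 0.1 else "calm"

# Demo: a synthetic 440 Hz tone at an 8 kHz sample rate reads as
# high-pitched speech with no pauses.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
zcr, pauses = extract_features(tone, rate)
```

A production system would of course learn these mappings from labeled audio rather than use fixed thresholds; the sketch only shows why intonation, pitch, and pausing are machine-readable signals at all.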
