Chinese researchers have taken an important step toward emotionally aware chatbots with the development of the Emotional Chatting Machine (ECM). As its name suggests, ECM can hold a dialogue with a person, giving answers that are not only coherent but also carry emotional content such as joy, sadness or disgust.
The ultimate goal is to create machines with not only artificial but also emotional intelligence, machines that understand human emotions and respond with the corresponding empathy. Only then will human-machine interaction gain substantial depth (for good or bad), as in the science fiction films Her (2013) and Ex Machina (2015).
The researchers, led by Associate Professor Minlie Huang of the Department of Computer Science and Technology at Beijing's Tsinghua University, reported that 61% of people who tried the new emotional chatbot said they preferred it to chatbots that speak in emotionally neutral language, according to the British newspaper The Guardian. "But we are still far from a machine that can fully understand human feelings. This is just a first attempt," Huang said.
The emotional chatbot is based on an "emotion classification" algorithm, which learned to distinguish individual human emotions by initially analyzing 23,000 posts on the Chinese social network Weibo. It then improved its ability through further training on millions of posts from the same network, learning to answer questions with emotion.
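To make the idea of emotion classification concrete, here is a minimal toy sketch in Python. The real system learns its categories from thousands of annotated Weibo posts with a neural model; the keyword lists below are invented stand-ins for illustration only, not the model's actual features.

```python
# Toy emotion classifier: tag a message with one of five emotion
# categories by counting overlaps with hand-picked cue words.
# (Illustrative only; ECM learns this mapping from data.)

EMOTION_CUES = {
    "happy": {"great", "smiling", "wonderful"},
    "sad": {"worst", "crying", "lonely"},
    "angry": {"late", "unfair", "hate"},
    "disgusted": {"crap", "gross", "awful"},
    "liking": {"love", "cute", "lovely"},
}

def classify_emotion(message: str) -> str:
    """Pick the category whose cue words overlap most with the message."""
    words = set(message.lower().split())
    scores = {emo: len(words & cues) for emo, cues in EMOTION_CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"
```

A trained classifier replaces the cue-word lookup with learned features, but the input/output contract is the same: message in, emotion label out.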
Based on this algorithm, the researchers built software that can respond in five different emotional modes (happy, sad, angry, disgusted, liking), according to the preferences of each user. So if the user says, "It was my worst day; I was late because of terrible traffic on the road," the chatbot can answer "Yes, you're late" or "Sometimes life is crap" (disgusted), "I'm here to support you" (sympathetic), or "Don't stop smiling, things will get better" (happy).
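The emotion-conditioned responses above can be sketched as a toy Python function that picks a reply matching the requested mode. The canned replies are taken from the article's own examples; the real ECM generates its responses with a neural model rather than looking them up.

```python
# Toy emotion-conditioned responder: the user (or the system) chooses an
# emotional mode, and the reply is conditioned on that mode.
# (Illustrative stand-in; ECM generates text, it does not use a lookup table.)

CANNED_REPLIES = {
    "angry": "Yes, you're late.",
    "disgusted": "Sometimes life is crap.",
    "sad": "I'm here to support you.",
    "happy": "Don't stop smiling, things will get better.",
    "liking": "That sounds lovely.",
}

def reply(user_message: str, emotion: str) -> str:
    """Return a reply conditioned on the requested emotion category."""
    if emotion not in CANNED_REPLIES:
        raise ValueError(f"unknown emotion: {emotion}")
    return CANNED_REPLIES[emotion]

print(reply("Worst day ever, terrible traffic on the road.", "sad"))
```

The key design point is that the same input message yields different replies depending on the emotion parameter, which is exactly the behavior the article describes.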
The next step will be for the chatbot to sense on its own the mood of its human interlocutor and decide which emotion is most appropriate for each occasion. Of course, as Huang noted, it is important to avoid the risk of the conversational robot reinforcing a person's negative feelings (aggression, self-destructiveness, etc.), which could have devastating consequences.
Efforts at conversational chatbots so far have run into difficulties. One of them, Eugene Goostman, convinced some judges that the entity they were chatting with on screen was human, but only because it was presented as a 13-year-old Ukrainian boy with limited English.