Is it possible for AI to understand human feelings?

What would you make of a robotic psychotherapist? Machines with emotional intelligence may arrive sooner than you think.

Over the past few decades, artificial intelligence has become very good at reading human emotional reactions. But reading is not understanding. If AI cannot experience emotions itself, can it ever really understand us?

And if it cannot, do we put ourselves at greater risk by continuing to rely on these machines? The latest generation of AI can learn from ever-growing volumes of big data while continuously improving its own processing. These machines already outperform people at many tasks: they can recognize faces, turn rough sketches of faces into photos, recognize speech, and play Go.

Do they show the same progress when it comes to feelings?

Identifying criminals

Recently, researchers developed an AI that tries to identify criminals from facial features alone. The system was evaluated on Chinese identity card photos, and the results were striking.

Overall, the AI correctly identified 83% of the criminals while wrongly flagging 6% of the innocent as criminals, for an overall accuracy approaching 90%.
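A quick back-of-the-envelope check shows how those two rates combine into the reported figure. This is a sketch, not the study's own code, and it assumes "approaching 90%" refers to balanced accuracy (the average of the per-class rates, which does not depend on how many criminals are in the sample):

```python
# Rates reported in the article (illustrative check only).
true_positive_rate = 0.83   # criminals correctly identified
false_positive_rate = 0.06  # innocents wrongly flagged as criminals

# Specificity: the fraction of innocents correctly cleared.
true_negative_rate = 1.0 - false_positive_rate  # 0.94

# Balanced accuracy averages performance on the two classes.
balanced_accuracy = (true_positive_rate + true_negative_rate) / 2.0

print(f"balanced accuracy = {balanced_accuracy:.3f}")  # → 0.885
```

At 88.5%, the two rates are indeed consistent with an overall figure "approaching 90%".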

The system builds on deep learning. By combining deep learning with a "face rotation model", the AI can recognize two photos taken at different angles and under different lighting as the same person. Deep learning builds a neural network of tens of thousands of neurons arranged in layers. Each layer converts its input into a more abstract representation, for example turning a facial image into a set of edge directions and locations.

This process automatically highlights the information features most important to the task at hand. Given the success of deep learning, if criminals' facial features really do differ from those of ordinary people, we should perhaps not be surprised that the AI managed to separate the two groups.
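The layer-by-layer abstraction described above can be illustrated with a toy forward pass. This is not the study's actual network; the weights here are random and the layer sizes are made up purely to show how each layer maps its input to a more abstract code:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, weights, biases):
    """One layer: a linear map followed by ReLU, yielding a more abstract code."""
    return np.maximum(0.0, x @ weights + biases)

# Toy stand-in for a flattened 8x8 grayscale face patch.
image = rng.random(64)

# Three layers of random (untrained) weights; a real network would learn
# these from data and be vastly larger.
w1, b1 = rng.standard_normal((64, 32)), np.zeros(32)  # e.g. edges and orientations
w2, b2 = rng.standard_normal((32, 16)), np.zeros(16)  # e.g. facial parts
w3, b3 = rng.standard_normal((16, 2)),  np.zeros(2)   # two class scores

h1 = layer(image, w1, b1)      # 64 pixels -> 32 low-level features
h2 = layer(h1, w2, b2)         # 32 features -> 16 higher-level features
scores = h2 @ w3 + b3          # final layer: raw class scores, no ReLU

print(scores.shape)  # → (2,)
```

Training would adjust the weights so that the intermediate codes emphasize exactly the features that matter for the task, which is the "automatic highlighting" the article refers to.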

The study suggests three facial measurements matter most: the angle formed by the nose tip and the corners of the mouth (on average 19.6% smaller in offenders than in ordinary people), the curvature of the upper lip (differing by 23.4% on average), and the distance between the inner corners of the eyes (differing by 5.6%). At first glance, such an analysis seems to confirm the outdated claim that criminals can be identified by their physical characteristics. But that is not the whole story. Notably, two of the three measurements involve the lips, the most expressive part of the human face. ID photos are generally required to show a neutral expression, yet the AI may still have detected emotions that people were trying to hide.

Humans, for their part, may be unable to discern these emotions at all.
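Measurements like the nose-to-mouth angle above come down to simple plane geometry on facial landmarks. The following sketch uses hypothetical pixel coordinates and a standard vector-angle formula; it is not the paper's code, only an illustration of how such a feature is computed:

```python
import math

# Hypothetical 2D facial landmarks (pixel coordinates), for illustration only.
nose_tip = (50.0, 60.0)
mouth_left = (40.0, 80.0)
mouth_right = (60.0, 80.0)

def angle_at(vertex, p, q):
    """Angle in degrees at `vertex` between the rays vertex->p and vertex->q."""
    v1 = (p[0] - vertex[0], p[1] - vertex[1])
    v2 = (q[0] - vertex[0], q[1] - vertex[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

theta = angle_at(nose_tip, mouth_left, mouth_right)
print(round(theta, 1))  # → 53.1
```

A classifier would compare such angles across many photos; the study's claim is that this particular angle averages 19.6% smaller in the offender group.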

The magic of "affective computing"

This is not the first time a computer has identified human feelings. A field called "affective computing" took off a few years ago. Its premise is that if we are to live alongside robots in the future, these machines must be able to understand human feelings and respond to them appropriately.

This is still an emerging field with great potential. For example, researchers have used facial expression analysis to identify students who run into obstacles in computer classes. An AI was trained to recognize varying degrees of engagement and frustration, making it easier for teachers to see where students encounter the most difficulty.

Such technology could improve the learning experience on online teaching platforms.

Another company, BeyondVerbal, judges our moods from our voices. It has developed software that analyzes vocal intonation and finds recurring patterns in human speech, and it claims the software achieves better than 80% accuracy in its applications.

In the future, this technology may help identify mood changes in people with autism. Sony has even begun developing a robot that can form emotional bonds with people. Little information about the project has been published so far.

However, the company says it is trying to combine hardware with services to give people a new kind of emotional experience. An AI with emotional intelligence would bring many benefits, such as emotional companionship or help with more complex problems. But it also means facing new ethical issues and risks. Can we really say that an elderly person who relies on AI has formed a genuine emotional connection with the machine? Can we really convict someone because an AI says they are guilty? Of course not.

In fact, until such a system has been continuously upgraded and evaluated, the most we can do is advise people to pay closer attention to those the AI flags as "suspicious". So what can we expect from the future development of AI? Subjective topics like feelings and emotions remain a major obstacle to AI learning, partly because of a lack of high-quality data. Can AI, for example, understand what irony is?

That is difficult, because the same sentence can be ironic in one context and a genuine compliment in another. Still, as mentioned earlier, both the amount of data available for AI to learn from and its processing capacity keep growing. So it is quite possible that within the next few decades, AI will be able to identify a wide variety of emotions. But can it really feel them? Different schools of thought disagree. Even if it could, it might capture only part of the human emotional range. Until AI possesses the full spectrum of human emotions, we had better treat these machines with care.