
When computers learn to love – Affective Computing

Technology

Hollywood has been telling this story for years in movies like Big Hero 6, with its caring robot Baymax, or A.I. – Artificial Intelligence, predicting that computers will soon be able to recognize and show feelings. For now, machines mostly just do their job: our smartphones remind us of appointments, Netflix analyzes our viewing behavior and proactively suggests movies and shows we might like, and we can talk to voice-controlled assistants such as Siri and Amazon’s Alexa. All of this already creates a sense of growing emotional interaction between us and our smart devices, and we tend to personalize our everyday objects anyway. Yet one decisive step is still missing: our smartphones and televisions cannot identify our emotional state. What if we would rather watch a gripping action thriller than a comedy today? Or would prefer sentimental music because we have the blues?

That, however, will soon change. We feed our computers more and more information about ourselves in order to get the greatest possible benefit in return. At the same time, machine learning and artificial intelligence are developing at an accelerating pace, and people are becoming more open-minded towards new technologies. Wearables are already part of many people’s daily lives, and the Internet of Things is revolutionizing far more than just industry. Speech recognition and the first interactions with smart devices are driving the development of affective computing – the sensitive computer.

Affective computing broadly describes systems and gadgets that are capable of recognizing, interpreting, processing and simulating human emotions. Put more simply: it’s about technology that knows how we feel and how to respond accordingly.

Learning emotions or programming them?

But how exactly is my television or refrigerator supposed to know how I’m feeling? Artificial intelligence and machine learning are the fundamental key technologies. Voice input in particular lets technical devices learn about and analyze emotions through pitch, phrasing and pauses. The better the speech recognition, the more the machines can learn and the more parameters they can analyze. A connected home, for instance, could react to a tense mood by dimming the lights, adjusting the heating, playing music and preparing a cup of tea to create a relaxing atmosphere.
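
To make this concrete, here is a minimal sketch of what such voice analysis might look like, assuming the open-source librosa audio library. The features follow the ones named above (pitch and pauses), while the file name and the “tense” thresholds are purely illustrative assumptions, not a production affect model.

```python
# A minimal sketch of voice-based mood estimation, assuming librosa.
# Thresholds and the smart-home reaction are illustrative assumptions.
import librosa
import numpy as np

def voice_features(path):
    y, sr = librosa.load(path, sr=16000)

    # Fundamental frequency (tone pitch) via the pYIN tracker.
    f0, voiced_flag, voiced_prob = librosa.pyin(y, fmin=65.0, fmax=400.0, sr=sr)
    mean_pitch = float(np.nanmean(f0))

    # Pauses: silent gaps between the non-silent intervals of the signal.
    intervals = librosa.effects.split(y, top_db=30)
    gaps = [(start - end) / sr
            for (_, end), (start, _) in zip(intervals[:-1], intervals[1:])]
    return {"mean_pitch_hz": mean_pitch,
            "pause_count": len(gaps),
            "mean_pause_s": float(np.mean(gaps)) if gaps else 0.0}

def seems_tense(features):
    # Hypothetical rule: raised pitch combined with short, frequent pauses.
    return features["mean_pitch_hz"] > 220 and features["mean_pause_s"] < 0.3

feats = voice_features("utterance.wav")  # hypothetical recording
if seems_tense(feats):
    print("dim lights, lower heating, start relaxing playlist")
```

A real system would of course learn these decision boundaries from labeled speech rather than hard-coding them, but the pipeline – extract prosodic features, classify the mood, trigger a reaction – stays the same.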

Besides speech technology, facial recognition could also play an important role. In advertising research, study participants watch commercials while their gaze is tracked to see which spots they look at longest, in order to find the best position for the advertised product. Facial recognition software takes a similar approach: it automatically locates characteristic points on the face and learns which facial expression is linked to which emotion.
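
A minimal sketch of this idea follows, assuming OpenCV and the Haar cascade files that ship with it. A real affective system would use learned facial landmarks and a trained emotion model; the smile detector and the labels here only illustrate the principle of mapping a facial pattern to an emotional label.

```python
# A minimal sketch of camera-based expression reading, assuming OpenCV.
# The input file name is hypothetical; "happy"/"neutral" labels are illustrative.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

img = cv2.imread("viewer.jpg")                    # hypothetical camera frame
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
    roi = gray[y:y + h, x:x + w]                  # analyze only the face region
    smiles = smile_cascade.detectMultiScale(roi, 1.7, 20)
    label = "happy" if len(smiles) else "neutral"
    print(f"face at ({x}, {y}): {label}")
```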

Other gadgets that many of us already use can also help identify emotions. Wearables that measure movement, pulse or body temperature can be a source of information about our current frame of mind.
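
A minimal sketch of how such wearable signals might be turned into a rough mood estimate, under stated assumptions: the field names and thresholds are illustrative, and a real system would calibrate per user and use a trained model instead of fixed rules.

```python
# A minimal sketch of inferring a frame of mind from wearable data.
# All fields and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class WearableSample:
    heart_rate_bpm: float
    skin_temp_c: float
    steps_last_hour: int

def estimate_state(s: WearableSample) -> str:
    # An elevated pulse without movement tends to point to stress, not exercise.
    if s.heart_rate_bpm > 100 and s.steps_last_hour < 100:
        return "stressed"
    if s.heart_rate_bpm < 65 and s.steps_last_hour < 50:
        return "resting"
    return "active"

sample = WearableSample(heart_rate_bpm=110, skin_temp_c=37.1, steps_last_hour=20)
print(estimate_state(sample))  # -> "stressed"
```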

This already shows that affective computing is not so much about recognizing emotions as about reading, interpreting and simulating their signs. Genuine emotional interaction with machines remains difficult, but the technologies we already use offer plenty of starting points. Whether, and to what extent, machines will truly be able to interpret and authentically simulate emotions remains a vision of the future for now.

Take a look at related topics: Industry 4.0, automation, cobots and more:

https://www.hbi.de/en/2018/08/08/technology-innovation-trends/

