For a very long time, one of the biggest questions in the tech world has been: can machines understand human emotions?
The world of computing has long been a black-and-white picture. Computers are known for their rational approach, simply represented by the values 0 and 1. Most of the computing era has been spent running algorithms, deciphering data, and delivering the necessary outputs to users. But what if computers understood human emotions and functioned not just by logic, but in tune with human feelings?
Wouldn’t that make the computing world irrational and emotional in nature?
Many people believe that computers can never understand human emotions and will never be able to replace humans in the workplace.
In some ways they are correct, and in others they are not.
This is where affective computing comes in: an important concept for people who don’t just want to operate computers but to connect with them, emotionally.
To define artificial emotional intelligence, you must first understand the meaning of the word “affective”.
Affective means relating to, or influenced by, emotions.
Affective computing is computing that relates to, arises from, or influences emotions. It includes designing systems and devices that can recognize, interpret, process, and simulate human emotions, and it draws on several fields, including computer science, psychology, and cognitive science.
The term “affective computing” was coined by Rosalind Picard in 1995.
The primary goal of this type of computing is to improve user engagement with computers. If computers can understand how people feel while using them, that knowledge can be used to design better, more engaging experiences.
This kind of knowledge can be collected while someone is playing a computer game, shopping online, or browsing social media. If computers can understand human emotions, they can respond to those emotions too.
For example, suppose you come home from work feeling sad or depressed about your day. While you use your system, the computer can detect these emotions, ask if you are feeling okay, and suggest ways to counter the mood.
The human world is a haphazard collection of emotions, a fact that technology ignored for many years. One reason many people never truly warm to technology is its lack of emotional interaction and connection.
Human beings are famously curious, and their unending thirst for knowledge, particularly at the intersection of emotions and technology, has given rise to the field of affective computing.
Researchers under the guidance of Dr. Rosalind Picard at MIT are currently studying newer technologies that can understand emotions, and the role those emotions play in a satisfying user experience. The research aims to widen the range of devices through which computers can connect with human emotions. The ultimate aim of emotional AI is for computers to converse with humans, detect their emotions, and grasp the underlying meaning of what they are saying.
Now, you must be wondering: how can computers tell what humans are feeling while using their systems?
Emotions can indeed be read on a person’s face, but they can also be detected through physical changes in the body, such as heart rate, the amount of sweat on the palms, or tone of voice.
All such emotional and physical signals can be combined to help determine people’s emotions.
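As a minimal sketch of how such a combination might work, the snippet below fuses several per-channel scores into a single estimate with a weighted average. The channel names, scores, and weights are illustrative assumptions, not measurements from any real sensor system.

```python
# Sketch: fusing several normalized affect signals into one overall estimate.
# Channel names and weights are illustrative assumptions, not a real API.

def fuse_signals(signals: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-channel scores, each assumed to lie in [0, 1]."""
    total_weight = sum(weights[name] for name in signals)
    return sum(signals[name] * weights[name] for name in signals) / total_weight

# Hypothetical readings: facial expression, voice tone, and heart rate.
weights = {"face": 0.5, "voice": 0.3, "heart_rate": 0.2}
signals = {"face": 0.8, "voice": 0.6, "heart_rate": 0.9}
overall = fuse_signals(signals, weights)  # a single combined estimate
```

In a real system, the weighting would itself be learned from data rather than fixed by hand; the point here is only that multiple weak signals can be combined into one stronger estimate.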
How?
Through Affective Computing.
It takes a multi-disciplinary approach, drawing on fields such as computer science, psychology, and cognitive science.
In other words, affective technologies sense the emotional state of users via sensors, microphones, cameras, or software logic, and then adjust their services, features, or products accordingly: recommending videos to fit the mood, asking about the user’s day, or changing the questions in a quiz.
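The sense-then-respond loop just described can be sketched as a simple mapping from a detected emotion to an action. The emotion labels and responses below are illustrative assumptions, not part of any real product.

```python
# Sketch of the sense-then-respond loop: detected emotion -> chosen action.
# Labels and responses are made up for illustration.

RESPONSES = {
    "sad": "suggest uplifting videos",
    "bored": "recommend a new quiz topic",
    "happy": "keep the current experience unchanged",
}

def respond_to_emotion(detected_emotion: str) -> str:
    # Fall back to a neutral action for emotions we have no rule for.
    return RESPONSES.get(detected_emotion, "ask the user how their day is going")
```

A production system would of course choose actions with far more nuance, but the contract is the same: an emotion estimate goes in, an adjustment to the experience comes out.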
Like many other computing systems, artificial emotional intelligence uses labelled training data to train machine learning models that identify emotions in speech or video. The more data the models are fed, the more accurately they detect human emotions, and the growing labelled data set in turn keeps improving the models.
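To make the idea of training on labelled examples concrete, here is a deliberately tiny sketch: a nearest-centroid classifier trained on made-up 2-D feature vectors standing in for real speech or video features. Everything here, the features, labels, and model choice, is an illustrative assumption.

```python
# Minimal sketch of learning from labelled data: a nearest-centroid model.
# The 2-D "features" are made-up stand-ins for real speech/video features.

from collections import defaultdict

def train(examples):
    """Average the feature vectors per label to get one centroid per emotion."""
    sums, counts = defaultdict(lambda: [0.0, 0.0]), defaultdict(int)
    for features, label in examples:
        sums[label][0] += features[0]
        sums[label][1] += features[1]
        counts[label] += 1
    return {label: (s[0] / counts[label], s[1] / counts[label])
            for label, s in sums.items()}

def predict(model, features):
    """Return the label whose centroid is closest to the given features."""
    def sq_dist(c):
        return (features[0] - c[0]) ** 2 + (features[1] - c[1]) ** 2
    return min(model, key=lambda label: sq_dist(model[label]))

labelled = [((0.9, 0.8), "happy"), ((0.8, 0.9), "happy"),
            ((0.1, 0.2), "sad"), ((0.2, 0.1), "sad")]
model = train(labelled)
```

Feeding the model more labelled examples shifts the centroids toward better positions, which is the toy version of the “more data, more accuracy” claim above.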
[Figure: the process followed to understand facial expressions. Source: Imperial College London Intelligent Behaviour Understanding Group]
Affective computing aims to help computers connect better with human emotions; only then can they understand the real meaning behind our words. But after so many years of innovation in the tech world, why is artificial emotional intelligence so amazing, interesting, and needed right now?
This is because several ingredients are working in sync to prepare the perfect broth of artificial emotional intelligence. We now have high-resolution cameras, high-speed internet, and capable machine learning models to aid the rise of affective computing.
With the increasing penetration of smartphones in the market, more users are ready to see computing shift from rational to affective thinking. Deep learning models, which depend on large amounts of data, have also become easier to deploy.
During the 1990s, the role of emotions was discussed more frequently in diverse areas such as psychology, neurology, medicine, and sociology. Before this, there was little sense that emotions deserved a layer of research of their own, so researchers mostly relied on rational thinking.
It was in 1995 that questions about stress and sudden anxiety, situations that computational systems did not commonly notice or address, began to be analyzed. Affective computing came into existence to smooth human-computer interaction: devices would detect human emotions and respond appropriately to those stimuli.
A computing device with such potential will be able to detect users’ emotions from a variety of sources: facial expressions, gestures, speech, the force or rhythm of keystrokes, temperature changes when a hand rests on the mouse, sweat on the palms, and other significant signs of emotional change. Computer systems can detect these signals, interpret them, and then produce an appropriate response.
Since recognizing emotional information requires extracting meaningful patterns from the gathered data, it involves machine learning across several modalities, such as speech recognition, natural language processing, and facial expression detection.
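For the text side of those modalities, one of the simplest possible sketches is a lexicon-based scorer: count emotionally charged words in a message. The word lists below are illustrative assumptions, far cruder than real natural language processing, but they show the shape of pattern extraction from raw data.

```python
# Toy lexicon-based sketch of extracting emotional cues from text, standing in
# for the NLP modality mentioned above. The word lists are illustrative.

POSITIVE = {"great", "happy", "love", "pleased"}
NEGATIVE = {"sad", "angry", "frustrated", "hate"}

def text_emotion_score(text: str) -> int:
    """Positive result suggests a positive tone; negative, a negative tone."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
```

Real systems replace the hand-written lexicon with learned representations, but the input/output contract, raw signal in, emotion estimate out, is the same.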
Artificial emotional intelligence offers a practical way to enrich the human-machine relationship with emotions. These emotions may correspond to abstract human states, and that is exactly what machines need to learn about.
Emotions in machines are commonly detected through two categories of signals: emotional speech (algorithms, databases, and speech itself) and facial affect detection (including body gestures and physiological monitoring).
As for the future, affective computing can address a major drawback of computers: the lack of a human touch. Its applications include:
The biggest difference between online learning and classroom learning is that online, the student’s emotional state is invisible. In e-learning applications, affective computing can help the presentation adjust when the student is bored, interested, frustrated, or pleased. It can also help in assistive services and psychological health services such as counselling, by adjusting to the emotional state of the client.
Robots can be fed with computational data to act in a more informed and emotionally aware way. This ensures higher flexibility in complex or uncertain environments and improves automation through affective computing abilities.
This helps in various real-life scenarios. For example, while someone is driving, affective systems can monitor the driver’s emotional state and engage additional safety measures, such as alerting them to other vehicles on the road or to going over the speed limit.
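The in-car example above could be sketched as a small decision rule. The stress score, threshold, and action strings are hypothetical placeholders, not any real automotive API.

```python
# Sketch of the in-car safety logic described above, assuming a hypothetical
# driver-stress score in [0, 1] supplied by an affective monitoring system.

def safety_actions(stress: float, speed_kmh: float, limit_kmh: float) -> list[str]:
    actions = []
    if stress > 0.7:  # illustrative threshold for an agitated driver
        actions.append("increase following-distance alerts")
    if speed_kmh > limit_kmh:
        actions.append("warn: over the speed limit")
    return actions
```

The design point is that the emotion estimate is just one more input alongside conventional telemetry such as speed.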
Other examples include a warning before you send an angry email to a colleague, or songs selected to match the listener’s mood.
Companies can use affective computing to study how a product is received in the market. Instead of relying on guesses about the emotions of potential customers, companies can run pilot marketing campaigns and use the data provided by affective systems to assess how their products will fare.
The human voice is a good predictor of emotions and underlying issues. Since customer satisfaction is everything for companies, affective computing systems can help attain that goal. Most companies rely on surveys to obtain information about customer satisfaction; instead, they could use real-time analytics to get the guidance they need.
Furthermore, artificial emotional intelligence is a great tool in contact centres and call centres. The data regarding a particular customer can be kept on record and serve as a reference for call centre agents. Many companies offer affective computing analysis for various consumer segments, which can guide more effective customer service. An example is Cogito, a company that helps call centre agents identify the moods of customers on the phone and adjust their conversations in real time.
The biggest use of affective computing is in the healthcare sector. Researchers can study human emotions in disorders such as autism and epilepsy and use the findings to design supportive computing systems.
Artificial emotional intelligence extends to wearable devices as well. With so many models of wearables on the market, analyzing human emotions such as stress and anxiety is an easy way to obtain this data. Just as computers can gauge your emotional state through facial expression and speech pattern analysis, wearables can check your heart rate and pulse to make similar assessments.
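A minimal sketch of the wearable case: flag possible stress when the average heart rate runs well above a resting baseline. The resting rate and the 30% threshold are illustrative assumptions, not medical guidance or a real device API.

```python
# Sketch of inferring a stress flag from wearable heart-rate samples.
# The resting baseline and the 1.3x threshold are illustrative assumptions.

def looks_stressed(heart_rates_bpm: list[int], resting_bpm: int = 65) -> bool:
    """Flag stress when the average rate is well above the resting baseline."""
    average = sum(heart_rates_bpm) / len(heart_rates_bpm)
    return average > resting_bpm * 1.3
```

A shipping wearable would personalize the baseline per user and account for exercise, but the core idea, comparing a physiological signal against an expected range, is the same.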
Beyond this, other applications of affective computing include fraud identification and enhancing security levels in compliance-heavy sectors.
The idea of emotional AI is not human versus machine, but machine augmenting human. It has the power to promote healthier living, benefiting humans and machines alike.
Artificial emotional intelligence can assume a critical role in your business. Emotional AI can reshape workplaces, provided the privacy and security issues around the technology are addressed. The ultimate goal is to enrich human interaction with machines, and affective computing can pave the way.