
Dubai: The current wave of artificial intelligence interfaces is being driven by personal assistants such as Google Assistant, Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa, which process verbal commands and questions.

In fact, Amazon’s Alexa has gone further, to a point where it can take shopping orders.

As AI capabilities evolve, devices will better understand and respond to users’ emotional states, a capability known as emotional artificial intelligence, or Emotion AI, Roberta Cozza, research director at Gartner, told Gulf News.

Emotion AI is a capability that allows an AI-enabled device with a camera or microphone to detect the user’s mood or emotion.

“It is an AI-driven technology that will allow [the system] to gather data about the emotional context of the user and based on the analytics of the data, it will offer a more personalised feedback or relevant answer to the user. It could be based on facial expression, posture detection, voice intonation and behavioural patterns,” she said.

Clinical trials

Some companies already offer health-related products such as wristbands, which detect the user’s stress level through electrical signals measured at the skin.

She said there are wristbands that predict epileptic seizures, and Empatica has already done clinical trials and has a solution in the market, while Emoshape is adding human-like qualities to its robotic systems.

The current set of wristbands can alert a user by vibrating if he or she has been idle for a few minutes, prompting them to stretch, get up or take a walk.

Fitbit is teaming up with Dexcom to launch an app this year that monitors blood glucose levels via its Ionic smartwatch.

“Emotion AI is a growing area among tech companies. New use cases will emerge in the next couple of years,” she said.

For example, she said, if the camera detects from your facial expression that you are sad, the system can recommend a particular kind of music; if it detects that you are stressed, it can suggest that you take a break or hold off on writing that email.
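At its simplest, the behaviour Cozza describes is a mapping from a detected emotional state to a suggested action. A minimal sketch of that idea in Python, where the emotion labels, confidence threshold and suggestions are all illustrative assumptions rather than features of any real Emotion AI product:

```python
# Illustrative sketch: map a detected emotional state to a suggestion.
# The labels, threshold and suggestion texts below are hypothetical;
# a real Emotion AI system would obtain the (emotion, confidence) pair
# from a facial-expression or voice-analysis model.

SUGGESTIONS = {
    "sad": "Play an uplifting playlist",
    "stressed": "Take a short break before sending that email",
    "tired": "Stand up and stretch for a few minutes",
}

def suggest_action(emotion: str, confidence: float, threshold: float = 0.7) -> str:
    """Return a personalised suggestion if the detection is confident enough."""
    if confidence < threshold:
        return "No suggestion"  # too uncertain to act on
    return SUGGESTIONS.get(emotion, "No suggestion")

print(suggest_action("stressed", 0.9))  # Take a short break before sending that email
print(suggest_action("sad", 0.4))       # No suggestion (detection too uncertain)
```

The confidence gate matters in practice: acting on a low-confidence emotion reading would make the personalisation feel intrusive rather than helpful.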

“Emotion AI can change the face of personalised customer experience. It will be good for service providers to build a service on top of billing information or a specific enterprise application to detect the mood of a worker or the stress level of a worker,” she said.

There are already smart cameras positioned in cars and lorries that can detect the mood, stress level and fatigue of drivers. “We are still far away and it is not going to happen tomorrow. It will take five to six years. We have seen Apple’s iPhone X make custom 3D versions of animated emojis based on facial expressions,” she said.

Moreover, she said the industry has reached a stage where differentiation is very hard for manufacturers and device life cycles are lengthening, so what can Apple, Google or Samsung do next?

“I hope to see the tech companies differentiating through AI and bringing more personalisation to their user base or new user base and Emotion AI is one part of it. The vendors can create loyalty to the brand or add value to their ecosystem,” she said.

Beyond smartphones

For that, she said, vendors need to build a very strong use case; otherwise, users will not upgrade.

With the facial recognition feature available on the iPhone X, are people upgrading to the new Apple iPhone? Probably not, she said, adding that the device is pricier and many would want to see whether it is a great departure from the iPhone 8.

Tech companies are acquiring smaller companies and start-ups that do Emotion AI, she said, adding that its use would go beyond smartphones, wearables and connected home products to connected vehicles.

“We have already seen smart speakers that have cameras and smart cameras that can monitor. All the players in the industry will try to differentiate and retain their customers by breathing new life into the market,” she said.

David McQueen, research director at ABI Research, expects more immersive touchless experiences as new interfaces such as voice, artificial intelligence, mixed reality, augmented reality and gesture control develop, providing new ecosystems and experiences beyond what is known in today’s smartphone market.

“I do expect most, if not all, of these interfaces to begin appearing in a single device, but timing will be tied to the launch of 5G as a key enabler, notably VR, to limit latency. With this in mind, we don’t expect 5G devices in volume to really start kicking in until 2020, so expect many of these services to be fully integrated by 2022 at the earliest,” he said.