Affective computing – how to build better digital products

Ever wished that applications were easier to use? Affective computing systems adapt based on emotional insights.
27 February 2024

Computer says yes: breakthroughs in affective computing could make software much easier to use.


“The goal of affective computing is to create a computing system capable of perceiving, recognizing, and understanding human emotions and responding intelligently, sensitively, and naturally,” write researchers in the journal Intelligent Computing.

Hitting such a milestone would signal a UX breakthrough: digital applications would become far more intuitive to use, because programs could detect – through emotional insights – when users are struggling, and adapt accordingly.

A popular approach has been to use wearables fitted with sensors, but making sense of the data is easier said than done. Developers know that the physiological response accompanying an emotional event can look identical across different emotions. For example, fear, anger, and excitement can all send heart rates racing. To be effective, affective computing algorithms need to differentiate between such signals.
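One way to see why a single channel is not enough is to sketch how extra signals break the tie. Everything below is illustrative: the thresholds, the signal names, and the rule-based approach are assumptions for the sake of the example, not a real affective computing algorithm.

```python
# Illustrative sketch: heart rate alone cannot separate fear, anger, and
# excitement, but combining it with other channels can. All thresholds
# and signal choices here are assumptions, not production values.

def classify_arousal_state(heart_rate_bpm, skin_conductance_us, facial_valence):
    """Rule-of-thumb fusion of three signals into a coarse emotion label.

    facial_valence ranges from -1.0 (negative expression) to +1.0 (positive).
    """
    if heart_rate_bpm < 90:
        return "calm"
    # Elevated heart rate: use the other channels to disambiguate.
    if facial_valence > 0.3:
        return "excitement"   # high arousal + positive expression
    if skin_conductance_us > 8.0:
        return "fear"         # high arousal + strong sweat response
    return "anger"            # high arousal + negative expression

print(classify_arousal_state(120, 3.0, 0.6))    # excitement
print(classify_arousal_state(120, 12.0, -0.4))  # fear
```

Real systems would learn these boundaries from labelled data rather than hand-code them, but the principle is the same: the racing heart rate only becomes interpretable once other modalities are fused in.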

The good news is that machines have many inputs to draw upon when trying to determine the emotional state of users. Many of the devices we use daily have cameras and microphones, which pave the way for image, gesture, and speech recognition. So, the idea of applying affective computing principles to our devices is by no means far-fetched.

Emotion as a service

Speaking on the topic of affective computing – what it is and why you should care – Håkan Silfvernagel, a consultant with expertise in human-machine interfaces and robotics, mentions Microsoft’s Emotion API, which is capable of receiving an image of a human face and returning one of eight emotions, including happiness, sadness, fear, and disgust.
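In practice, services of this kind return a confidence score for each candidate emotion, and the client picks the most likely one. The sketch below assumes a simplified response shape of that sort; it is not a verbatim payload from Microsoft’s API.

```python
# Client-side handling for an emotion-recognition API response.
# The JSON shape is an assumption modelled on services of this kind,
# not an exact Microsoft Emotion API payload.

EIGHT_EMOTIONS = ["anger", "contempt", "disgust", "fear",
                  "happiness", "neutral", "sadness", "surprise"]

def top_emotion(scores: dict) -> str:
    """Return the most confident of the eight emotions from a score map."""
    return max(EIGHT_EMOTIONS, key=lambda e: scores.get(e, 0.0))

# Example response for a smiling face (scores are made up).
response = {"happiness": 0.81, "surprise": 0.12, "neutral": 0.05,
            "sadness": 0.01, "anger": 0.01}
print(top_emotion(response))  # happiness
```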

Other tech firms such as Amazon and IBM have developed emotion recognition capabilities as well. And engagement analytics provided by Affectiva – whose co-founder, Rana el Kaliouby, wrote ‘Girl Decoded’ to share her quest to reclaim our humanity by bringing emotional intelligence to technology – are popular with many of the world’s largest advertisers.

Considering device applications, Affectiva has applied its so-called Emotion AI to automotive settings, using algorithms fed with in-cabin camera footage to detect the state, emotions, and reactions of drivers and other occupants.

It’s not a stretch to imagine soothing music being piped out of a car’s speakers to calm the driver during rush hour traffic – a use case that’s been given to highlight the adaptive properties of affective computing systems.

Lately, advances in AI have provided a major boost to the field – a point made by Rosalind Picard, who wrote the book on Affective Computing and has been working in the area for decades. Sentiment analysis of spoken words and text has never been as straightforward to carry out as it is today, and that information can be fed back into education and training systems to gauge the performance of learners.
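The simplest form of the sentiment analysis mentioned above can be shown with a lexicon-based scorer. The word lists here are illustrative assumptions, and a real learning platform would use a trained model, but the sketch shows how free-text learner feedback becomes a signal a training system can act on.

```python
# Minimal lexicon-based sentiment sketch for scoring learner feedback.
# The word lists are illustrative assumptions; real systems would use
# a trained sentiment model instead.

POSITIVE = {"clear", "helpful", "great", "enjoyed", "easy"}
NEGATIVE = {"confusing", "frustrating", "hard", "boring", "lost"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: positive minus negative word fraction."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("The module was clear and helpful"))  # 1.0
print(sentiment_score("I found the exercises confusing"))   # -1.0
```

A training system could aggregate such scores across a cohort to flag modules where learners are consistently struggling.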

Likewise, image recognition and object tracking have come on in leaps and bounds thanks to AI advances, and those breakthroughs positively impact affective computing systems.

Apple has found itself having to interpret the expressions of wearers of the Vision Pro to generate what the tech giant dubs personas, which are used as avatars that participate in FaceTime calls.

The more emotional signals that VR systems can provide, the richer the experience for headset wearers. And affective computing will be key for tempting more business users to participate in the metaverse.

Advances are being made in brain-computer interfaces, which could give developers another perspective on users’ emotional states. As the cost of wearables incorporating EEG measurements comes down, this information will add to the data fusion used by affective computing devices.
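The data fusion mentioned above is often done as “late fusion”: each modality produces its own per-emotion probabilities, and a weighted average combines them, so adding an EEG channel is just one more entry in the mix. The weights, emotion labels, and input scores below are illustrative assumptions.

```python
# Sketch of late fusion across modalities. Each modality contributes
# per-emotion probabilities; a weighted average combines them. All
# weights and scores here are made-up illustrative values.

def fuse_modalities(predictions: dict, weights: dict) -> dict:
    """Weighted average of per-emotion probabilities across modalities."""
    emotions = {e for p in predictions.values() for e in p}
    total_w = sum(weights[m] for m in predictions)
    return {e: sum(weights[m] * predictions[m].get(e, 0.0)
                   for m in predictions) / total_w
            for e in emotions}

fused = fuse_modalities(
    predictions={
        "face":  {"frustration": 0.5, "joy": 0.3, "fatigue": 0.2},
        "voice": {"frustration": 0.6, "joy": 0.2, "fatigue": 0.2},
        "eeg":   {"frustration": 0.2, "joy": 0.1, "fatigue": 0.7},
    },
    weights={"face": 1.0, "voice": 1.0, "eeg": 2.0},  # trust EEG more
)
print(max(fused, key=fused.get))  # fatigue
```

Here the face and voice channels alone would suggest frustration, but the (more heavily weighted) EEG channel tips the fused verdict towards fatigue – the kind of correction an extra brain-signal perspective could provide.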

Signals gathered from the brain could spotlight signs of fatigue, joy, or frustration – to give just a few examples. Also, beyond making software more user-friendly, virtual assistants programmed using affective computing techniques have been shown to encourage creative problem-solving, which hints at further wins for business users.