Table 2 Examples of technologies involved in automated emotion recognition

From: Ethical perspectives on recommending digital technology for patients with mental illness

| Technology | Description |
| --- | --- |
| Body language and gesture recognition | Recognition of meaningful body movements involving the fingers, hands, face, head or body (Mitra and Acharya 2007; Kleinsmith and Bianchi-Berthouze 2013) |
| Facial expression analysis | Measurement and interpretation of facial expressions (Zeng et al. 2009; Sariyanidi et al. 2015) |
| Facial recognition | Recognition of human faces, even with background clutter and variable image quality (Zhao et al. 2003; McPherson et al. 2016) |
| Natural language processing | Automatic extraction of meaning from human language, both text and speech, which requires resolving ambiguity (Nadkarni et al. 2011) |
| Pattern recognition | Automated recognition, description, and classification of patterns, often involving statistical classification and neural networks (Jain et al. 2000) |
| Sensors | Identification of emotion from physiological signals such as heart rate, breathing, skin conductance, and physical activity (Calvo and D’Mello 2010; Jerritta et al. 2011; Sun et al. 2010) |
| Sentiment analysis | Binary classification of subjective opinions in text, such as positive versus negative or like versus dislike (Liu 2010); a toy sketch follows this table |
| Smartphone usage patterns | Identification of mood from measures such as the number and duration of incoming/outgoing calls, outgoing text messages, and app usage (LiKamWa et al. 2013; Faurholt-Jepsen et al. 2016); a toy sketch follows this table |
| Speech emotion recognition | Recognition of the emotional content of human speech (El Ayadi et al. 2011; Zeng et al. 2009) |
| Speech recognition | Identification and understanding of human speech, converting it into text or commands (Meng et al. 2012; Xiong et al. 2016) |
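
To make the sentiment analysis entry concrete, the sketch below classifies short texts as positive versus negative using a toy word-count lexicon. It is a minimal illustration under stated assumptions: the word lists, the function name, and the tie-breaking rule are invented for this example and are not drawn from Liu (2010), which surveys far richer lexicon-based and machine-learning approaches.

```python
# Minimal lexicon-based sentiment sketch: counts matches against small,
# hypothetical word lists and returns a binary label. The word lists and the
# tie-breaking rule (ties read as "positive") are illustrative assumptions.

POSITIVE_WORDS = {"good", "great", "happy", "calm", "better", "hopeful"}
NEGATIVE_WORDS = {"bad", "sad", "anxious", "worse", "tired", "hopeless"}


def classify_sentiment(text: str) -> str:
    """Return 'positive' or 'negative' from simple word counts."""
    tokens = text.lower().split()
    score = sum(token in POSITIVE_WORDS for token in tokens)
    score -= sum(token in NEGATIVE_WORDS for token in tokens)
    return "positive" if score >= 0 else "negative"


if __name__ == "__main__":
    for sample in ("I feel hopeful and calm today", "I am tired and anxious"):
        print(sample, "->", classify_sentiment(sample))
```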
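
The smartphone usage patterns entry can likewise be illustrated with a hypothetical scoring function over daily usage features. The feature names, the weights, and the assumption that higher scores correspond to more positive mood are invented for illustration; studies such as LiKamWa et al. (2013) learn such mappings from labelled data rather than fixing them by hand.

```python
# Hypothetical mood-scoring sketch over daily smartphone usage features.
# All weights below are made up for illustration, not taken from the literature.

from dataclasses import dataclass


@dataclass
class DailyUsage:
    call_count: int        # number of incoming/outgoing calls
    call_minutes: float    # total call duration in minutes
    outgoing_texts: int    # outgoing text messages sent
    app_launches: int      # number of app usage events


def mood_score(day: DailyUsage) -> float:
    """Weighted sum of usage features; higher is read here as more positive mood."""
    return (
        0.3 * day.call_count
        + 0.02 * day.call_minutes
        + 0.1 * day.outgoing_texts
        + 0.05 * day.app_launches
    )


if __name__ == "__main__":
    quiet_day = DailyUsage(call_count=1, call_minutes=3.0, outgoing_texts=2, app_launches=10)
    busy_day = DailyUsage(call_count=8, call_minutes=45.0, outgoing_texts=20, app_launches=60)
    print("quiet day:", round(mood_score(quiet_day), 2))
    print("busy day:", round(mood_score(busy_day), 2))
```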