Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity

Jainendra Shukla, Miguel Barreda-Angeles, Joan Oliver, G. C. Nandi, Domenec Puig

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

Electrodermal activity (EDA) is indicative of psychological processes related to human cognition and emotions. Previous research has studied many methods for extracting EDA features; however, their appropriateness for emotion recognition has been tested on a small number of distinct feature sets and on different, usually small, datasets. In the current research, we reviewed 25 studies and implemented 40 different EDA features across the time, frequency, and time-frequency domains on the publicly available AMIGOS dataset. We performed a systematic comparison of these EDA features using three feature selection methods, Joint Mutual Information (JMI), Conditional Mutual Information Maximization (CMIM), and Double Input Symmetrical Relevance (DISR), together with machine learning techniques. We found that approximately the same number of features is required to obtain optimal accuracy for arousal recognition and for valence recognition. Subject-dependent classification results were also significantly higher than subject-independent results for both arousal and valence recognition. Statistical features related to the Mel-Frequency Cepstral Coefficients (MFCC) were explored for the first time for emotion recognition from EDA signals, and they outperformed all other feature groups, including the most commonly used Skin Conductance Response (SCR) related features.
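
Two of the techniques named in the abstract lend themselves to short illustrations. First, the MFCC-derived statistical features: the sketch below assumes an EDA signal resampled to 128 Hz and uses librosa's generic MFCC implementation with illustrative frame and filterbank sizes (the paper's exact parameters are not reproduced here), summarizing each cepstral coefficient trajectory with simple statistics.

```python
import numpy as np
import librosa  # assumed here for MFCC computation; not stated in the paper

def mfcc_stat_features(eda, sr=128, n_mfcc=13):
    """Statistical summaries of MFCCs computed over a 1-D EDA signal.

    All frame/filterbank parameters are illustrative choices for a
    low-sample-rate signal, not the authors' settings.
    """
    mfcc = librosa.feature.mfcc(
        y=eda.astype(np.float32), sr=sr,
        n_mfcc=n_mfcc, n_fft=512, hop_length=256, n_mels=24,
    )  # shape: (n_mfcc, n_frames)
    # One statistic per coefficient trajectory, concatenated into a vector.
    return np.concatenate([
        mfcc.mean(axis=1), mfcc.std(axis=1),
        mfcc.min(axis=1), mfcc.max(axis=1),
    ])
```

Second, the JMI feature selection criterion. A minimal greedy sketch, assuming features are quantile-binned so that discrete mutual information estimates apply (the paper does not specify its MI estimator), scores each candidate feature f by the sum of I(f, s; y) over the already selected features s:

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def discretize(X, n_bins=8):
    """Quantile-bin each column so discrete MI can be estimated."""
    Xd = np.empty_like(X, dtype=int)
    for j in range(X.shape[1]):
        edges = np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1)[1:-1])
        Xd[:, j] = np.digitize(X[:, j], edges)
    return Xd

def jmi_select(X, y, k, n_bins=8):
    """Greedy Joint Mutual Information (JMI) feature selection sketch."""
    Xd = discretize(np.asarray(X, dtype=float), n_bins)
    n_feat = Xd.shape[1]
    # Seed with the single most relevant feature, argmax I(f; y).
    selected = [int(np.argmax(
        [mutual_info_score(y, Xd[:, j]) for j in range(n_feat)]))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for f in set(range(n_feat)) - set(selected):
            # Encode the pair (f, s) as one discrete variable, then
            # accumulate I((f, s); y) over all selected features s.
            score = sum(mutual_info_score(y, Xd[:, f] * n_bins + Xd[:, s])
                        for s in selected)
            if score > best_score:
                best, best_score = f, score
        selected.append(best)
    return selected
```

CMIM and DISR follow the same greedy pattern with different scoring terms: CMIM maximizes the minimum conditional mutual information I(f; y | s) over the selected set, and DISR normalizes each joint term I(f, s; y) by the joint entropy H(f, s, y).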

Original language: English
Journal: IEEE Transactions on Affective Computing
DOI: 10.1109/TAFFC.2019.2901673
Publication status: Accepted/In press - 1 Jan 2019
Externally published: Yes

Keywords

  • Arousal
  • Discrete wavelet transforms
  • EDA
  • Emotion recognition
  • Feature extraction
  • Feature selection
  • Frequency-domain analysis
  • Mutual information
  • Skin
  • Time-domain analysis
  • Valence

Cite this

@article{865e89bdda15438a8eddb20811cd09da,
title = "Feature Extraction and Selection for Emotion Recognition from Electrodermal Activity",
keywords = "Arousal, Discrete wavelet transforms, EDA, Emotion recognition, Feature extraction, Feature selection, Frequency-domain analysis, Mutual information, Skin, Time-domain analysis, Valence",
author = "Jainendra Shukla and Miguel Barreda-Angeles and Joan Oliver and Nandi, {G. C.} and Domenec Puig",
year = "2019",
month = "1",
day = "1",
doi = "10.1109/TAFFC.2019.2901673",
language = "English",
journal = "IEEE Transactions on Affective Computing",
issn = "1949-3045",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}
