---
tags:
- EEG
---
|
Part of MONSTER: <https://arxiv.org/abs/2502.15122>. |
|
|
|
***Dreamer*** is a multimodal dataset comprising electroencephalogram (EEG) and electrocardiogram (ECG) signals recorded during affect elicitation with audio-visual stimuli [1], captured using a 14-channel Emotiv EPOC headset. It consists of recordings from 23 participants, along with their self-assessments of affective state (valence, arousal, and dominance) after each stimulus. For our classification task, we focus on the arousal and valence labels, referred to as ***DreamerA*** and ***DreamerV*** respectively.
|
|
|
The dataset is publicly available [2], and we use the TorchEEG toolkit for preprocessing, including signal cropping and low-pass and high-pass filtering [3]. Note that only the EEG data is analyzed in this study; the ECG signals are excluded. The arousal and valence labels are binarized, with ratings below 3 assigned to class 1 and ratings of 3 or higher assigned to class 2, and the data is split into cross-validation folds by participant.
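A minimal sketch of what this pipeline might look like with TorchEEG is shown below. The file paths, the use of a leave-one-subject-out split to illustrate the participant-wise folds, and the omission of the cropping and filtering steps are assumptions for illustration, not the exact MONSTER configuration.

```python
# Illustrative TorchEEG preprocessing sketch for DREAMER (paths and
# parameters are placeholders, not the exact MONSTER settings).
from torcheeg import transforms
from torcheeg.datasets import DREAMERDataset
from torcheeg.model_selection import LeaveOneSubjectOut

dataset = DREAMERDataset(
    io_path='./processed_dreamer',       # cache directory for preprocessed signals
    mat_path='./DREAMER.mat',            # raw recordings downloaded from Zenodo [2]
    online_transform=transforms.ToTensor(),
    label_transform=transforms.Compose([
        transforms.Select('arousal'),    # use 'valence' for DreamerV
        transforms.Binary(3.0),          # binarize self-assessment ratings at 3
    ]),
)

# Participant-wise folds; MONSTER groups participants into cross-validation
# folds, shown here with a leave-one-subject-out split for simplicity.
cv = LeaveOneSubjectOut(split_path='./split')
for train_set, test_set in cv.split(dataset):
    pass  # fit and evaluate a classifier on each fold
```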
|
|
|
[1] Stamos Katsigiannis and Naeem Ramzan. (2018). DREAMER: A database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices. *IEEE Journal of Biomedical and Health Informatics*, 22(1):98–107.
|
|
|
[2] Stamos Katsigiannis and Naeem Ramzan. (2017). DREAMER: A database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices. <https://zenodo.org/records/546113>.
|
|
|
[3] Zhi Zhang, Sheng-Hua Zhong, and Yan Liu. (2024). TorchEEGEMO: A deep learning toolbox towards EEG-based emotion recognition. *Expert Systems with Applications*. |