IEEE_TASLP_emotion-submitted-2.pdf (7.02 MB)

Mutual impact of acoustic and linguistic representations for continuous emotion recognition in call-center conversations

posted on 09.12.2021, 17:17 by Marie Tahon, Manon Macary, Yannick Estève, Daniel Luzzati

The goal of our research is to automatically retrieve satisfaction and frustration in real-life call-center conversations. This study focuses on an industrial application in which customer satisfaction is continuously tracked in order to improve customer services. To compensate for the lack of large annotated emotional databases, we explore the use of pre-trained speech representations as a form of transfer learning towards the AlloSat corpus. Moreover, several studies have pointed out that emotion can be detected not only in speech but also in facial expressions, in physiological responses, and in textual information. In the context of telephone conversations, the audio information can be broken down into acoustic and linguistic modalities by using the speech signal and its transcription. Our experiments confirm the large gain in performance obtained with pre-trained features. Surprisingly, we found that the linguistic content is clearly the major contributor to the prediction of satisfaction and generalizes best to unseen data. Our experiments demonstrate the clear advantage of using CamemBERT representations; however, the benefit of fusing the acoustic and linguistic modalities is less obvious. With models trained on individual annotations, we found that fusion approaches are more robust to the subjectivity of the annotation task. This study also tackles the problem of performance variability and aims to estimate this variability from different views: weight initialization, confidence intervals, and annotation subjectivity. A deep analysis of the linguistic content investigates interpretable factors able to explain the high contribution of the linguistic modality to this task.
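To make the fusion idea concrete, here is a minimal sketch of feature-level (early) fusion, in which an utterance's acoustic embedding and its linguistic embedding are concatenated before being passed to a downstream regressor. All names, dimensions, and values are illustrative; this is not the authors' exact pipeline, which relies on pre-trained models such as CamemBERT for the linguistic side.

```python
# Illustrative sketch of early (feature-level) fusion of two modalities.
# The vectors below stand in for real embeddings: in practice the acoustic
# vector would come from a pre-trained speech encoder and the linguistic
# vector from a pre-trained text encoder (e.g. CamemBERT).

def early_fusion(acoustic: list[float], linguistic: list[float]) -> list[float]:
    """Concatenate fixed-size acoustic and linguistic embeddings."""
    return acoustic + linguistic

# Hypothetical 3-dim acoustic and 2-dim linguistic embeddings:
acoustic_vec = [0.1, 0.2, 0.3]
linguistic_vec = [0.4, 0.5]

fused = early_fusion(acoustic_vec, linguistic_vec)
# The fused vector (here 5-dimensional) would feed a regression head
# that predicts a continuous satisfaction/frustration value.
```

Late fusion, by contrast, would train separate predictors per modality and combine their outputs; the abstract notes that the benefit of either fusion over the linguistic modality alone is not clear-cut.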


Submitting Author's Institution

Le Mans Université / LIUM

Submitting Author's Country