Abstract
The relationship between music and emotion has been addressed within several disciplines, from more historico-philosophical and anthropological ones, such as musicology and ethnomusicology, to others that are traditionally more empirical and technological, such as psychology and computer science. Yet, understanding the link between music and emotion is limited by the scarce interconnections between these disciplines. Trying to narrow this gap, this data-driven exploratory study aims at assessing the relationship between linguistic, symbolic and acoustic features—extracted from lyrics, music notation and audio recordings—and perception of emotion. Employing a listening experiment, statistical analysis and unsupervised machine learning, we investigate how a data-driven multi-modal approach can be used to explore the emotions conveyed by eight Bach chorales. Through a feature selection strategy based on a set of more than 300 Bach chorales and a transdisciplinary methodology integrating approaches from psychology, musicology and computer science, we aim to initiate an efficient dialogue between disciplines, able to promote a more integrative and holistic understanding of emotions in music.
Citation
Emilia Parada-Cabaleiro, Anton Batliner, Marcel Zentner, Markus Schedl
Exploring emotions in Bach chorales: a multi-modal perceptual and data-driven study
Royal Society Open Science, 10(12): 230574, doi:10.1098/rsos.230574, 2023.
BibTeX
@article{Parada-Cabaleiro2023RSOS_2023,
  title   = {Exploring emotions in Bach chorales: a multi-modal perceptual and data-driven study},
  author  = {Parada-Cabaleiro, Emilia and Batliner, Anton and Zentner, Marcel and Schedl, Markus},
  journal = {Royal Society Open Science},
  doi     = {10.1098/rsos.230574},
  url     = {https://royalsocietypublishing.org/doi/abs/10.1098/rsos.230574},
  volume  = {10},
  number  = {12},
  pages   = {230574},
  year    = {2023}
}