dc.contributor.author | Sánchez-Rada, J. Fernando | |
dc.contributor.author | Iglesias, Carlos A. | |
dc.contributor.author | Sagha, Hesam | |
dc.contributor.author | Schuller, Björn | |
dc.contributor.author | Wood, Ian D. | |
dc.contributor.author | Buitelaar, Paul | |
dc.date.accessioned | 2018-09-05T11:33:10Z | |
dc.date.available | 2018-09-05T11:33:10Z | |
dc.date.issued | 2017-10-23 | |
dc.identifier.citation | Sánchez-Rada, J. Fernando, Iglesias, Carlos A., Sagha, Hesam, Schuller, Björn, Wood, Ian D., & Buitelaar, Paul. (2017). Multimodal multimodel emotion analysis as linked data. Paper presented at the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), San Antonio, TX, USA, 23-26 October, pp. 111-116, doi: 10.1109/ACIIW.2017.8272599 | en_IE |
dc.identifier.doi | 10.1109/ACIIW.2017.8272599 | |
dc.identifier.uri | http://hdl.handle.net/10379/10030 | |
dc.description.abstract | The lack of a standard emotion representation model
hinders emotion analysis due to the incompatibility of annotation
formats and models from different sources, tools and annotation
services. This is also a limiting factor for multimodal
analysis, since recognition services from different modalities
(audio, video, text) tend to have different representation models
(e.g., continuous vs. discrete emotions).
This work presents a multi-disciplinary effort to alleviate
this problem by formalizing conversion between emotion models.
The specific contributions are: i) a semantic representation
of emotion conversion; ii) an API proposal for services that
perform automatic conversion; iii) a reference implementation
of such a service; and iv) validation of the proposal through
use cases that integrate different emotion models and service
providers. | en_IE |
dc.description.sponsorship | The research leading to these results has received funding
from the European Union's Horizon 2020 research and innovation
programme under grant agreement
No. 644632 (MixedEmotions) | en_IE |
dc.format | application/pdf | en_IE |
dc.language.iso | en | en_IE |
dc.publisher | IEEE | en_IE |
dc.relation.ispartof | 3rd International Workshop on Emotion and Sentiment in Social and Expressive Media: User Engagement and Interaction | en |
dc.rights | Attribution-NonCommercial-NoDerivs 3.0 Ireland | |
dc.rights.uri | https://creativecommons.org/licenses/by-nc-nd/3.0/ie/ | |
dc.subject | Emotional analysis | en_IE |
dc.subject | Linked data | en_IE |
dc.title | Multimodal multimodel emotion analysis as linked data | en_IE |
dc.type | Conference Paper | en_IE |
dc.date.updated | 2018-06-29T09:54:42Z | |
dc.local.publishedsource | https://dx.doi.org/10.1109/ACIIW.2017.8272599 | en_IE |
dc.description.peer-reviewed | non-peer-reviewed | |
dc.contributor.funder | Horizon 2020 | en_IE |
dc.internal.rssid | 14558716 | |
dc.local.contact | Ian Wood. Email: ian.wood@nuigalway.ie | |
dc.local.copyrightchecked | Yes | |
dc.local.version | PUBLISHED | |
dcterms.project | info:eu-repo/grantAgreement/EC/H2020::IA/644632/EU/Social Semantic Emotion Analysis for Innovative Multilingual Big Data Analytics Markets/MixedEmotions | en_IE |
nui.item.downloads | 320 | |