%0 Generic
%T How Much Does Audio Matter to Recognize Egocentric Object Interactions?
%A Alejandro Cartas
%A Jordi Luque
%A Petia Radeva
%A Carlos Segura
%A Mariella Dimiccoli
%J CoRR abs/1906.00634
%D 2019
%F Alejandro Cartas2019
%O MILAB; does not mention
%O exported from refbase (http://refbase.cvc.uab.es/show.php?record=3383), last updated on Thu, 28 Jan 2021 10:26:20 +0100
%X Sounds are an important source of information about our daily interactions with objects. For instance, a significant number of people can discern the temperature of water being poured just by using their sense of hearing. However, only a few works have explored the use of audio for the classification of object interactions, either in conjunction with vision or as a single modality. In this preliminary work, we propose an audio model for egocentric action recognition and explore its usefulness on the parts of the problem (noun, verb, and action classification). Our model achieves a competitive result in terms of verb classification (34.26% accuracy) on a standard benchmark with respect to vision-based state-of-the-art systems, using a comparatively lighter architecture.
%9 miscellaneous
%U https://arxiv.org/abs/1906.00634