Catharine Oertel
Assistant Professor, TU Delft
Verified email at tudelft.nl
Title · Cited by · Year
Turn-taking, feedback and joint attention in situated human–robot interaction
G Skantze, A Hjalmarsson, C Oertel
Speech Communication 65, 50-66, 2014
119 · 2014
D64: A corpus of richly recorded conversational interaction
C Oertel, F Cummins, J Edlund, P Wagner, N Campbell
Journal on Multimodal User Interfaces, 1-10, 2012
94 · 2012
A gaze-based method for relating group involvement to individual engagement in multimodal multiparty dialogue
C Oertel, G Salvi
Proceedings of the 15th ACM on International conference on multimodal …, 2013
48 · 2013
On the use of multimodal cues for the prediction of degrees of involvement in spontaneous conversation
C Oertel, S Scherer, N Campbell
Twelfth annual conference of the international speech communication association, 2011
46 · 2011
Measuring dynamics of mimicry by means of prosodic cues in conversational speech
C De Looze, C Oertel, S Rauzy, N Campbell
ICPhS 2011, 2011
44 · 2011
Engagement in human-agent interaction: An overview
C Oertel, G Castellano, M Chetouani, J Nasir, M Obaid, C Pelachaud, ...
Frontiers in Robotics and AI 7, 92, 2020
42 · 2020
Gaze patterns in turn-taking
C Oertel, M Włodarczak, J Edlund, P Wagner, J Gustafson
Thirteenth annual conference of the international speech communication …, 2012
39 · 2012
Predicting group performance in task-based interaction
G Murray, C Oertel
Proceedings of the 20th ACM International Conference on Multimodal …, 2018
36 · 2018
Exploring the effects of gaze and pauses in situated human-robot interaction
G Skantze, A Hjalmarsson, C Oertel
Proceedings of the SIGDIAL 2013 Conference, 163-172, 2013
31 · 2013
Deciphering the silent participant: On the use of audio-visual cues for the classification of listener categories in group discussions
C Oertel, KA Funes Mora, J Gustafson, JM Odobez
Proceedings of the 2015 ACM on International Conference on Multimodal …, 2015
27 · 2015
Who will get the grant? A multimodal corpus for the analysis of conversational behaviours in group interviews
C Oertel, KA Funes Mora, S Sheikhi, JM Odobez, J Gustafson
Proceedings of the 2014 Workshop on Understanding and Modeling Multiparty …, 2014
27 · 2014
Towards the automatic detection of involvement in conversation
C Oertel, CD Looze, S Scherer, A Windmann, P Wagner, N Campbell
Analysis of Verbal and Nonverbal Communication and Enactment. The Processing …, 2011
24 · 2011
Context cues for classification of competitive and collaborative overlaps
C Oertel, M Wlodarczak, A Tarasov, N Campbell, P Wagner
Proceedings of speech prosody 2012, 2012
22 · 2012
Using immersive virtual reality to support designing skills in vocational education
KG Kim, C Oertel, M Dobricki, JK Olsen, AE Coppi, A Cattaneo, ...
British Journal of Educational Technology 51 (6), 2199-2213, 2020
21 · 2020
Gaze direction as a backchannel inviting cue in dialogue
A Hjalmarsson, C Oertel
21 · 2012
Towards building an attentive artificial listener: On the perception of attentiveness in audio-visual feedback tokens
C Oertel, J Lopes, Y Yu, KAF Mora, J Gustafson, AW Black, JM Odobez
Proceedings of the 18th ACM International Conference on Multimodal …, 2016
20 · 2016
A multimodal corpus for mutual gaze and joint attention in multiparty situated interaction
D Kontogiorgos, V Avramova, S Alexanderson, P Jonell, C Oertel, ...
Proceedings of the Eleventh International Conference on Language Resources …, 2018
17 · 2018
The Similar Segments in Social Speech Task.
NG Ward, SD Werner, DG Novick, E Shriberg, C Oertel, LP Morency, ...
MediaEval, 2013
14 · 2013
Farmi: a framework for recording multi-modal interactions
P Jonell, M Bystedt, P Fallgren, D Kontogiorgos, J Lopes, Z Malisz, ...
Proceedings of the Eleventh International Conference on Language Resources …, 2018
13 · 2018
Effects of different interaction contexts when evaluating gaze models in HRI
A Pereira, C Oertel, L Fermoselle, J Mendelson, J Gustafson
Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot …, 2020
12 · 2020