From facial expressions to social signals for useful embodied social agents
Dear all,
It is my pleasure to invite you to attend, via Zoom, an MMEF Lab seminar delivered by Prof. Ruth Aylett, Professor of Mathematics and Computer Science at Heriot-Watt University, Edinburgh. She is visiting our laboratory (now remotely, due to Covid-19) from the 9th of March to the 8th of April 2020, in the context of a travel grant awarded to her and her interest in our work on computational modelling, virtual reality and robotics.
She researches Affective Systems, Social Agents in both graphical and robotic embodiments, Human-Robot Interaction, and Interactive Narrative. She led three EU projects (VICTEC, eCIRCUS and eCUTE) in the period 2001-2012, applying empathic graphical characters to education against bullying (FearNot!) and to cultural sensitivity (ORIENT, Traveller, MIXER). She was also a PI on the projects LIREC (investigating long-lived robot companions) and EMOTE (an empathic robot tutor), and she led the EPSRC-funded network of excellence in interactive narrative, RIDERS. She is currently PI of the project SoCoRo (Socially Competent Robots), which is investigating the use of a mobile robot to train high-functioning adults with an Autism Spectrum Disorder in social interaction. She has authored more than 250 refereed publications in conferences, journals and book chapters, and has been an invited speaker at various events, most recently AAMAS 2016.
The seminar will take place on Wednesday the 1st of April at 14h30 (she will speak for about 50 minutes, followed by 10-15 minutes of questions).
See below for connection information to the Zoom meeting.
Title: From facial expressions to social signals for useful embodied social agents
Abstract:
The research field of embodied social agents aims to take scientific results from cognitive and neurological modelling and apply them to the engineering of embodied social agents, both robots and intelligent graphical characters. As a researcher in AI, I have been involved in this work for twenty years over a number of large projects. I am visiting University of Geneva to gain a better understanding of the scientific state-of-the-art with a view to improving the systems in which I have been involved.
Many existing computational architectures include affective models, and often use these to generate expressive behaviour for embodied agents. In general these are naive, in that the modelled state is directly transformed into behaviour such as facial expressions. My goal is to formulate a more realistic approach that can be applied in architectures modelling versions of cognitive appraisal. This approach seeks to apply a more current view of cognitive appraisal operating on multiple timescales, along with a somatic component and a simulation theory of mind, so that expressive behaviour can be modulated by social context.
In this talk I will outline some of the social agent work in which I have been involved, the architecture that has been used, and how it could be modified to move the modelling of expressive behaviour towards a model of social signals.
Looking forward to listening to her talk, and hoping that interested people will be able to attend the seminar in these challenging times.
All the very best,
David Rudrauf
———
Zoom meeting connection information:
David Rudrauf is inviting you to a scheduled Zoom meeting.
Join Zoom Meeting: https://unige.zoom.us/j/570768039?pwd=RGtldnRHb01sbG1UWDNPWWx4eFN2UT09
Meeting ID: 570 768 039
Password: 820978
One tap mobile:
+13462487799,,570768039# US (Houston)
+16699006833,,570768039# US (San Jose)
Dial by your location:
+1 346 248 7799 US (Houston)
+1 669 900 6833 US (San Jose)
+1 929 436 2866 US (New York)
+1 253 215 8782 US
+1 301 715 8592 US
+1 312 626 6799 US (Chicago)
Meeting ID: 570 768 039
Find your local number: https://unige.zoom.us/u/acNl3DlvWL
Join by SIP: 570768039@zoomcrc.com
Join by H.323:
162.255.37.11 (US West)
162.255.36.11 (US East)
221.122.88.195 (China)
115.114.131.7 (India Mumbai)
115.114.115.7 (India Hyderabad)
213.19.144.110 (EMEA)
103.122.166.55 (Australia)
209.9.211.110 (Hong Kong)
64.211.144.160 (Brazil)
69.174.57.160 (Canada)
207.226.132.110 (Japan)