[robotics-worldwide] [meetings] 3rd CFP ICMI 2016 Workshop "Embodied Interaction with Smart Environments"

Patrick Holthaus patrick.holthaus at uni-bielefeld.de
Wed Aug 24 11:27:52 PDT 2016


Dear colleagues,

Again, I would like to draw your attention to our workshop "Embodied 
Interaction with Smart Environments" which will be held in conjunction with 
ICMI 2016.

The submission deadline is *approaching*: Aug 31, 2016.

- Apologies for multiple postings - 

*Call for papers*
https://aiweb.techfak.uni-bielefeld.de/icmi2016_workshop_eise/call.html

*Workshop*
Our homes are becoming increasingly smart through modular hardware and software
apps that control home automation functions such as setting the room temperature
or starting the washing machine. At the same time, mobile robots are entering our
homes as vacuum cleaners, mobile phone platforms, or toys. Each of these comes
with its own interface, resulting in a multitude of interaction devices with
differing interaction philosophies. In addition, the increasingly embodied
capabilities of smart devices, ranging from ambient actions (light, sound, ...)
to moving objects (robots, furniture, ...), produce an overwhelming amount of
information and control that must be mastered within such a convoluted
environment. Yet, despite large research efforts, the main modality of
interaction with smart home devices is often still a challenging graphical
interface.

This complex situation opens up a range of new research questions pertaining to
interaction with smart environments. How can such interaction be made more
intuitive and adaptive? How should we deal with agency, or the explicit lack of
it, i.e., whom to address when specifying a command or a goal situation? In this
workshop we want to address the question of how the various installations inside
a smart environment can be used as intuitive means of interaction.

On the one hand, this entails questions regarding the human partner: to what
extent do users benefit from embodied interaction partners (e.g., virtual agents
or robots) as opposed to non-embodied devices? Do the different devices and
agents have to provide a coherent interaction? On the other hand, it entails
questions of situation awareness with respect to the interaction partner: how
can the environment be attentive to the intentions of inhabitants and their
visitors without having to overhear all conversations and interactions that are
not addressed to it?

*Further information*
Homepage: https://aiweb.techfak.uni-bielefeld.de/icmi2016_workshop_eise/
Submission/Reviews: via https://precisionconference.com/~icmi/
Publication: via the ACM Digital Library, http://dl.acm.org
Invited talk: Takayuki Kanda, www.irc.atr.jp/~kanda/

-- 
With kind regards

Dr.-Ing. Patrick Holthaus


Bielefeld University
Faculty of Technology
Applied Informatics/Cognitive Systems Engineering
Office: CITEC 1.228
Phone: +49(0)521-106-12207
Fax: +49(0)521-106-2992

