[robotics-worldwide] [news]Call for Participation--RSS 2017 Workshop on Perception and Interaction Dynamics in Child-robot Interaction

Zhi Zheng zhizheng at mtu.edu
Tue May 2 11:47:11 PDT 2017


Perception and Interaction Dynamics in Child-robot Interaction
Robotics: Science and Systems (RSS) 2017 Workshop
Boston, MA, July 15, 2017

Dear Colleagues,

We are pleased to invite contributions to the RSS 2017 Workshop on Perception
and Interaction Dynamics in Child-robot Interaction.

---------------------------------------------------------------------------

Summary:

For decades, robots have been designed specifically for children, such as
robot companions and therapeutic robots for children with autism spectrum
disorder (ASD). This research has gained attention in various communities,
including robotics, human-machine interaction, and psychology. Two of the
most important aspects of designing these robots are perception and
interaction dynamics. Precise perception allows robots to gather
informative behavioral and affective signals from children, so that the
robots can provide effective prompts and feedback accordingly. The
interaction dynamics between robots and children need to be optimized
toward the goal of enhancing the quality of these children's lives. These
dynamics also elicit specific interaction cues that the robots need to
perceive carefully. We believe that a tight coupling between perception and
interaction dynamics is vital for successful assistance.

While this field holds great promise, a variety of challenges must be
addressed before robots can reach their full potential. Topics to be
discussed during the workshop include, but are not limited to:

· Perception cues and methodologies in child-robot interaction (CRI)

· Defining interaction models in CRI

· Personalization, adaptation, and automation in CRI

· Social cognition and learning in CRI

This workshop aims to bring together researchers from the assistive
robotics, human-robot interaction, human sensing, human factors, and
psychology communities. By combining these communities' strengths and
sharing current progress, we hope to discuss ideas and solutions that
tackle current difficulties, make progress toward robotic systems that have
a positive impact on children, and identify future trends and open
problems.

---------------------------------------------------------------------------
Workshop website:

https://sites.google.com/a/mtu.edu/rss2017-rci-perception-and-interaction/home
---------------------------------------------------------------------------
Call for Contributions:

Researchers interested in presenting should submit a workshop paper (4-6
pages) on one of the following topics related to child-robot interaction:

1). Automatic and/or real-time sensing of children's interaction cues
(e.g., gaze, gestures, touch, speech).

2). Theoretical and experimental studies on interaction dynamics.

3). Longitudinal child-robot interaction.

4). Physical and/or emotional child-robot interaction.

Submissions will undergo a peer review process, and papers will be chosen
based on the originality of the work and its relevance to the workshop
topics. The selected papers will be posted on the workshop website, and
their authors will be asked to present them as posters during the workshop.
We also welcome anyone interested in related topics to participate in the
workshop without a paper.

Please submit your paper to: rss2017crisubmission at gmail.com

Submission deadline: 1 June 2017

Notification: 20 June 2017

Camera ready: 5 July 2017

Workshop: 15 July 2017
Please contact Zhi Zheng at rss2017crisubmission at gmail.com for questions.
---------------------------------------------------------------------------

Keynote Speakers:

*Dr. Brian Scassellati <http://www.cs.yale.edu/homes/scaz/>*

Professor of Computer Science <http://www.cs.yale.edu/>, Cognitive Science
<http://www.yale.edu/cogsci>, and Mechanical Engineering
<http://seas.yale.edu/departments/mechanical-engineering-and-materials-science>
at Yale University <http://www.yale.edu/>, and Director of the NSF
Expedition on Socially Assistive Robotics
<http://www.robotshelpingkids.com/>.

*Dr. Nilanjan Sarkar
<https://engineering.vanderbilt.edu/bio/nilanjan-sarkar>*

Professor of Mechanical Engineering and Electrical Engineering and Computer
Science at Vanderbilt University.

*Dr. Holly Yanco <http://www.cs.uml.edu/~holly/>*

Professor in the Computer Science Department <http://www.cs.uml.edu/> at
UMass Lowell <http://www.uml.edu/>



Invited Speakers:

Dr. Chung Hyuk Park <https://www.seas.gwu.edu/chung-hyuk-park>
Assistant Professor of Biomedical Engineering at The George Washington
University

Dr. Hae Won Park <http://www.haewonpark.com/>
Postdoctoral Associate at the Personal Robots Group
<http://robotic.media.mit.edu/> at the MIT Media Lab
<http://www.media.mit.edu/>

Ms. Franziska Kirstein
<https://www.researchgate.net/profile/Franziska_Kirstein>
Human-Robot Interaction Expert at Blue Ocean Robotics

---------------------------------------------------------------------------

Sincerely,

Zhi Zheng
Research Assistant Professor

Department of Electrical and Computer Engineering

Michigan Technological University

