[robotics-worldwide] [meetings] [CFP] Late-Breaking Papers for SIGdial 2018 Special Session on Physically Situated Dialogue (RoboDIAL)
sandrist at microsoft.com
Tue Apr 17 11:55:42 PDT 2018
Call for Late-Breaking Papers: SIGdial 2018 Special Session on Physically Situated Dialogue (RoboDIAL)
Deadline for late-breaking and work-in-progress papers: April 28
Recent technologies have brought conversational robots out of the lab and into the homes and workplaces of real users. Dialogue now actively takes place with robots and other smart devices to understand, operate, navigate, and manipulate physical space. *Physically situated dialogue* distinguishes itself from other forms of dialogue in that it (1) takes place in a physical space, (2) refers to the shared surroundings of dialogue partners, and (3) involves a physical agent that can take actions in the world. There is a growing need to showcase bi-directional dialogue work that draws on language grounding and models of vision and language, as well as dialogue that allows physically situated agents to ask for clarification and provide updates on their internal states.
Our objectives in this special session are to showcase recent and ongoing work on physically situated dialogue, and to identify paths forward in this space from research across communities including dialogue, robotics, computer vision, NLP, and AI. The special session will feature presentations, a poster session, and a panel discussion comprising a mix of experts in the topic area. We welcome submissions on any topic related to physically situated dialogue, including but not limited to:
* Interaction studies with smart-home devices
* Learning from demonstration through natural language dialogue
* Explainable AI in physical spaces
* Representations of physical surroundings / world modeling to support grounded communication
* Embodied visual question answering and/or generation
* Empirical studies of human-robot dialogue (Wizard-of-Oz based, simulated, or semi-autonomous)
* Computational models of dialogue management and/or turn-taking with physical agents
* Methods of building or leveraging common ground with physical agents in real-world or simulated environments
* Corpora of physically situated dialogue (Wizard-of-Oz based or otherwise)
* Multimodal information processing to support dialogue (including speech, gaze, gesture)
* Physical embodiment, voice, or personification of robots and their effects on human-robot dialogue
* Communicating feedback from robots using affordances in addition to speech
* Spoken language generation for physically situated dialogue
**Call for Papers**
Researchers may choose to submit:
* *Long papers and short papers* will present original research and go through the regular SIGdial peer review process by the general SIGdial program committee. These papers will appear in the main SIGdial proceedings and are presented with the main track. Long papers must be no longer than eight pages, including title, text, figures and tables, along with two additional pages for example discourses or dialogues and algorithms. Short papers should be no longer than four pages including title, text, figures and tables, along with one additional page for example discourses or dialogues and algorithms. An unlimited number of pages are allowed for references.
* *Late-breaking and work-in-progress papers* will showcase ongoing work and focused, relevant contributions. Submissions need not present original work and are limited to four pages including references. These will be reviewed by the special session organizers and posted on the special session website. These papers will be presented as lightning talks or posters during the session. Authors will retain the copyright to their work so that they may submit to other venues as their work matures.
**Long and short paper abstract deadline: March 11**
**Long and short paper final PDF deadline: March 18**
To submit a long or short paper, please go to the SIGdial 2018 main page (http://www.sigdial.org/workshops/conference19/) for conference submissions. When submitting, indicate "Physically Situated Dialogue" as the candidate special session. All long and short submissions must follow the SIGdial 2018 format.
**Late-breaking and work-in-progress paper deadline: April 28**
To submit a late-breaking or work-in-progress paper, please email a 2-4 page PDF (including references), formatted using the SIGdial 2018 format guidelines, to robodial at googlegroups.com by April 28. You will receive a confirmation of your submission and notification before the Early Bird Registration deadline.
**List of Organizers**
* Sean Andrist, Microsoft Research
* Stephanie Lukin, Army Research Lab
* Matthew Marge, Army Research Lab
* Jesse Thomason, University of Texas at Austin
* Zhou Yu, University of California at Davis
For more information and updates please check the special session website at <https://robodial.github.io>