[robotics-worldwide] [meetings] Deadline approaching! -- CFP IROS16 - See, Touch and Hear: 2nd Workshop on multimodal sensor-based robot control

Andrea Cherubini andrea.cherubini at lirmm.fr
Thu Aug 25 05:56:20 PDT 2016

See, Touch and Hear: 2nd Workshop on multimodal sensor-based robot control for HRI and soft manipulation
Oct 10, Daejeon, Korea 

***Related Special Issue in the journal ROBOTICS AND AUTONOMOUS SYSTEMS, including extended versions of the best papers!!!***


The recent development of bio-inspired sensors (which are nowadays affordable and lightweight) has spurred their use on robots, particularly on anthropomorphic ones (e.g., humanoids and dexterous hands). Such sensors include RGB-D cameras, tactile skins, microphones, joint torque sensors, and capacitive proximity sensors.
Since these sensors generally measure different physical phenomena, it is preferable to use them directly in the low-level servo controller rather than to apply multi-sensor fusion or to design complex state machines. We believe that adaptive sensor-based methods directly linking perception to action can provide better solutions in unpredictable scenarios than traditional planning and model-based techniques, which require a priori models of the environment. This idea, originally proposed in the hybrid position-force control paradigm, brings new challenges to controller design when extended to feedback from multiple sensors, e.g., challenges related to the sensors' characteristics (synchronization, hybrid control, task compatibility, etc.) or to the task representation.
This multimodal approach best represents our cognitive processes (which directly link perception and action) and is fundamental in many innovative robotic applications, such as human-robot interaction, soft material manipulation, and whole-body control.
Human-robot interaction for collaborative tasks often relies on force/tactile feedback to transmit the user's intention to the robot. However, the robot should also be capable of recognizing this intention when there is no direct contact with the human. Possible solutions come from audio and/or visual data, which should then be combined with haptics to obtain the best result. These approaches are particularly useful in whole-body control of humanoid robots, since their actuators and sensors are generally bio-inspired to facilitate interaction with humans.
The automatic manipulation of soft materials (e.g., in the food industry) represents a second important case study. The natural evolution of recent work on vision-based servoing of soft objects is the integration of haptic and force feedback.
Whole-body control is a third field of research that would greatly profit from the discussed methods. In fact, multiple tasks (manipulation, self-collision avoidance, etc.) can be realized simultaneously by exploiting the diverse sensing capabilities of the robot body.
We propose a half-day workshop to foster active collaboration and discuss formal methods for sensor-based control. The purpose of the workshop is to bring together researchers with common interests in the area of multimodal control, based on a variety of sensed signals, including vision (2D and 3D), touch (haptics), audio, position, force, and proximity (from capacitive measurements). The invited speakers will share their experience and give insight into the evolution and current status of multimodal control. The workshop will also be open to paper submissions, and the final schedule will be adapted depending on the quantity and quality of the submissions. We will organize a poster session for the submitted papers to ease interaction and discussion among participants.

Topics of interest

- hands-on applications where multimodal control is necessary
- whole-body control with heterogeneous sensors
- bio-inspired approaches to multimodal control
- theoretical foundations of multimodal control (e.g., task frame approaches or constraint-based task specification)
- new trends in sensor-based control, based on prospective integration with other modalities (e.g., visual deformation servoing, tactile/haptic servoing, robot audition, proximity servoing)

Submission Information

Prospective participants are required to submit an extended abstract (maximum 2 pages in length); videos are also welcome!
All submissions will be reviewed in a single-blind process.
Accepted contributions will be presented during the workshop as posters.
Submissions must be sent as a PDF, following the IEEE conference style (two columns), to cherubini_AT_lirmm_DOT_fr,
indicating [IROS 2016 Workshop] in the e-mail subject.
- Submission Deadline: August 29
- Notification of acceptance: September 2
- Camera-ready deadline: September 6
- Workshop day: October 10


Andrea Cherubini
Youcef Mezouar
David Navarro-Alarcon
Mario Prats

RAS TC Support:

- Whole-Body Control, 
- Humanoid Robotics,
- Robotic Hands, Grasping and Manipulation,
- Haptics,
- Human-Robot Interaction & Coordination.



Andrea Cherubini
Associate Professor - MCF
Responsable Master Robotique
LIRMM / Université de Montpellier
