[robotics-worldwide] [meetings] Call For Participation - GAZE 2019 @ ICCV19: Call for Papers & Extended Abstract

Hyung Jin Chang h.j.chang at bham.ac.uk
Wed Jul 3 10:06:29 PDT 2019


Dear colleague,

We are organising the International Workshop on Gaze Estimation and
Prediction in the Wild (GAZE 2019) this year, jointly with ICCV 2019 in
Seoul, South Korea. Please consider submitting a workshop paper or an
extended abstract of ongoing or recently published work!

Please see the call for papers below.

Best regards,

GAZE 2019 organisers

************************************************************************
                 International Workshop on
         Gaze Estimation and Prediction in the Wild
                         GAZE 2019

             https://gazeworkshop.github.io

          Seoul, South Korea - October 27th, 2019

              in conjunction with ICCV 2019
************************************************************************
---------------
CALL FOR PAPERS
---------------
The 1st Workshop on Gaze Estimation and Prediction in the Wild (GAZE 2019)
at ICCV 2019 is a first-of-its-kind workshop focused on designing and
evaluating deep learning methods for gaze estimation and prediction. We aim
to encourage and highlight novel strategies with a focus on robustness and
accuracy in real-world settings, for example through novel neural network
architectures, the incorporation of anatomical insights and constraints,
new and challenging datasets, and multi-modal training, among other
directions. This half-day workshop consists of three invited talks as well
as lightning talks from industry contributors.

The topics of this workshop include but are not limited to:

- Proposal of novel eye detection, gaze estimation, and gaze prediction
pipelines using deep convolutional neural networks.
- Incorporating geometric and anatomical constraints into neural networks
in a differentiable manner.
- Demonstration of robustness to conditions where current methods fail
(e.g. illumination, appearance, low resolution).
- Robust estimation from different data modalities such as RGB, depth, and
near-infrared.
- Leveraging additional cues such as task context, temporal information,
and eye movement classification.
- Designing new accurate metrics to account for rapid eye movements in the
real world.
- Semi-supervised / unsupervised / self-supervised learning, meta-learning,
domain adaptation, attention mechanisms, and other related machine learning
methods for gaze estimation.
- Methods for temporal gaze estimation and prediction, including Bayesian
methods.

--------------------------
CALL FOR EXTENDED ABSTRACTS
--------------------------
In addition to regular workshop papers, we also invite extended abstracts
of ongoing or published work (e.g. related papers in the ICCV main
conference). We see this as an opportunity for authors to promote their
work to an interested audience.
Extended abstracts are limited to three pages.
Submissions must be sent to gaze.iccv19 at gmail.com.

---------------
IMPORTANT DATES
---------------
Submission deadline: July 29, 2019
Notification of acceptance: August 16, 2019
Camera ready deadline: August 30, 2019
Workshop: October 27, 2019 (Morning)

* Extended abstract deadline: TBD (in September)

----------------
INVITED SPEAKERS
----------------
Andreas Bulling, University of Stuttgart, Germany
Yusuke Sugano, University of Tokyo, Japan

-------------------
WORKSHOP ORGANIZERS
-------------------
Hyung Jin Chang, University of Birmingham, UK
Seonwook Park, ETH Zürich, Switzerland
Xucong Zhang, ETH Zürich, Switzerland
Otmar Hilliges, ETH Zürich, Switzerland
Aleš Leonardis, University of Birmingham, UK

-----------------
WORKSHOP SPONSORS
-----------------
Samsung
NVIDIA
TOBII

-------
CONTACT
-------
gaze.iccv19 at gmail.com

*--------------------------------------------*
*Hyung Jin Chang*, PhD
Lecturer, Intelligent Robotics Laboratory
School of Computer Science
University of Birmingham
Edgbaston, Birmingham, UK
https://www.cs.bham.ac.uk/~changhj
*--------------------------------------------*