[robotics-worldwide] [meetings] [CFP] NIPS 2016 Symposium: Recurrent Neural Networks and Other Machines that Learn Algorithms
juergen at idsia.ch
Mon Sep 26 12:52:32 PDT 2016
NIPS 2016 Symposium:
Recurrent Neural Networks and Other Machines that Learn Algorithms
Thursday, December 8, 2016, Barcelona
Soon after the birth of modern computer science in the 1930s, two fundamental questions arose: 1. How can computers learn useful programs from experience, as opposed to being programmed by human programmers? 2. How can we program parallel multiprocessor machines, as opposed to traditional serial architectures? Both questions found natural answers in the field of Recurrent Neural Networks (RNNs), which are brain-inspired general-purpose computers that can learn parallel-sequential programs or algorithms encoded as weight matrices.
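To make the "program encoded as weight matrices" idea concrete, here is a minimal sketch of a vanilla RNN step. All names and dimensions are illustrative assumptions, not part of the announcement: the learned program lives entirely in the matrices W_h and W_x.

```python
import numpy as np

# Toy dimensions, chosen purely for illustration.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4
W_h = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # recurrent weights: the learned "program"
W_x = rng.standard_normal((n_hidden, n_in)) * 0.1      # input weights
b = np.zeros(n_hidden)                                 # bias

def rnn_step(h, x):
    """One step of the parallel-sequential computation: all hidden
    units update in parallel, while the loop over time is sequential."""
    return np.tanh(W_h @ h + W_x @ x + b)

h = np.zeros(n_hidden)
for x in rng.standard_normal((5, n_in)):  # a length-5 input sequence
    h = rnn_step(h, x)
```

Training (e.g., backpropagation through time) would adjust W_h, W_x, and b so that this fixed update rule implements the desired algorithm.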
The first RNNaissance NIPS workshop dates back to 2003: http://people.idsia.ch/~juergen/rnnaissance.html . Since then, a lot has happened. Some of the most successful applications in machine learning (including deep learning) are now driven by RNNs such as Long Short-Term Memory (LSTM), e.g., in speech recognition, video recognition, natural language processing, image captioning, and time series prediction. Through the world's most valuable public companies, billions of people can now access this technology on their smartphones and other devices, e.g., in the form of Google Voice or on Apple's iOS. Reinforcement-learning and evolutionary RNNs are solving complex control tasks from raw video input. Many RNN-based methods learn sequential attention strategies.
At this symposium, we will review the latest developments in all of these fields, and focus not only on RNNs, but also on learning machines in which RNNs interact with external memory, such as neural Turing machines, memory networks, and related memory architectures such as fast weight networks and neural stack machines. In this context we will also discuss asymptotically optimal program search methods and their practical relevance.
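A key ingredient shared by such memory-augmented machines is a differentiable read over external memory. The sketch below shows content-based soft addressing in the spirit of neural Turing machines and memory networks; the function names, dimensions, and the sharpness parameter beta are illustrative assumptions, not taken from any specific paper.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

def content_read(memory, key, beta=1.0):
    """Attend over memory rows by cosine similarity to a query key,
    then return the attention-weighted sum: a fully differentiable read."""
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    w = softmax(beta * sims)   # soft address over memory slots
    return w @ memory          # blended read vector

M = np.eye(4)                             # 4 memory slots with one-hot contents
r = content_read(M, np.array([0., 0., 1., 0.]), beta=10.0)
```

Because the read is a smooth function of the key and the memory, gradients flow through it, so an RNN controller can learn where to read (and, with an analogous soft write, where to store) by ordinary gradient descent.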
Our target audience has heard a bit about RNNs, the deepest of all neural networks, but will be happy to hear a summary of the basics again and then delve into the latest advanced topics, to see and understand what has recently become possible. All invited talks will be followed by open discussions, with further discussions during a poster session. Finally, we will also have a panel discussion on the bright future of RNNs, and their pros and cons.
A tentative list of speakers can be found at the symposium website: http://people.idsia.ch/~rupesh/rnnsymposium2016/index.html
Call for Posters
We invite researchers and practitioners to submit poster abstracts for presentation during the symposium (minimum 2 pages; no upper page limit). All contributions related to the symposium theme are encouraged. The organizing committee will select posters to maximize quality and diversity within the available display space.
For submissions, non-anonymous abstracts should be emailed by the corresponding authors to rnn.nips2016 at gmail.com. Selected abstracts will be advertised on the symposium website, and posters will be on display throughout the symposium. NIPS attendees can interact with poster presenters during the light dinner break (6:30 - 7:30 PM). The submission deadline is October 15th, 23:59 CET.
Jürgen Schmidhuber & Sepp Hochreiter & Alex Graves & Rupesh Srivastava