Sentient VP of Research Risto Miikkulainen Recaps the Metalearning Panel at the NIPS Conference

You may have heard of NIPS, the Neural Information Processing Systems Conference. NIPS is the largest and most influential AI conference in the world, this year boasting nearly 8,000 attendees and selling out in just 12 days.

Many of the movers and shakers of the AI world were speaking and/or participating in discussions, including DeepMind CEO Demis Hassabis, Uber AI contrarian Gary Marcus, and of course, Sentient’s very own VP of Research and Professor of Computer Science at the University of Texas at Austin, Risto Miikkulainen. Risto spoke at the Metalearning Symposium, one of the four symposia within the conference, and one that Sentient co-sponsored and co-organized.

The Metalearning Symposium functioned as a kind of conference within a conference. It focused on research into how learning methods can be used to improve other learning methods, discussing emerging topics in the field including evolutionary computation, self-play, benchmark design, open-ended learning, one-shot learning, and multi-level learning. Risto was joined by 12 other metalearning experts from both academia (Pieter Abbeel, Roman Garnett, Frank Hutter, Satinder Singh) and industry (Max Jaderberg, Ilya Sutskever, Oriol Vinyals, Jane Wang), as well as co-organizers Chrisantha Fernando from DeepMind, Quoc Le from Google Brain, and Ken Stanley from Uber/UCF.

So what exactly is metalearning? Metalearning can be understood simply as one learning method optimizing the configuration of another, which then learns the actual task. Common approaches include Bayesian optimization, reinforcement learning, gradient descent, and evolutionary optimization. In a nutshell, it’s training models to train models to learn a task.
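To make that two-level structure concrete, here is a minimal sketch in Python. It is not Sentient’s actual system; the toy task, the hyperparameters, and the use of the learning rate as the “configuration” are all illustrative. An outer evolutionary loop searches over the learning rate that an inner gradient-descent learner uses to fit a simple regression task:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task for the *inner* learner: fit y = 3x + 1 by gradient descent.
X = rng.uniform(-1, 1, size=100)
y = 3 * X + 1 + rng.normal(0, 0.1, size=100)

def inner_learn(lr, steps=50):
    """Inner loop: gradient descent on (w, b), configured by the outer loop's lr."""
    w = b = 0.0
    for _ in range(steps):
        err = w * X + b - y
        w -= lr * 2 * np.mean(err * X)
        b -= lr * 2 * np.mean(err)
    return np.mean((w * X + b - y) ** 2)  # final training loss (lower is better)

# Outer loop: evolutionary search over the learning rate, i.e. the
# "configuration" of the inner learner. Mutation happens in log space.
population = [10 ** rng.uniform(-3, 0) for _ in range(10)]
for generation in range(20):
    survivors = sorted(population, key=inner_learn)[:5]   # keep the best half
    children = [float(np.clip(lr * 10 ** rng.normal(0, 0.2), 1e-4, 1.0))
                for lr in survivors]
    population = survivors + children

best = min(population, key=inner_learn)
print(f"evolved learning rate: {best:.4f}  final loss: {inner_learn(best):.5f}")
```

The same outer/inner split applies whichever method sits in each loop: the outer method could just as well be Bayesian optimization or reinforcement learning, and the configuration being evolved could be an architecture rather than a single hyperparameter.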

Sentient’s presentation at the symposium showed how gated memory units (such as LSTMs, or Long Short-Term Memory units) designed through evolutionary optimization achieve top performance in language modeling, a standard benchmark. Risto also showed how evolutionary optimization significantly improves accuracy on the Omniglot multitask benchmark by evolving the modules and the topologies of deep learning networks. In time, this approach could make it possible to automatically optimize neural network architectures for new tasks.
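As a rough illustration of what “evolving topologies” means (again, a sketch under our own assumptions, not the method presented in the talk), the snippet below evolves the hidden-layer widths of a small scikit-learn MLP on the bundled digits dataset, selecting architectures by held-out accuracy. The dataset, mutation operators, and search budget are all illustrative:

```python
import random

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

random.seed(0)
X, y = load_digits(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(genome):
    """Train an MLP with the encoded topology; score it on held-out data."""
    # max_iter is kept small for speed; convergence warnings are harmless here.
    clf = MLPClassifier(hidden_layer_sizes=tuple(genome), max_iter=150,
                        random_state=0)
    clf.fit(X_tr, y_tr)
    return clf.score(X_val, y_val)

def mutate(genome):
    """Randomly grow, shrink, or resize the list of hidden-layer widths."""
    g = list(genome)
    op = random.choice(["add", "remove", "resize"])
    if op == "add" and len(g) < 3:
        g.insert(random.randrange(len(g) + 1), random.choice([16, 32, 64]))
    elif op == "remove" and len(g) > 1:
        g.pop(random.randrange(len(g)))
    else:
        i = random.randrange(len(g))
        g[i] = max(4, g[i] + random.choice([-16, 16]))
    return g

# Simple generational loop: keep the best topologies, then mutate them.
population = [[random.choice([16, 32, 64])] for _ in range(6)]
for gen in range(4):
    survivors = sorted(population, key=fitness, reverse=True)[:3]
    population = survivors + [mutate(p) for p in survivors]

best = max(population, key=fitness)
print("best topology:", best, " val accuracy:", round(fitness(best), 3))
```

Evolving the topology (how many layers, and how wide) rather than just the weights is what lets the search adapt a network’s structure to a new task.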

Risto and the Research team he oversees at Sentient were also one of the top-scoring teams in the NIPS 2017 Learning to Run competition, which drew nearly 500 entries. In this competition, participants were provided with a human musculoskeletal model and a physics-based simulation environment and tasked with developing a controller that enables the physiologically-based human model to navigate a bumpy obstacle course, running and jumping over stones, as quickly as possible.

Stay tuned for more updates!

For more information on our research-related news and publications, click here.