The second blog post in this series, sharing brief descriptions of the papers we are presenting at the NIPS 2016 conference in Barcelona.

Sequential Neural Models with Stochastic Layers

Authors: Marco Fraccaro, Søren Kaae Sønderby, Ulrich Paquet, Ole Winther

Much of our reasoning about the world is sequential: from listening to sounds, voices, and music, to imagining our steps to reach a destination, to tracking a tennis ball through time. All these sequences have some amount of latent random structure in them. Two powerful and complementary model families, recurrent neural networks (RNNs) and stochastic state space models (SSMs), are widely used to model sequential data like these. RNNs excel at capturing longer-term dependencies in data, while SSMs model the uncertainty in a sequence's underlying latent random structure, making them well suited to tracking and control.

Is it possible to get the best of both worlds? In this paper we show how you can, by carefully layering deterministic (RNN) and stochastic (SSM) layers. We show how to efficiently reason about a sequence's present latent structure given its past (filtering), and given both its past and future (smoothing).

For further details and related work, please see the paper: https://arxiv.org/abs/1605.07571

Check it out at NIPS:
Tue Dec 6th, 05:20 – 05:40 PM @ Area 1+2 (Oral), Deep Learning session
Tue Dec 6th, 06:00 – 09:30 PM @ Area 5+6+7+8, poster #179
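To make the layering idea above concrete, here is a minimal sketch in PyTorch of a deterministic RNN layer feeding a stochastic Gaussian transition over latents z_t. This is an illustrative assumption of one way to stack the two layers, not the authors' implementation; the class name, dimensions, and parameterisation are all hypothetical, and the filtering/smoothing inference networks from the paper are omitted.

    import torch
    import torch.nn as nn

    class StochasticRNNLayer(nn.Module):
        """Illustrative sketch: a deterministic GRU beneath a stochastic layer."""

        def __init__(self, x_dim, h_dim, z_dim):
            super().__init__()
            # Deterministic layer: an ordinary GRU over the input sequence.
            self.rnn = nn.GRU(x_dim, h_dim, batch_first=True)
            # Stochastic layer: Gaussian transition z_t ~ N(mu_t, sigma_t^2),
            # conditioned on the previous latent z_{t-1} and the RNN state h_t.
            self.mu = nn.Linear(h_dim + z_dim, z_dim)
            self.log_sigma = nn.Linear(h_dim + z_dim, z_dim)

        def forward(self, x):
            batch, T, _ = x.shape
            h, _ = self.rnn(x)  # deterministic states, shape (batch, T, h_dim)
            z = x.new_zeros(batch, self.mu.out_features)  # z_0 = 0
            zs = []
            for t in range(T):
                inp = torch.cat([h[:, t], z], dim=-1)
                mu = self.mu(inp)
                sigma = torch.exp(self.log_sigma(inp))
                # Reparameterisation trick: sample z_t differentiably.
                z = mu + sigma * torch.randn_like(mu)
                zs.append(z)
            return torch.stack(zs, dim=1)  # sampled latents, (batch, T, z_dim)

For example, StochasticRNNLayer(x_dim=10, h_dim=32, z_dim=8) applied to a (4, 20, 10) input batch yields a (4, 20, 8) tensor of sampled latents. The point of the layering is that h_t carries long-range deterministic memory while z_t carries the sequence's random structure; how to train such a model with a variational bound, and how to do filtering and smoothing over z_t, is what the paper develops.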