Sequential Neural Models with Stochastic Layers

Description

How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks, which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve the state-of-the-art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performance to competing methods on polyphonic music modeling.
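
As a rough illustration of the architecture the abstract describes, below is a minimal generative-side sketch in PyTorch: a deterministic GRU computes d_t from the input sequence, and a stochastic state-space layer samples z_t conditioned on z_{t-1} and d_t, feeding a Gaussian emission. All layer sizes and module names (SRNNSketch, prior, decoder) are illustrative assumptions on my part, not the paper's reference implementation, and the structured inference network used for training is omitted.

import torch
import torch.nn as nn

class SRNNSketch(nn.Module):
    """Sketch: deterministic GRU glued to a stochastic state-space layer."""
    def __init__(self, u_dim, x_dim, d_dim=64, z_dim=16, h_dim=64):
        super().__init__()
        # Deterministic layer: d_t = f(d_{t-1}, u_t), an ordinary GRU.
        self.rnn = nn.GRU(u_dim, d_dim, batch_first=True)
        # Stochastic layer: Gaussian transition p(z_t | z_{t-1}, d_t).
        self.prior = nn.Sequential(
            nn.Linear(z_dim + d_dim, h_dim), nn.Tanh(),
            nn.Linear(h_dim, 2 * z_dim))
        # Gaussian emission p(x_t | z_t, d_t).
        self.decoder = nn.Sequential(
            nn.Linear(z_dim + d_dim, h_dim), nn.Tanh(),
            nn.Linear(h_dim, 2 * x_dim))
        self.z_dim = z_dim

    def forward(self, u):
        # u: (batch, T, u_dim) -> sampled x: (batch, T, x_dim)
        d, _ = self.rnn(u)                      # deterministic path d_1..d_T
        z = u.new_zeros(u.size(0), self.z_dim)  # initial latent state z_0
        xs = []
        for t in range(u.size(1)):
            d_t = d[:, t]
            # Sample z_t from the prior transition (reparameterization).
            mu_z, logvar_z = self.prior(torch.cat([z, d_t], -1)).chunk(2, -1)
            z = mu_z + torch.randn_like(mu_z) * (0.5 * logvar_z).exp()
            # Sample x_t from the emission distribution.
            mu_x, logvar_x = self.decoder(torch.cat([z, d_t], -1)).chunk(2, -1)
            xs.append(mu_x + torch.randn_like(mu_x) * (0.5 * logvar_x).exp())
        return torch.stack(xs, dim=1)

# Example: sample 8 sequences of length 50 from random inputs.
model = SRNNSketch(u_dim=10, x_dim=20)
x = model(torch.randn(8, 50, 10))  # -> shape (8, 50, 20)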

Day: 2
