[Deep Learning book] Sequence Modeling
Introduction
Unfolding Computational Graphs
- Essentially any function involving recurrence, $h^t = f(h^{t-1}, x^t; \theta)$, can be considered a recurrent neural network
- When the recurrent network is used in a task where we predict the future from the past, $h^t$ serves as a lossy summary of the task-relevant aspects of the past sequence.
Lossy because a variable-length sequence is mapped to a fixed-length vector
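A minimal NumPy sketch of this unfolded recurrence; the parameter names `W`, `U`, `b` and the `tanh` nonlinearity are illustrative assumptions, not the book's only choice:

```python
import numpy as np

def step(h_prev, x, W, U, b):
    # One application of the shared transition function f:
    # h^t = tanh(W h^(t-1) + U x^t + b)
    return np.tanh(W @ h_prev + U @ x + b)

rng = np.random.default_rng(0)
hidden, inp = 4, 3
W = rng.normal(size=(hidden, hidden))
U = rng.normal(size=(hidden, inp))
b = np.zeros(hidden)

# Unfolding: the same parameters are reused at every time step,
# folding a variable-length sequence into a fixed-size state h.
xs = [rng.normal(size=inp) for _ in range(5)]
h = np.zeros(hidden)
for x in xs:
    h = step(h, x, W, U, b)
print(h.shape)  # fixed-size summary regardless of sequence length -> (4,)
```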
Recurrent Neural Network [RNN]
Design Patterns
Output at each time step, with recurrent connections between hidden units
- This design pattern of RNNs is universal in the sense that any function computable by a Turing machine can be computed by an RNN of finite size.
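A sketch of this pattern under the same illustrative assumptions as above (`tanh` hidden units, parameter names `W`, `U`, `V`, `b`, `c`): hidden-to-hidden recurrence plus a linear readout at every time step.

```python
import numpy as np

def rnn_forward(xs, h0, W, U, V, b, c):
    # h^t = tanh(W h^(t-1) + U x^t + b)  -- hidden-to-hidden recurrence
    # o^t = V h^t + c                    -- output at every time step
    h, outs = h0, []
    for x in xs:
        h = np.tanh(W @ h + U @ x + b)
        outs.append(V @ h + c)
    return outs, h

rng = np.random.default_rng(1)
hidden, inp, out = 4, 3, 2
W = rng.normal(size=(hidden, hidden))
U = rng.normal(size=(hidden, inp))
V = rng.normal(size=(out, hidden))
b, c = np.zeros(hidden), np.zeros(out)

xs = [rng.normal(size=inp) for _ in range(6)]
outs, h = rnn_forward(xs, np.zeros(hidden), W, U, V, b, c)
print(len(outs), outs[0].shape)  # one output per time step
```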
Output at each time step, with recurrent connections only from the output at one time step to the hidden units at the next time step
Output only at the last time step, after reading the entire sequence
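A sketch of this last design pattern, again under the illustrative `tanh`/`W`/`U`/`b` assumptions: the network reads the whole sequence and only its final hidden state is used, e.g. as a fixed-size code for classifying a sentence.

```python
import numpy as np

def encode(xs, W, U, b):
    # Read the whole sequence but emit nothing until the end;
    # the final hidden state summarizes the input.
    h = np.zeros(W.shape[0])
    for x in xs:
        h = np.tanh(W @ h + U @ x + b)
    return h

rng = np.random.default_rng(2)
hidden, inp = 4, 3
W = rng.normal(size=(hidden, hidden))
U = rng.normal(size=(hidden, inp))
b = np.zeros(hidden)

# Sequences of different lengths map to the same fixed-size code.
short = encode([rng.normal(size=inp) for _ in range(3)], W, U, b)
long = encode([rng.normal(size=inp) for _ in range(10)], W, U, b)
print(short.shape, long.shape)
```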