**Recurrent Neural Network**

By Prof. Seungchul Lee

http://iailab.kaist.ac.kr/

Industrial AI Lab at KAIST



- RNNs are a family of neural networks for processing sequential data

- Without recurrence, a model needs separate parameters for each value of the time index

- Such a model cannot share statistical strength across different time indices

- Input at each time is a vector
- Each layer has many neurons
- Output layer too may have many neurons
- But we will represent everything with simple boxes
- Each box actually represents an entire layer with many units

- The state-space model

- This is a recurrent neural network
- State summarizes information about the entire past
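The state-space recurrence described above can be written compactly. Using standard (here assumed) notation, with input $x_t$, state $h_t$, and output $y_t$:

$$
h_t = f(h_{t-1}, x_t), \qquad y_t = g(h_t)
$$

The same functions $f$ and $g$ are applied at every time step, so the state $h_t$ is the only channel through which the entire past influences the present.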

- Single Hidden Layer RNN (Simplest State-Space Model)

- Multiple Recurrent Layer RNN
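As a minimal sketch of the single-hidden-layer case, the forward pass can be unrolled in a few lines of NumPy. The dimensions, weight names (`W_xh`, `W_hh`), and random toy inputs here are illustrative assumptions, not values from the slides:

```python
import numpy as np

# Hypothetical sizes: input x_t in R^3, hidden state h_t in R^4
rng = np.random.default_rng(0)
W_xh = rng.standard_normal((4, 3)) * 0.1   # input-to-hidden weights
W_hh = rng.standard_normal((4, 4)) * 0.1   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)

def rnn_forward(xs):
    """Run a single-hidden-layer RNN over a sequence of input vectors."""
    h = np.zeros(4)                        # initial state: no past to summarize
    states = []
    for x in xs:                           # the same parameters are reused at every step
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

xs = rng.standard_normal((5, 3))           # toy sequence of length 5
hs = rnn_forward(xs)
print(hs.shape)                            # one hidden state per time step
```

Note that the loop reuses one set of weights at every step; that parameter sharing is what lets the network generalize across time indices.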

- Recurrent Neural Network
- Simplified models often drawn
- The loops imply recurrence

- Gradients propagated over many stages tend to either **vanish** or **explode**
- Difficulty with long-term dependencies arises from the exponentially smaller weights given to long-term interactions
- Introduce a memory state that runs through only linear operators
- Use gating units to control the updates of the state
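These two ideas, a memory state touched only by linear operators and gates that control its updates, are exactly the LSTM cell. The sketch below shows one step; sizes, weight names, and the toy inputs are assumptions for illustration:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

n_x, n_h = 3, 4                            # hypothetical input / hidden sizes
rng = np.random.default_rng(0)
W = {g: rng.standard_normal((n_h, n_x + n_h)) * 0.1 for g in "fioc"}
b = {g: np.zeros(n_h) for g in "fioc"}

def lstm_step(x, h, c):
    """One LSTM step: gating units control updates to the memory state c."""
    z = np.concatenate([x, h])
    f = sigmoid(W["f"] @ z + b["f"])       # forget gate: what to keep in memory
    i = sigmoid(W["i"] @ z + b["i"])       # input gate: what to write to memory
    o = sigmoid(W["o"] @ z + b["o"])       # output gate: what to expose as h
    c_tilde = np.tanh(W["c"] @ z + b["c"]) # candidate memory content
    c = f * c + i * c_tilde                # memory updated by linear operators only
    h = o * np.tanh(c)                     # hidden state read out from memory
    return h, c

h, c = np.zeros(n_h), np.zeros(n_h)
for x in rng.standard_normal((5, n_x)):    # toy sequence of length 5
    h, c = lstm_step(x, h, c)
print(h.shape, c.shape)
```

Because the memory update `c = f * c + i * c_tilde` involves no repeated nonlinear squashing, gradients along the memory path avoid the exponential shrinkage that plagues plain recurrences.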

Example: "I grew up in France… I speak fluent *French*."