**Recurrent Neural Network**

By Prof. Seungchul Lee

http://iai.postech.ac.kr/

Industrial AI Lab at POSTECH



In [2]:

```
%%html
<center><iframe src="https://www.youtube.com/embed/pxPu2IdnEiE?rel=0"
width="560" height="315" frameborder="0" allowfullscreen></iframe></center>
```

In [3]:

```
%%html
<center><iframe src="https://www.youtube.com/embed/SwkS73r4J5c?rel=0"
width="560" height="315" frameborder="0" allowfullscreen></iframe></center>
```

In [4]:

```
%%html
<center><iframe src="https://www.youtube.com/embed/DkWE9FonbO0?rel=0"
width="560" height="315" frameborder="0" allowfullscreen></iframe></center>
```

- RNNs are a family of neural networks for processing sequential data

- A model with separate parameters for each value of the time index cannot share statistical strength across different time indices

- RNNs instead share the same parameters across all time steps

- Input at each time step is a vector
- Each layer has many neurons
- The output layer, too, may have many neurons
- But we will represent everything with simple boxes
- Each box actually represents an entire layer with many units

- The state-space model

- This is a recurrent neural network
- The state summarizes information about the entire past
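The state-space recurrence can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the lecture; the shapes, the random initialization, and the $\tanh$ nonlinearity are illustrative assumptions. The key point is that the same weights `W` and `U` are applied at every time step, and the state `h` carries information about the entire past.

```python
import numpy as np

# Sketch of the recurrence h_t = tanh(W h_{t-1} + U x_t + b).
# All shapes and weights below are illustrative assumptions.
rng = np.random.default_rng(0)
n_input, n_hidden = 3, 4

W = rng.standard_normal((n_hidden, n_hidden)) * 0.1  # recurrent weights (shared over time)
U = rng.standard_normal((n_hidden, n_input)) * 0.1   # input weights (shared over time)
b = np.zeros(n_hidden)

x_seq = rng.standard_normal((5, n_input))  # a length-5 input sequence
h = np.zeros(n_hidden)                     # initial state

for x_t in x_seq:
    h = np.tanh(W @ h + U @ x_t + b)       # state summarizes the entire past

print(h.shape)  # (4,)
```

Because `W`, `U`, and `b` are reused at every step, statistical strength is shared across time indices, unlike a model with separate parameters per step.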

- Single Hidden Layer RNN (Simplest State-Space Model)

- Multiple Recurrent Layer RNN
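A multiple-layer RNN simply feeds the hidden-state sequence of one recurrent layer into the next. The sketch below (again with hypothetical shapes and weights, not the lecture's code) stacks two of the simple recurrent layers defined above.

```python
import numpy as np

def rnn_layer(x_seq, W, U, b):
    """Run one recurrent layer over a sequence; return the hidden state at every step."""
    h = np.zeros(W.shape[0])
    states = []
    for x_t in x_seq:
        h = np.tanh(W @ h + U @ x_t + b)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(1)
T, n_in, n_h = 6, 3, 4
x_seq = rng.standard_normal((T, n_in))

# Layer 1 reads the input sequence; layer 2 reads layer 1's hidden states.
W1, U1, b1 = rng.standard_normal((n_h, n_h)) * 0.1, rng.standard_normal((n_h, n_in)) * 0.1, np.zeros(n_h)
W2, U2, b2 = rng.standard_normal((n_h, n_h)) * 0.1, rng.standard_normal((n_h, n_h)) * 0.1, np.zeros(n_h)

h1 = rnn_layer(x_seq, W1, U1, b1)   # shape (6, 4)
h2 = rnn_layer(h1, W2, U2, b2)      # shape (6, 4)
print(h2.shape)
```

Note that the second layer's input weights `U2` are sized to accept the first layer's hidden state rather than the raw input.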

- Recurrent Neural Network
- Simplified diagrams are often drawn
- The loops imply recurrence

- Gradients propagated over many stages tend to either **vanish** or **explode**
- Difficulty with long-term dependencies arises from the exponentially smaller weights given to long-term interactions
- Introduce a memory state that runs through only linear operators
- Use gating units to control the updates of the state
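The idea of a linear memory path with gated updates can be sketched as one step of an LSTM-style cell. This is an illustrative implementation under assumed shapes and weights, not the lecture's code: the cell state `c` is updated only by elementwise multiplication and addition (no repeated squashing), while sigmoid gates decide what to forget, what to write, and what to expose.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
n_in, n_h = 3, 4                                      # illustrative sizes
Wz = rng.standard_normal((4 * n_h, n_in + n_h)) * 0.1 # all gate weights stacked
bz = np.zeros(4 * n_h)

def lstm_step(x, h, c):
    a = Wz @ np.concatenate([x, h]) + bz
    f = sigmoid(a[0*n_h:1*n_h])     # forget gate: keep or erase old memory
    i = sigmoid(a[1*n_h:2*n_h])     # input gate: admit new candidate content
    o = sigmoid(a[2*n_h:3*n_h])     # output gate: expose memory as hidden state
    g = np.tanh(a[3*n_h:4*n_h])     # candidate memory content
    c_new = f * c + i * g           # linear path for the memory state
    h_new = o * np.tanh(c_new)
    return h_new, c_new

h, c = np.zeros(n_h), np.zeros(n_h)
for x_t in rng.standard_normal((5, n_in)):
    h, c = lstm_step(x_t, h, c)
print(h.shape, c.shape)  # (4,) (4,)
```

Because the update `c_new = f * c + i * g` is additive, gradients can flow along the memory state over many steps without shrinking exponentially, which is exactly what the France/French example requires: the cell can carry "France" forward until "fluent ___" needs it.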

Example: "I grew up in France… I speak fluent *French*."