Covers: theory of Rectified Linear Unit

- What is ReLU?
- How to code ReLU activation function?
- What are the advantages of ReLU?
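As a preview of the coding question above, ReLU can be sketched in a few lines of NumPy. This is an illustrative implementation, not code taken from any of the readings below:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns x for positive inputs, 0 otherwise."""
    return np.maximum(0.0, x)

# Applied element-wise: negative inputs are clipped to zero.
print(relu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))  # [0. 0. 0. 1. 3.]
```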

Read sections 1 to 4, and implement the code if desired.

Jason Brownlee

Objectives

- Understand the concept of rectified linear units

Potential Use Cases

- Mathematical foundations for Deep Learning

Who is This For?

- BEGINNER: Deep Learning practitioners new to mathematical foundations

Details for each of the following **annotated items** are given below.

ARTICLE 1. Rectified Linear Units (ReLU) in Deep Learning

- What is ReLU?
- Why does it work?
- How does ReLU capture interactions and non-linearities?

15 minutes

BOOK_CHAPTER 2. Rectified Linear Units and Their Generalizations

- What is a Rectified Linear Unit?

10 minutes

OTHER 3. ReLU(x)

- How is the ReLU function represented?

10 minutes
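For reference, the representation that item 3 asks about is conventionally written as a piecewise function:

```latex
\mathrm{ReLU}(x) = \max(0, x) =
\begin{cases}
x & \text{if } x > 0 \\
0 & \text{if } x \le 0
\end{cases}
```

Its derivative is 1 for positive inputs and 0 for negative inputs, which is what makes it cheap to compute during backpropagation.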

ARTICLE 4. A Gentle Introduction to the Rectified Linear Unit (ReLU)

- What is ReLU?
- How to code ReLU activation function?
- What are the advantages of ReLU?

15 minutes

ARTICLE 5. Intuitive explanation of why ReLU works

- Why does ReLU work?

10 minutes
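One common intuition covered by article 5 is that ReLU helps avoid vanishing gradients: a sigmoid's derivative is at most 0.25, so chaining many sigmoid layers multiplies small numbers toward zero, while ReLU's derivative is exactly 1 for positive inputs. A minimal numerical sketch of that contrast (the depth of 10 layers is an illustrative assumption):

```python
import math

def sigmoid_grad(x):
    s = 1.0 / (1.0 + math.exp(-x))
    return s * (1.0 - s)  # peaks at 0.25 when x = 0

def relu_grad(x):
    return 1.0 if x > 0 else 0.0  # exactly 1 for any positive input

# Multiply gradients across 10 hypothetical layers, as backpropagation does.
sig_chain, relu_chain = 1.0, 1.0
for _ in range(10):
    sig_chain *= sigmoid_grad(0.0)   # 0.25 per layer, even at the sigmoid's best point
    relu_chain *= relu_grad(1.0)     # 1.0 per layer for active units

print(sig_chain)   # ~9.5e-07: the sigmoid gradient has all but vanished
print(relu_chain)  # 1.0: the ReLU gradient passes through unchanged
```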