Recipe


### Rectified Linear Units (ReLU)

**Total time needed:** ~30 minutes

**Objectives**
- Understand the concept of rectified linear units
- Potential use cases
- Mathematical foundations for deep learning

**Who is this for?**
- BEGINNER: Deep learning practitioners new to mathematical foundations

Each of the following **annotated items** is detailed below.


ARTICLE 1. Rectified Linear Units (ReLU) in Deep Learning

- What is ReLU?
- Why does it work?
- How does ReLU capture interactions and non-linearities?

15 minutes

BOOK_CHAPTER 2. Rectified Linear Units and Their Generalizations

- What is a Rectified Linear Unit?

10 minutes

OTHER 3. ReLU(x)

- How is the ReLU function represented?

10 minutes
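As background for the representation question in resource 3: ReLU is standardly written as the piecewise function

$$\mathrm{ReLU}(x) = \max(0, x) = \begin{cases} x & \text{if } x > 0 \\ 0 & \text{otherwise} \end{cases}$$

that is, it passes positive inputs through unchanged and clips negative inputs to zero.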

ARTICLE 4. A Gentle Introduction to the Rectified Linear Unit (ReLU)

- What is ReLU?
- How to code ReLU activation function?
- What are the advantages of ReLU?

15 minutes
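The coding question in resource 4 can be answered with a few lines. Below is a minimal sketch using NumPy (an assumption — the linked article may implement it differently, e.g. with a framework's built-in activation):

```python
import numpy as np

def relu(x):
    # Elementwise max(0, x): positive inputs pass through,
    # negative inputs are clipped to zero.
    return np.maximum(0.0, x)

# Works on scalars and arrays alike.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```

Using `np.maximum` (rather than masking or a loop) keeps the function vectorized, so it applies to whole activation tensors at once.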

ARTICLE 5. Intuitive explanation of why ReLU works

10 minutes
