Covers: theory of Rectified Linear Unit
Estimated time needed to finish: 15 minutes
Questions this item addresses:
  • What is ReLU?
  • How to code the ReLU activation function?
  • What are the advantages of ReLU?
How to use this item?

Read sections 1 to 4, and implement the code if desired.

Author(s) / creator(s) / reference(s)
Jason Brownlee

Rectified Linear Units (ReLU)

Total time needed: ~30 minutes
Objectives
Understand the concept of rectified linear units
Potential Use Cases
Mathematical foundations for Deep Learning
Who is this for?
BEGINNER: Deep Learning practitioners new to mathematical foundations
Resources
ARTICLE 1. Rectified Linear Units (ReLU) in Deep Learning
  • What is ReLU?
  • Why does it work?
  • How does ReLU capture interactions and non-linearities?
15 minutes
BOOK_CHAPTER 2. Rectified Linear Units and Their Generalizations
  • What is a Rectified Linear Unit?
10 minutes
OTHER 3. ReLU(x)
  • How is the ReLU function represented?
10 minutes
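For reference, the representation this resource covers is usually written as a piecewise function (a standard definition, not quoted from the resource itself):

```latex
\mathrm{ReLU}(x) = \max(0, x) =
\begin{cases}
x, & x > 0,\\
0, & x \le 0.
\end{cases}
```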
ARTICLE 4. A Gentle Introduction to the Rectified Linear Unit (ReLU)
  • What is ReLU?
  • How to code the ReLU activation function?
  • What are the advantages of ReLU?
15 minutes
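As a quick sketch of the kind of implementation this resource walks through, ReLU is a one-liner in NumPy (the function name `relu` is my own choice, not taken from the article):

```python
import numpy as np

def relu(x):
    # ReLU passes positive inputs through unchanged and clamps
    # everything else to zero: relu(x) = max(0, x), element-wise.
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.0, 3.0])))  # [0. 0. 0. 1. 3.]
```

The element-wise `np.maximum` also makes this work on whole arrays or batches without any explicit loop.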
ARTICLE 5. Intuitive explanation of why ReLU works
  • Why does ReLU work?
10 minutes
