ARTICLE
A Gentle Introduction to the Rectified Linear Unit (ReLU)
Covers: theory of the Rectified Linear Unit
Estimated time needed to finish: 15 minutes
Questions this item addresses:
What is ReLU?
How do you code the ReLU activation function?
What are the advantages of ReLU?
How to use this item?
Read sections 1 to 4, and implement the code if desired.
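Since this item asks how to code the ReLU activation function, here is a minimal NumPy sketch (the function name and sample values are illustrative, not from the article):

```python
import numpy as np

def relu(x):
    # ReLU passes positive inputs through unchanged and clamps
    # negative inputs to zero: max(0, x), applied elementwise.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # negative entries become 0.0; positive entries are unchanged
```

`np.maximum` broadcasts, so the same function works for scalars, vectors, or whole activation matrices.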
URL:
https://machinelearningmastery.com/rectified-linear-activation-function-for-deep-learning-neural-networks/
Author(s) / creator(s) / reference(s): Jason Brownlee
Rectified Linear Units (ReLU)
Total time needed: ~30 minutes
Objectives
Understand the concept of rectified linear units
Potential Use Cases
Mathematical foundations for Deep Learning
Who Is This For?
BEGINNER
Deep Learning practitioners new to Mathematical foundations
ARTICLE
1. Rectified Linear Units (ReLU) in Deep Learning
What is ReLU?
Why does it work?
How does ReLU capture interactions and non-linearities?
15 minutes
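Item 1's question about interactions and non-linearities can be previewed with a toy sketch (the weights and the `tiny_net` name are hypothetical, not from the article): composing linear maps alone always yields another linear map, but inserting ReLU between them produces a piecewise-linear, non-linear function.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def tiny_net(x):
    # Two hidden "neurons"; each ReLU introduces a kink in the output.
    h1 = relu(x - 1.0)    # active only for x > 1
    h2 = relu(-x - 1.0)   # active only for x < -1
    return h1 + h2        # V-shaped output: no single linear function matches it

for x in [-3.0, 0.0, 3.0]:
    print(x, tiny_net(x))
```

The output is 2.0 at x = -3 and x = 3 but 0.0 at x = 0, which no linear function can reproduce, so the ReLUs are what make the network non-linear.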
BOOK_CHAPTER
2. Rectified Linear Units and Their Generalizations
What is a Rectified Linear Unit?
10 minutes
OTHER
3. ReLU(x)
How is the ReLU function represented?
10 minutes
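For reference while reading item 3, the ReLU function is commonly written as:

```latex
\mathrm{ReLU}(x) = \max(0, x) =
\begin{cases}
x, & x > 0 \\
0, & x \le 0
\end{cases}
```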
ARTICLE
4. A Gentle Introduction to the Rectified Linear Unit (ReLU)
What is ReLU?
How to code ReLU activation function?
What are the advantages of ReLU?
15 minutes
ARTICLE
5. Intuitive explanation of why ReLU works
Why does ReLU work?
10 minutes