Important Terms in Deep Learning (TBU)
15 Feb 2019
Terminology List
- Weights
- Hidden Layer
- Gradient
- Exploding Gradient Problem
- Vanishing Gradient Problem
- Activation Function
- ReLU (Rectified Linear Unit)
- Sigmoid Function
- Cost Function
- Backpropagation
- Learning Rate
- Batch, Epoch, Iteration
- Dropout
- Pooling, Padding
Weights
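Weights are the learnable parameters of a network: each input to a neuron is multiplied by a weight, and training adjusts these values to reduce the cost function. Below is a minimal sketch, with made-up toy values, of a single neuron computing its weighted sum plus a bias.

```python
import numpy as np

# A single neuron: multiply inputs by their weights, sum, and add a bias.
x = np.array([0.5, -1.2, 3.0])   # toy inputs
w = np.array([0.8, 0.1, -0.4])   # toy weights (in practice, learned during training)
b = 0.2                          # bias term

z = np.dot(w, x) + b             # the neuron's pre-activation output
print(z)                         # roughly -0.72
```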
Hidden Layer
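A hidden layer is any layer between the input and output layers; it transforms its inputs with a weight matrix, a bias vector, and an activation function. A small sketch of a forward pass through one hidden layer, assuming random toy weights and a ReLU activation:

```python
import numpy as np

# Forward pass through one hidden layer: 3 inputs -> 4 hidden units (toy sizes).
rng = np.random.default_rng(0)
x = rng.normal(size=3)           # input vector
W = rng.normal(size=(4, 3))      # hidden-layer weight matrix
b = np.zeros(4)                  # hidden-layer biases

h = np.maximum(0, W @ x + b)     # ReLU applied to the layer's weighted sums
print(h.shape)                   # (4,)
```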
Gradient
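The gradient is the vector of partial derivatives of the cost with respect to the parameters. It points in the direction of steepest increase, so gradient descent steps in the opposite direction. A toy single-parameter example with cost f(w) = (w - 3)^2:

```python
# Cost f(w) = (w - 3)**2 has gradient df/dw = 2 * (w - 3).
def cost(w):
    return (w - 3) ** 2

def gradient(w):
    return 2 * (w - 3)

w = 0.0
print(gradient(w))   # -6.0: the cost falls fastest if we move w opposite the gradient, toward 3
```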
Exploding Gradient Problem
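Backpropagation multiplies one factor per layer; if those factors are consistently larger than 1, the gradient grows exponentially with depth and the updates become unstable (often producing NaNs). Gradient clipping is a common mitigation. A toy illustration with an assumed factor of 1.5 per layer:

```python
# Factors consistently > 1 blow the gradient up as depth increases.
grad = 1.0
for _ in range(50):              # pretend 50 layers, each contributing a factor of 1.5
    grad *= 1.5
print(grad)                      # roughly 6.4e8 -- an "exploded" gradient

# A common mitigation is to clip the gradient to a fixed range (here [-5, 5]).
clipped = max(-5.0, min(5.0, grad))
print(clipped)                   # 5.0
```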
Vanishing Gradient Problem
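The opposite problem: with saturating activations such as the sigmoid, whose derivative is at most 0.25, the repeated per-layer factors are smaller than 1 and the gradient shrinks toward zero, so the earliest layers learn very slowly. ReLU activations and careful initialization help. A toy illustration:

```python
# The sigmoid's derivative is at most 0.25, so long chains of sigmoid layers
# shrink the gradient toward zero.
grad = 1.0
for _ in range(50):              # pretend 50 layers, each contributing a 0.25 factor
    grad *= 0.25
print(grad)                      # roughly 7.9e-31 -- effectively zero for early layers
```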
Activation Functions
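An activation function is the nonlinearity applied to a neuron's weighted sum. Without one, any stack of layers collapses into a single linear transformation, as the small check below (using random toy matrices) shows.

```python
import numpy as np

# Two linear layers with no activation collapse into one linear map (W2 @ W1),
# so without nonlinearities extra layers add no expressive power.
rng = np.random.default_rng(1)
x = rng.normal(size=3)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

print(np.allclose(W2 @ (W1 @ x), (W2 @ W1) @ x))   # True
```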
ReLU
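ReLU (Rectified Linear Unit) outputs max(0, x): positive values pass through unchanged and negatives become zero. It is cheap to compute and does not saturate for positive inputs, which helps against vanishing gradients. A minimal sketch:

```python
import numpy as np

def relu(x):
    # Keep positive values, zero out negatives.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))   # [0.  0.  0.  1.5 3. ]
```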
Sigmoid Function
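The sigmoid function, sigmoid(x) = 1 / (1 + e^(-x)), squashes any real number into the interval (0, 1), which makes it a natural choice for probabilities in binary outputs; its tendency to saturate is what drives the vanishing gradient problem above. A minimal sketch:

```python
import numpy as np

def sigmoid(x):
    # Squash any real number into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-5.0, 0.0, 5.0])))   # roughly [0.0067 0.5 0.9933]
```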
Cost Function
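The cost (or loss) function measures how far the network's predictions are from the targets; training means minimizing it. Mean squared error is a common choice for regression, cross-entropy for classification. A sketch of MSE with toy values:

```python
import numpy as np

# Mean squared error: a common cost function for regression problems.
def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.8, 3.4])
print(mse(y_true, y_pred))       # roughly 0.07
```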
Backpropagation
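Backpropagation computes the gradient of the cost with respect to every weight by applying the chain rule backwards, from the output layer toward the input layer. A hand-worked sketch for a single sigmoid neuron with a squared-error cost, using toy numbers:

```python
import numpy as np

# One sigmoid neuron with a squared-error cost:
#   forward:  z = w*x + b,  a = sigmoid(z),  cost = (a - y)**2
#   backward: chain rule gives dcost/dw = 2*(a - y) * a*(1 - a) * x
x, y = 1.5, 1.0                  # toy input and target
w, b = 0.2, 0.0                  # toy parameters

z = w * x + b
a = 1.0 / (1.0 + np.exp(-z))
dcost_dw = 2 * (a - y) * a * (1 - a) * x
print(dcost_dw)                  # the gradient backprop would use to update w
```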
Learning Rate
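The learning rate is the step size used when updating parameters along the negative gradient. Too large and training diverges; too small and it crawls. A sketch of plain gradient descent on f(w) = (w - 3)^2 with an assumed rate of 0.1:

```python
# Gradient descent on f(w) = (w - 3)**2 with a fixed learning rate.
learning_rate = 0.1
w = 0.0
for _ in range(100):
    grad = 2 * (w - 3)               # gradient of the cost at the current w
    w -= learning_rate * grad        # step against the gradient
print(w)                             # very close to the minimum at w = 3
```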
Batch, Epoch, Iteration
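A batch is the subset of training examples processed in one forward/backward pass, an iteration is one parameter update (one batch), and an epoch is one full pass over the training set. With toy numbers:

```python
import math

# Toy numbers: 1,000 training examples, batches of 32.
num_examples = 1000
batch_size = 32

iterations_per_epoch = math.ceil(num_examples / batch_size)
print(iterations_per_epoch)      # 32 iterations (weight updates) make up one epoch
```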
Dropout
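Dropout is a regularization technique that randomly zeroes a fraction of activations during training so units cannot co-adapt; at test time all units are kept. A sketch of the common "inverted dropout" formulation, assuming a keep probability of 0.8:

```python
import numpy as np

# Inverted dropout at training time: zero a random subset of activations and
# rescale the survivors so their expected value is unchanged. Disabled at test time.
rng = np.random.default_rng(42)
keep_prob = 0.8                          # keep ~80% of units (assumed value)
h = rng.normal(size=10)                  # toy hidden-layer activations

mask = rng.random(10) < keep_prob
h_dropped = h * mask / keep_prob
print(h_dropped)
```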
Pooling, Padding
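Pooling downsamples a feature map by summarizing small windows (typically taking the max or average), while padding adds a border (usually zeros) around the input so that convolutions can cover the edges and preserve spatial size. A NumPy sketch on a toy 4x4 feature map:

```python
import numpy as np

# Toy 4x4 feature map.
fmap = np.arange(16, dtype=float).reshape(4, 4)

# 2x2 max pooling with stride 2: keep the maximum of each 2x2 block.
pooled = fmap.reshape(2, 2, 2, 2).max(axis=(1, 3))

# Zero padding: add a one-pixel border of zeros around the feature map.
padded = np.pad(fmap, pad_width=1)

print(pooled.shape)              # (2, 2)
print(padded.shape)              # (6, 6)
```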
References
- https://www.analyticsvidhya.com/blog/2017/05/25-must-know-terms-concepts-for-beginners-in-deep-learning/