Deep Learning basics for video — Convolutional Neural Networks (CNNs) — Part 2
Mar 1, 2025
This article explains the activation functions used in neural networks, such as sigmoid, tanh, and ReLU, highlighting their advantages and limitations. It describes the vanishing gradient problem, in which gradients shrink as they are propagated backward through many layers, stalling learning in the early layers of deep networks. The article also covers how backpropagation adjusts weights using gradients to improve model predictions. Finally, it explains pooling layers and fully connected layers, essential components of convolutional neural networks for feature reduction and decision making.
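To make the activation-function comparison concrete, here is a minimal sketch (assuming NumPy; the function names are illustrative, not from the article) of sigmoid, tanh, and ReLU alongside their derivatives. The derivatives show why the first two invite vanishing gradients: sigmoid's gradient never exceeds 0.25 and tanh's never exceeds 1, so multiplying them across many layers shrinks the signal, while ReLU's gradient is exactly 1 for active units.

```python
import numpy as np

def sigmoid(x):
    # Squashes inputs into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # Derivative s * (1 - s) peaks at 0.25 (at x = 0).
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_grad(x):
    # Derivative 1 - tanh(x)^2 is at most 1 and decays quickly away from 0.
    return 1.0 - np.tanh(x) ** 2

def relu(x):
    # Passes positive inputs through unchanged, zeroes the rest.
    return np.maximum(0.0, x)

def relu_grad(x):
    # Gradient is exactly 1 for positive inputs, 0 otherwise.
    return (x > 0).astype(x.dtype)

x = np.array([-4.0, -1.0, 0.5, 3.0])
print(sigmoid_grad(x))  # all <= 0.25: stacked layers multiply these down
print(tanh_grad(x))     # <= 1.0, small for saturated inputs
print(relu_grad(x))     # 0 or 1: active units keep their gradient intact
```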
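The weight adjustment the article attributes to backpropagation boils down to the update rule w ← w − η · ∂L/∂w. A toy sketch under assumptions of my own (a single linear neuron with squared loss; nothing here comes from the article) shows one such step computed with the chain rule:

```python
def train_step(w, b, x, y_true, lr=0.05):
    """One gradient step for y_pred = w*x + b with loss (y_pred - y_true)**2."""
    y_pred = w * x + b
    dloss = 2.0 * (y_pred - y_true)  # dL/dy_pred
    grad_w = dloss * x               # chain rule: dL/dw = dL/dy_pred * x
    grad_b = dloss                   # dL/db = dL/dy_pred * 1
    return w - lr * grad_w, b - lr * grad_b

w, b = 0.0, 0.0
for _ in range(50):
    w, b = train_step(w, b, x=2.0, y_true=10.0)
print(w, b)  # the error 2*w + b - 10 halves each step and approaches 0
```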
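For the pooling and fully connected layers mentioned at the end, the sketch below (again NumPy-based, with hypothetical names and a made-up 3-class output) downsamples a feature map with 2x2 max pooling, flattens it, and feeds it through a dense layer to produce class scores:

```python
import numpy as np

def max_pool_2x2(feature_map):
    """Downsample an (H, W) map by keeping the max of each 2x2 block."""
    h, w = feature_map.shape
    fm = feature_map[: h - h % 2, : w - w % 2]  # trim odd edges
    return fm.reshape(fm.shape[0] // 2, 2, fm.shape[1] // 2, 2).max(axis=(1, 3))

def fully_connected(x, weights, bias):
    """Dense layer: every input feature connects to every output unit."""
    return x @ weights + bias

fm = np.arange(16, dtype=float).reshape(4, 4)  # a toy 4x4 feature map
pooled = max_pool_2x2(fm)  # (2, 2): keeps the strongest activation per block
flat = pooled.reshape(-1)  # flatten spatial features for the dense layer
rng = np.random.default_rng(0)
W = rng.normal(size=(flat.size, 3))  # 3 hypothetical output classes
b = np.zeros(3)
print(fully_connected(flat, W, b))   # raw class scores (logits)
```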