The rectified linear activation function overcomes the vanishing gradient problem, allowing models to learn faster and perform better. The rectified linear activation is the default activation function when developing multilayer perceptrons and convolutional neural networks.
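For illustration, below is a minimal NumPy sketch of the rectified linear function, f(x) = max(0, x); the function name `relu` and the sample inputs are illustrative choices, not part of the original text. Because the gradient is a constant 1 for every positive input (rather than shrinking toward zero as with saturating functions such as the sigmoid), gradients can flow through deep networks without vanishing.

```python
import numpy as np

def relu(x):
    # Rectified linear activation: returns x for positive inputs, 0 otherwise
    return np.maximum(0.0, x)

def relu_gradient(x):
    # Derivative of ReLU: 1 for positive inputs, 0 for negative inputs
    return (x > 0).astype(float)

# Sample inputs spanning negative and positive values (illustrative only)
inputs = np.array([-3.0, -1.0, 0.0, 2.0, 5.0])
print(relu(inputs))           # [0. 0. 0. 2. 5.]
print(relu_gradient(inputs))  # [0. 0. 0. 1. 1.]
```

The constant gradient of 1 on the positive side is what keeps backpropagated error signals from shrinking layer after layer, which is the practical reason ReLU-based networks tend to train faster than sigmoid- or tanh-based ones.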