Keras: Activations
A discussion of activation functions in deep learning
https://zhuanlan.zhihu.com/p/25110450
Advanced Activations layers (Keras documentation)
https://keras.io/zh/layers/advanced-activations/
How could we use Leaky ReLU and Parametric ReLU as activation functions?
https://github.com/keras-team/keras/issues/117
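Summarizing the gist of that issue as a minimal sketch: LeakyReLU and PReLU are layers rather than string activations, so they go after a layer that has no activation of its own. The layer sizes, the alpha value, and the use of TensorFlow 2.x with the bundled tf.keras API are assumptions for illustration.

```python
# Minimal sketch, assuming TensorFlow 2.x (tf.keras / Keras 2 API).
# LeakyReLU and PReLU are layers, not string activations, so each is
# placed directly after a Dense layer that has no activation.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(100,)),                # hypothetical 100-dim input
    layers.Dense(64),                         # no built-in activation here
    layers.LeakyReLU(alpha=0.1),              # fixed small negative slope
    layers.Dense(64),
    layers.PReLU(),                           # negative slope learned during training
    layers.Dense(10, activation="softmax"),   # hypothetical 10-class output
])
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()
```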
Visualizing 26 neural network activation functions
The analysis in this article is excellent
https://www.jiqizhixin.com/articles/2017-10-10-3
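A small plotting sketch in the spirit of that visualization article, comparing the ReLU family on one set of axes; the slope values are arbitrary choices for illustration, not taken from the article.

```python
# Sketch: plot ReLU and two leaky variants with NumPy and matplotlib.
# The alpha values (0.1 and 0.3) are arbitrary illustration choices.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-5, 5, 500)

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha):
    # Identity for positive inputs, scaled by alpha for negative inputs.
    return np.where(x > 0, x, alpha * x)

plt.plot(x, relu(x), label="ReLU")
plt.plot(x, leaky_relu(x, 0.1), label="Leaky ReLU (alpha=0.1)")
plt.plot(x, leaky_relu(x, 0.3), label="PReLU-style (alpha=0.3)")
plt.legend()
plt.title("ReLU family")
plt.show()
```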
Activation functions: ReLU, Leaky ReLU, PReLU, and RReLU
https://blog.csdn.net/qq_23304241/article/details/80300149
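For quick reference, the four functions named in that post share one piecewise form and differ only in how the negative-side slope α is chosen (standard definitions stated from general knowledge, not quoted from the linked article):

```latex
\[
\mathrm{ReLU}(x) = \max(0, x), \qquad
f_{\alpha}(x) =
\begin{cases}
x, & x > 0 \\
\alpha x, & x \le 0
\end{cases}
\]
% Leaky ReLU: alpha is a small fixed constant (e.g. 0.01)
% PReLU:      alpha is a learnable parameter
% RReLU:      alpha is sampled uniformly from [l, u] during training
%             and fixed to (l + u) / 2 at inference
```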
Deep Learning Basics (12): ReLU vs PReLU
https://blog.csdn.net/lanchunhui/article/details/52644823