Submitted: 2020-03-18
Abstract: Commonly used activation functions are studied and compared. ArcReLU, a new activation function proposed in earlier work, is improved to achieve faster convergence and lower computational cost. Experiments show that the improved ArcReLU function not only significantly accelerates the training of back-propagation (BP) neural networks, but also effectively reduces the training error and avoids the vanishing-gradient problem.
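The abstract does not reproduce the function's definition. As a minimal illustrative sketch only, assuming from the name that ArcReLU pairs ReLU's identity branch on positive inputs with an arctangent branch on negative inputs (the paper's improved formulation may differ):

    import numpy as np

    def arc_relu(x):
        # Hypothetical ArcReLU-style activation: identity for x >= 0 (as in
        # ReLU), arctan for x < 0. The negative branch keeps a small nonzero
        # gradient, so neurons neither die nor saturate to zero gradient.
        return np.where(x >= 0, x, np.arctan(x))

    def arc_relu_grad(x):
        # Corresponding derivative: 1 for x >= 0, 1 / (1 + x^2) for x < 0.
        return np.where(x >= 0, 1.0, 1.0 / (1.0 + x ** 2))

A smooth, bounded negative branch of this kind is one plausible way to retain gradient flow for negative inputs, consistent with the abstract's claim of avoiding vanishing gradients; the actual improved definition should be taken from the paper itself.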
Article ID: 20215017    CLC Number: TP18    Document Code:
Funding: Shanghai Natural Science Foundation (19ZR1420800).
Citation:
XU Feifei, XU Yunjie. Optimization of Activation Function Based on ArcReLU Function[J]. Journal of Shanghai University of Electric Power, 2021, 37(5): 507-511.