Hokkaido University Collection of Scholarly and Academic Papers (HUSCAP)

Effective neural network training with adaptive learning rate based on training loss

This item is licensed under: Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International

Files in This Item:
NEUNET_takase.pdf (1.41 MB, PDF)
Please use this identifier to cite or link to this item: http://hdl.handle.net/2115/77798

Title: Effective neural network training with adaptive learning rate based on training loss
Authors: Takase, Tomoumi
Oyama, Satoshi
Kurihara, Masahito
Keywords: Multilayer perceptron
Deep learning
Neural network training
Stochastic gradient descent
Learning rate
Beam search
Issue Date: May-2018
Publisher: Elsevier
Journal Title: NEURAL NETWORKS
Volume: 101
Start Page: 68
End Page: 78
Publisher DOI: 10.1016/j.neunet.2018.01.016
Abstract: A method that uses an adaptive learning rate is presented for training neural networks. Unlike most conventional updating methods, in which the learning rate gradually decreases during training, the proposed method increases or decreases the learning rate adaptively so that the training loss (the sum of cross-entropy losses over all training samples) decreases as much as possible. It thus provides a wider search range for solutions and hence a lower test error rate. Experiments in which a multilayer perceptron was trained on several well-known datasets show that the proposed method is effective for obtaining better test accuracy under certain conditions. (c) 2018 Elsevier Ltd. All rights reserved.
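The abstract's core idea can be illustrated with a minimal, hypothetical sketch: raise the learning rate while the training loss keeps falling, and cut it when the loss rises. This is only an illustration of loss-driven rate adaptation, not the authors' exact algorithm (which, per the keywords, involves a beam search over candidate rates); the function name, update factors, and toy objective below are all assumptions.

```python
def adapt_learning_rate(lr, prev_loss, curr_loss, up=1.1, down=0.5):
    """Return a new learning rate based on the change in training loss.

    Hypothetical rule: grow the rate when the loss fell, shrink it when
    the loss rose. The factors `up` and `down` are illustrative choices.
    """
    if curr_loss < prev_loss:
        return lr * up    # loss fell: take larger steps, widening the search
    return lr * down      # loss rose: shrink the step to recover

# Toy demonstration on the quadratic objective f(w) = w**2 (gradient 2*w),
# minimized with plain gradient descent and the adaptive rate above.
w, lr = 5.0, 0.1
prev_loss = w * w
for _ in range(50):
    w -= lr * 2.0 * w          # gradient step
    curr_loss = w * w
    lr = adapt_learning_rate(lr, prev_loss, curr_loss)
    prev_loss = curr_loss

print(prev_loss)  # training loss should end up near zero
```

The rate climbs while progress continues, overshoots eventually, and is halved as soon as the loss increases, so the step size keeps probing upward rather than decaying monotonically as in conventional schedules.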
Rights: ©2018. This manuscript version is made available under the CC-BY-NC-ND 4.0 license https://creativecommons.org/licenses/by-nc-nd/4.0/
Type: article (author version)
URI: http://hdl.handle.net/2115/77798
Appears in Collections:情報科学院・情報科学研究院 (Graduate School of Information Science and Technology / Faculty of Information Science and Technology) > 雑誌発表論文等 (Peer-reviewed Journal Articles, etc)

Submitter: 高瀬 朝海 (Takase, Tomoumi)
