SCHEDULING OF NEURAL NETWORK PARAMETERS

ABSTRACT

The idea is to automatically adjust the batch size of the input and the learning rate of a neural network based on the training loss. Two approaches are used in this project: one follows the trend of the recent loss, and the other uses reinforcement learning. Both have been applied to neural networks trained to solve differential equations.
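The loss-trend approach can be sketched as follows. This is a minimal illustration, not the project's actual implementation: the class name, the window-averaging heuristic, and all parameter defaults (decay factor, growth factor, bounds) are assumptions. The idea is to compare the average loss over the most recent window against the window before it; when the loss plateaus or worsens, the learning rate is decayed and the batch size is grown.

```python
class LossTrendScheduler:
    """Hypothetical scheduler: adapts learning rate and batch size
    from the trend of the training loss."""

    def __init__(self, lr=0.01, batch_size=32, window=5,
                 lr_decay=0.5, batch_growth=2,
                 min_lr=1e-6, max_batch_size=512):
        self.lr = lr
        self.batch_size = batch_size
        self.window = window          # number of steps per trend window
        self.lr_decay = lr_decay      # multiplicative LR decay on plateau
        self.batch_growth = batch_growth
        self.min_lr = min_lr
        self.max_batch_size = max_batch_size
        self.history = []             # recorded loss values

    def step(self, loss):
        """Record one loss value; return the (possibly updated) lr and batch size."""
        self.history.append(loss)
        if len(self.history) < 2 * self.window:
            return self.lr, self.batch_size
        # Compare the mean loss of the last window against the window before it.
        recent = sum(self.history[-self.window:]) / self.window
        previous = sum(self.history[-2 * self.window:-self.window]) / self.window
        if recent >= previous:  # loss has plateaued or increased
            self.lr = max(self.lr * self.lr_decay, self.min_lr)
            self.batch_size = min(self.batch_size * self.batch_growth,
                                  self.max_batch_size)
            self.history.clear()  # restart trend estimation after a change
        return self.lr, self.batch_size


# Usage: a flat loss triggers an adjustment; a decreasing loss does not.
sched = LossTrendScheduler(lr=0.01, batch_size=32, window=3)
for _ in range(6):
    lr, bs = sched.step(1.0)   # plateaued loss
```

In a training loop, `step` would be called once per epoch (or per fixed number of iterations) with the measured loss, and the returned values would be written back into the optimizer and data loader. The reinforcement-learning variant would instead treat the adjustment as an action chosen by a learned policy.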




DOWNLOADS


> GitHub



KEY REFERENCES

> L. L. Lima, J. R. Ferreira and M. C. Oliveira, “Efficient Hyperparameter Optimization of Convolutional Neural Networks on Classification of Early Pulmonary Nodules”, 2019 IEEE 32nd International Symposium on Computer-Based Medical Systems (CBMS), pp. 144-149, 2019

> S. Khandelwal, S. Rana, K. Pandey and P. Kaushik, “Analysis of Hyperparameter Tuning in Neural Style Transfer”, 2018 Fifth International Conference on Parallel, Distributed and Grid Computing (PDGC), 2018

> N. Kamiyama, H. Izumi, N. Iijima, H. Mitsui, M. Yoshida and M. Sone, “Tuning of learning rate and momentum on backpropagation”, IJCNN-91-Seattle International Joint Conference on Neural Networks, vol. 2, pp. 963, 1991

> S. Roy, “Factors influencing the choice of a learning rate for a back-propagation neural network”, Proceedings of 1994 IEEE International Conference on Neural Networks