Improving First-Order Optimization Algorithms (Student Abstract)

Authors

  • Ange Tato, Université du Québec à Montréal (UQAM)
  • Roger Nkambou, Université du Québec à Montréal (UQAM)

DOI:

https://doi.org/10.1609/aaai.v34i10.7240

Abstract

This paper presents a simple and intuitive technique for accelerating the convergence of first-order optimization algorithms. The proposed solution modifies the update rule based on the variation in the direction of the gradient and on the previous step taken during training. Experimental results show that the technique has the potential to significantly improve the performance of existing first-order optimization algorithms.
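
The abstract does not spell out the exact update rule, but the idea of adapting a step based on agreement between the current gradient direction and the previous step can be sketched generically. The following is a minimal illustration of that family of techniques, not the authors' method: the `boost` and `damp` factors and the sign-agreement test are assumptions made for the sketch.

```python
def adaptive_gd(grad_fn, x0, lr=0.1, boost=1.1, damp=0.5, steps=100):
    """Gradient descent whose step scale reacts to the agreement between
    the current descent direction and the previous step taken.

    Generic illustration only; `boost` and `damp` are hypothetical
    parameters, not taken from the paper.
    """
    x = float(x0)
    prev_step = 0.0
    scale = 1.0
    for _ in range(steps):
        g = grad_fn(x)
        if -g * prev_step > 0:    # descent direction unchanged: speed up
            scale *= boost
        elif -g * prev_step < 0:  # direction flipped (likely overshoot): slow down
            scale *= damp
        step = -lr * scale * g
        x += step
        prev_step = step
    return x

# Minimize f(x) = x**2 (gradient 2x) starting from x0 = 5.0
x_min = adaptive_gd(lambda x: 2 * x, 5.0)
```

On this toy quadratic, the scale grows while successive steps agree and shrinks after an overshoot, so the iterate contracts toward the minimizer faster than plain gradient descent with the same base learning rate.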

Published

2020-04-03

How to Cite

Ange, T., & Roger, N. (2020). Improving First-Order Optimization Algorithms (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 34(10), 13935-13936. https://doi.org/10.1609/aaai.v34i10.7240

Section

Student Abstract Track