
Natural Gradient Descent

View Report

Abstract

Gradient descent is a technique used to optimize a given cost function for a learning model. Although it is widely used, it has several limitations. This work explicates natural gradient descent, an optimization algorithm for finding a local minimum of a differentiable function. We start by defining the concepts necessary to understand gradient descent, then explain the limitations of standard gradient descent, which is generally used to find the parameter values that minimize a cost function. Next, we introduce natural gradient descent, explain its theoretical background, and give examples demonstrating its efficiency in optimizing parameters along a required path and on a given data set. Finally, we introduce the quantum analog of natural gradient descent, the quantum natural gradient, which is useful for optimizing the parameters of variational quantum circuits: we first construct a variational circuit, then optimize its parameters using both the Qiskit and PennyLane libraries. In the end, we compare the natural and standard gradient algorithms on the variational circuit.
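To make the update rule concrete: where standard gradient descent steps along the raw gradient, natural gradient descent preconditions the step with the inverse Fisher information matrix, θ ← θ − η F⁻¹ ∇L(θ). Below is a minimal, illustrative NumPy sketch on a hypothetical least-squares problem; the data, learning rate, and damping term are assumptions for demonstration, not code from the report. It uses the empirical Fisher matrix built from per-sample gradients:

```python
import numpy as np

# Hypothetical toy problem (not from the report):
# least-squares loss L(w) = mean((X @ w - y)**2) / 2
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

def natural_gradient_step(w, lr=0.1, damping=1e-3):
    residuals = X @ w - y
    per_sample_grads = X * residuals[:, None]      # gradient of each sample's loss
    grad = per_sample_grads.mean(axis=0)           # ordinary (average) gradient
    # Empirical Fisher: average outer product of per-sample gradients
    fisher = per_sample_grads.T @ per_sample_grads / len(X)
    fisher += damping * np.eye(len(w))             # damping keeps F invertible
    return w - lr * np.linalg.solve(fisher, grad)  # w <- w - lr * F^{-1} grad

w = np.zeros(3)
for _ in range(200):
    w = natural_gradient_step(w)
```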
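For the quantum analog, PennyLane ships a built-in quantum natural gradient optimizer, `qml.QNGOptimizer`, which preconditions each step with a (block-diagonal approximation of the) Fubini-Study metric tensor. A minimal sketch on a hypothetical two-qubit circuit follows; the ansatz, observable, step size, and iteration count are illustrative assumptions, not the variational circuit constructed in the report:

```python
import pennylane as qml
from pennylane import numpy as np  # autograd-wrapped NumPy for trainable parameters

dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def circuit(params):
    # Hypothetical two-qubit variational ansatz (not the report's circuit)
    qml.RY(params[0], wires=0)
    qml.RX(params[1], wires=1)
    qml.CNOT(wires=[0, 1])
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

params = np.array([0.5, 0.1], requires_grad=True)
opt = qml.QNGOptimizer(stepsize=0.05)  # quantum natural gradient steps

for _ in range(100):
    params = opt.step(circuit, params)

print("optimized params:", params, "cost:", circuit(params))
```

Swapping in `qml.GradientDescentOptimizer` with the same step size gives the standard-gradient baseline against which the report compares the natural gradient.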

Index Terms

Optimization, Gradient Descent, Natural Gradient Descent, Quantum Natural Gradient Descent, Variational Quantum Circuit

Acknowledgment

The work in this paper has not been formally published.

About

An Efficient Optimization Method: Natural Gradient Descent
