A Mean-field Theory for Quantum Neural Networks

Author: Ryan Cohen

Publication date: 2024

Publisher: Pennsylvania State University

Number of pages: Not available

Abstract

Quantum machine learning is an emerging technology with the potential to solve many problems in data science. Its models are built from large quantum circuits with many trainable parameters, which makes both the model and the training process difficult to analyze, so a mathematical description of the training problem is needed in the large-parameter regime. A previous paper on classical neural networks showed that as the number of parameters $n$ grows, the loss function becomes convex and the approximation error of the network scales as $O(n^{-1})$. In this thesis, we extend those results to the quantum setting. We prove that the loss landscape is asymptotically convex and run numerical experiments intended to show that the error scales as $O(n^{-1})$, but conclude that the latter claim requires further study.
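The scaling claim in the abstract can be checked empirically by fitting the slope of error versus parameter count on a log-log scale; a slope near $-1$ is consistent with $O(n^{-1})$. The sketch below is illustrative only: the `(n, error)` pairs are synthetic placeholders, not the thesis's experimental data, which would come from actual training runs.

```python
import numpy as np

# Hypothetical (n, error) pairs; in practice these would be measured
# approximation errors from training runs at increasing parameter counts.
ns = np.array([16.0, 32.0, 64.0, 128.0, 256.0])
errors = 1.0 / ns  # synthetic data that follows the conjectured O(n^{-1}) law

# Fit log(error) = slope * log(n) + intercept.
# O(n^{-1}) scaling corresponds to slope ≈ -1.
slope, intercept = np.polyfit(np.log(ns), np.log(errors), 1)
print(f"estimated scaling exponent: {slope:.2f}")  # → -1.00 for this synthetic data
```

A fitted slope drifting away from $-1$ as $n$ grows would be one way the numerical evidence could remain inconclusive, as the abstract suggests.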
