Resource Saving via Ensemble Techniques for Quantum Neural Networks
Quantum neural networks hold significant promise for numerous applications, particularly as they can be executed on the current generation of quantum hardware. However, due to limited qubits or hardware noise, conducting large-scale experiments often requires significant resources. Moreover, the output of the model is susceptible to corruption by quantum hardware noise. To address this issue, we propose the use of ensemble techniques, which involve constructing a single machine learning model based on multiple instances of quantum neural networks. In particular, we implement bagging and AdaBoost techniques, with different data loading configurations, and evaluate their performance on both synthetic and real-world classification and regression tasks. To assess the potential performance improvement under different environments, we conducted experiments on both simulated, noiseless software and IBM superconducting-based QPUs, suggesting these techniques can mitigate the quantum hardware noise. Additionally, we quantify the amount of resources saved using these ensemble techniques. Our findings indicate that these methods enable the construction of large, powerful models even on relatively small quantum devices.
Main Authors: | Incudini, Massimiliano; Grossi, Michele; Ceschini, Andrea; Mandarino, Antonio; Panella, Massimo; Vallecorsa, Sofia; Windridge, David |
Language: | eng |
Published: | 2023 |
Subjects: | quant-ph; General Theoretical Physics |
Online Access: | https://dx.doi.org/10.1007/s42484-023-00126-z http://cds.cern.ch/record/2855442 |
_version_ | 1780977460021559296 |
author | Incudini, Massimiliano; Grossi, Michele; Ceschini, Andrea; Mandarino, Antonio; Panella, Massimo; Vallecorsa, Sofia; Windridge, David |
author_sort | Incudini, Massimiliano |
collection | CERN |
description | Quantum neural networks hold significant promise for numerous applications, particularly as they can be executed on the current generation of quantum hardware. However, due to limited qubits or hardware noise, conducting large-scale experiments often requires significant resources. Moreover, the output of the model is susceptible to corruption by quantum hardware noise. To address this issue, we propose the use of ensemble techniques, which involve constructing a single machine learning model based on multiple instances of quantum neural networks. In particular, we implement bagging and AdaBoost techniques, with different data loading configurations, and evaluate their performance on both synthetic and real-world classification and regression tasks. To assess the potential performance improvement under different environments, we conducted experiments on both simulated, noiseless software and IBM superconducting-based QPUs, suggesting these techniques can mitigate the quantum hardware noise. Additionally, we quantify the amount of resources saved using these ensemble techniques. Our findings indicate that these methods enable the construction of large, powerful models even on relatively small quantum devices. |
id | cern-2855442 |
institution | European Organization for Nuclear Research (CERN) |
language | eng |
publishDate | 2023 |
record_format | invenio |
spelling | cern-2855442 | 2023-10-26T06:46:04Z | doi:10.1007/s42484-023-00126-z | http://cds.cern.ch/record/2855442 | eng | Incudini, Massimiliano; Grossi, Michele; Ceschini, Andrea; Mandarino, Antonio; Panella, Massimo; Vallecorsa, Sofia; Windridge, David | Resource Saving via Ensemble Techniques for Quantum Neural Networks | quant-ph; General Theoretical Physics | arXiv:2303.11283 | oai:cds.cern.ch:2855442 | 2023-03-20 |
title | Resource Saving via Ensemble Techniques for Quantum Neural Networks |
topic | quant-ph; General Theoretical Physics |
url | https://dx.doi.org/10.1007/s42484-023-00126-z http://cds.cern.ch/record/2855442 |