Comparative analysis of LCNet050 and MobileNetV3 architectures in hybrid quantum–classical neural networks for image classification

Authors

DOI:

https://doi.org/10.20535/2786-8729.7.2025.333887

Keywords:

Neural Networks, Quantum Computing, Hybrid Neural Networks, Image Classification

Abstract

This study explores the impact of classical backbone architecture on the performance of hybrid quantum-classical neural networks in image classification tasks. Hybrid models combine the representational power of classical deep learning with the potential advantages of quantum computation. Specifically, this research employs a quanvolutional neural network architecture in which a quantum convolutional layer, based on a four-qubit Ry circuit, preprocesses input images before classical processing. 

Despite the growing interest in hybrid models, few studies have systematically investigated how variations in classical architecture design affect the overall performance of hybrid quantum-classical neural networks. To address this gap, we compare two lightweight convolutional backbones – MobileNetV3Small050 and LCNet050 – integrated with an identical quantum preprocessing layer. Both models are evaluated on the CIFAR-10 dataset using 5-fold stratified cross-validation. Performance is assessed using multiple metrics, including accuracy, macro- and micro-averaged area under the curve, and class-wise confusion matrices. 
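The evaluation protocol above can be sketched with scikit-learn. As stand-ins (not the paper's setup), the example uses the small digits dataset instead of CIFAR-10 and a logistic-regression classifier instead of the hybrid models; it shows 5-fold stratified cross-validation with accuracy and macro- and micro-averaged AUC. Note that scikit-learn's multiclass `roc_auc_score` supports macro averaging directly, while the micro average requires binarized labels.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import StratifiedKFold
from sklearn.preprocessing import label_binarize

# Stand-ins: digits instead of CIFAR-10, logistic regression
# instead of the hybrid quantum-classical models.
X, y = load_digits(return_X_y=True)
classes = np.unique(y)

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
accs, macro_aucs, micro_aucs = [], [], []
for train_idx, test_idx in skf.split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    proba = clf.predict_proba(X[test_idx])
    accs.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))
    # macro AUC: one-vs-rest, averaged equally over classes
    macro_aucs.append(roc_auc_score(y[test_idx], proba,
                                    multi_class="ovr", average="macro"))
    # micro AUC: binarize labels and pool all class decisions
    y_bin = label_binarize(y[test_idx], classes=classes)
    micro_aucs.append(roc_auc_score(y_bin, proba, average="micro"))

print(f"accuracy  {np.mean(accs):.3f} +/- {np.std(accs):.3f}")
print(f"macro AUC {np.mean(macro_aucs):.3f}")
print(f"micro AUC {np.mean(micro_aucs):.3f}")
```

Stratification keeps the per-fold class proportions equal to those of the full dataset, which is what makes the macro-averaged metrics comparable across folds.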

The results indicate that the LCNet-based hybrid model consistently outperforms its MobileNet counterpart, achieving higher overall accuracy and area under the curve scores, along with improved class balance and robustness in distinguishing less-represented classes. These findings underscore the critical role of classical backbone selection in hybrid quantum-classical architectures. While the quantum layer remains fixed, the synergy between quantum preprocessing and classical feature extraction significantly affects model performance.

This study contributes to a growing body of work on quantum-enhanced learning systems by demonstrating the importance of classical design choices. Future research may extend these insights to alternative datasets, deeper or transformer-based backbones, and more expressive quantum circuits.

Author Biographies

Arsenii Khmelnytskyi, National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv

PhD student, Computer Engineering Department, Faculty of Informatics and Computer Technique

Yuri Gordienko, National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”, Kyiv

Professor, Computer Engineering Department, Faculty of Informatics and Computer Technique; Doctor of Sciences in Physics and Mathematics, Senior Research Fellow

References

H. I. G. Hernández, R. T. Ruiz, and G. Sun, “Image Classification via Quantum Machine Learning,” arXiv preprint, 2020. https://doi.org/10.48550/arxiv.2011.02831

S. An, C. Lee, H. Moon, and J. Park, “An Ensemble of Simple Convolutional Neural Network Models for MNIST Digit Recognition,” arXiv preprint, 2020. https://doi.org/10.48550/arxiv.2008.10400

S. Resch and U. R. Karpuzcu, “Quantum Computing: An Overview Across the System Stack,” arXiv preprint, 2019. https://doi.org/10.48550/arxiv.1905.07240

T. Begušić and G. K.-L. Chan, “Fast classical simulation of evidence for the utility of quantum computing before fault tolerance,” arXiv preprint, 2023. https://doi.org/10.48550/arxiv.2306.16372

A. Mari, T. Bromley, J. Izaac, M. Schuld, and N. Killoran, “Transfer Learning in Hybrid Classical-Quantum Neural Networks,” Quantum, vol. 4, p. 340, Oct. 2020. https://doi.org/10.22331/q-2020-10-09-340

Z. Chen, R. Zhang, L. Yu, and Y. Gao, “VQNet: Library for a Quantum-Classical Hybrid Neural Network,” arXiv preprint, 2019. https://doi.org/10.48550/arxiv.1901.09133

K. He, X. Zhang, S. Ren, and J. Sun, “Deep Residual Learning for Image Recognition,” arXiv preprint, 2015. https://doi.org/10.48550/arxiv.1512.03385

C. Cui, J. Wang, B. Wu, and Y. Gao, “PP-LCNet: A Lightweight CPU Convolutional Neural Network,” arXiv preprint, 2021. https://doi.org/10.48550/arxiv.2109.15099

D. Lykov et al., “Tensor Network Quantum Simulator With Step-Dependent Parallelization,” arXiv preprint, 2020. https://doi.org/10.48550/arxiv.2012.02430

A. De Lorenzis, A. Giusti, M. Pierini, and S. Carrazza, “Harnessing quantum extreme learning machines for image classification,” Phys. Rev. Applied, vol. 23, no. 4, Apr. 2025. https://doi.org/10.1103/PhysRevApplied.23.044024

D. Shepherd, “On the Role of Hadamard Gates in Quantum Circuits,” arXiv preprint arXiv:quant-ph/0508153, Mar. 2006. https://doi.org/10.48550/arXiv.quant-ph/0508153

Y. Gordienko, A. Khmelnytskyi, V. Taran, and S. Stirenko, “Hybrid Neural Networks with Multi-channel Quanvolutional Input for Medical Image Classification,” in Trends in Sustainable Computing and Machine Intelligence, ICTSM 2024. Algorithms for Intelligent Systems. Singapore: Springer, 2025, pp. 181–193. https://doi.org/10.1007/978-981-96-1452-3_15

S. Kakkar et al., “Enhancing energy efficiency and classification modeling through a combined approach of LightGBM and stratified k-fold cross-validation,” Electric Power Components and Systems, pp. 1–19, 2024. https://doi.org/10.1080/15325008.2024.2315213

IBM Quantum, “Qiskit Documentation.” [Online]. Available: https://qiskit.org/documentation/. Accessed on: Oct. 8, 2025.

“Kaggle notebook for MobileNetV3Small050.” [Online]. Available: https://www.kaggle.com/code/arseniykhmelnitskiy/cifar10-full-q5-w4-mobnv3-mb1-batch-64-t. Accessed on: Oct. 8, 2025.


Published

2025-12-27

How to Cite

[1] A. Khmelnytskyi and Y. Gordienko, “Comparative analysis of LCNet050 and MobileNetV3 architectures in hybrid quantum–classical neural networks for image classification”, Inf. Comput. and Intell. syst. j., no. 7, pp. 49–60, Dec. 2025.