In recent years, with the development of quantum machine learning, Quantum Neural Networks (QNNs) have gained increasing attention in Natural Language Processing (NLP) and have achieved a series of promising results. However, most existing QNN models focus on the architectures of the Quantum Recurrent Neural Network (QRNN) and the Quantum Self-Attention Mechanism (QSAM). In this work, we propose a novel QNN model based on quantum convolution. We develop a quantum depthwise convolution that significantly reduces the number of parameters and lowers computational complexity. We also introduce a multi-scale feature fusion mechanism that enhances model performance by integrating word-level and sentence-level features. Additionally, we propose quantum word embedding and quantum sentence embedding, which provide embedding vectors more efficiently. Through experiments on two benchmark text classification datasets, we demonstrate that our model outperforms a wide range of state-of-the-art QNN models. Notably, our model achieves a new state-of-the-art test accuracy of 96.77% on the RP dataset. We also show the advantage of our quantum model over its classical counterparts: it achieves higher test accuracy with fewer parameters. Finally, an ablation study confirms the effectiveness of the multi-scale feature fusion mechanism and the quantum depthwise convolution in enhancing model performance.
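For intuition on why depthwise convolution shrinks the parameter count, a minimal sketch of the classical analogue is shown below; the channel and kernel sizes are hypothetical and not taken from the paper, and the quantum version operates on circuit parameters rather than weight tensors.

```python
# Parameter counts for a 1-D convolution layer (classical analogue only;
# c_in, c_out, k are illustrative values, not from the paper).
c_in, c_out, k = 64, 64, 3

# Standard convolution: every output channel mixes all input channels.
standard_params = c_in * c_out * k      # 64 * 64 * 3 = 12288

# Depthwise convolution: one filter per input channel, no cross-channel mixing.
depthwise_params = c_in * k             # 64 * 3 = 192

print(standard_params, depthwise_params)
print(standard_params // depthwise_params)  # reduction factor equals c_out
```

The reduction factor scales with the number of output channels, which is the same per-channel-filtering idea the quantum depthwise convolution exploits to cut trainable parameters.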