Adaptive Quantum Transformer: Adapter-Based Parameter Reuse for Dynamic Qubit Scaling in Vision Transformers
Hyochan Kim, Jong Hwan Ko
International Journal of Pattern Recognition and Artificial Intelligence (IF 1.1)
Abstract

Transformers excel across many domains but often demand massive computational resources. Quantum Neural Networks (QNNs) combine classical and quantum computing to mitigate these costs. However, QNNs typically fix the number of qubits, forcing complete retraining whenever the qubit count changes and leading to substantial inefficiency. We propose the Adaptive Quantum Transformer (AQT) to resolve this limitation by introducing an adapter module, originally developed for large language models (LLMs), into a quantum vision Transformer. The adapter reuses parameters when the qubit count expands or contracts, preserving learned states. We validate AQT in an autoscaling cloud environment, demonstrating dynamic qubit scaling without sacrificing accuracy or retraining from scratch. Results on vision tasks show that AQT achieves higher accuracy than a conventional quantum vision Transformer while converging faster and requiring fewer computational resources. Notably, AQT more than triples training-time efficiency and remains more stable under fluctuating qubit availability, suggesting a promising avenue for practical, cost-effective quantum-classical hybrid intelligence.
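
To make the adapter-based parameter-reuse idea concrete, the following is a minimal, hypothetical sketch in plain NumPy, not the authors' implementation. The class name `QubitAdapter`, the `rescale` method, and the per-qubit parameter layout are illustrative assumptions; the sketch only shows how learned adapter parameters could be kept when the available qubit count shrinks or grows, instead of retraining from scratch.

```python
import numpy as np


class QubitAdapter:
    """Hypothetical adapter that stores one parameter row per qubit.

    When the available qubit count changes, existing (learned) rows are
    reused: contraction keeps a prefix of rows, expansion appends freshly
    initialized rows while preserving everything already learned.
    """

    def __init__(self, n_qubits, dim, rng=None):
        self.rng = rng or np.random.default_rng(0)
        # One row of adapter parameters per qubit (e.g., rotation angles).
        self.weights = 0.01 * self.rng.standard_normal((n_qubits, dim))

    def rescale(self, new_n_qubits):
        """Expand or contract to `new_n_qubits`, preserving learned rows."""
        old_n, dim = self.weights.shape
        if new_n_qubits <= old_n:
            # Contraction: keep the first `new_n_qubits` learned rows.
            self.weights = self.weights[:new_n_qubits]
        else:
            # Expansion: reuse all learned rows, initialize only the new ones.
            extra = 0.01 * self.rng.standard_normal((new_n_qubits - old_n, dim))
            self.weights = np.vstack([self.weights, extra])
        return self.weights


# Usage: simulate an autoscaling event from 8 qubits down to 4 and back up to 12.
adapter = QubitAdapter(n_qubits=8, dim=3)
adapter.rescale(4)            # learned state for the remaining 4 qubits is kept
adapter.rescale(12)           # 4 reused rows + 8 freshly initialized rows
print(adapter.weights.shape)  # (12, 3)
```
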

Keywords
Qubit, Quantum computer, Quantum, Transformer, Scaling, Robustness (evolution)
Type
article
IF / Citations
1.1 / 0
Publication Year
2026