【Featured in Journal】Adaptive Interpolating Quantum Transform: A Quantum-Native Framework for Efficient Transform Learning
- 田中拓哉
A research study led by Gekko of Quemix, proposing a novel method called AIQT (Adaptive Interpolating Quantum Transform) to overcome training difficulties in quantum machine learning, has been published in Physical Review A, a journal of the American Physical Society.
In conventional quantum machine learning, deeper circuits mean more trainable parameters, which makes training difficult due to issues such as vanishing gradients (so-called barren plateaus). This study introduces AIQT, a quantum-native framework built from unitary operations that can continuously interpolate between multiple quantum transforms, such as between the Hadamard transform and the quantum Fourier transform (QFT).
This approach realizes highly expressive transformations of quantum states with only a small number of trainable parameters, enabling efficient transform learning in a quantum-native manner.
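AIQT's concrete circuit construction is given in the paper; as a loose, hypothetical illustration of what "continuously interpolating between the Hadamard transform and the QFT with a single parameter" can mean, the numpy sketch below blends the skew-Hermitian generators of the two unitaries. All function names here are ours, not the paper's.

```python
import numpy as np
from scipy.linalg import expm

def unitary_log(U):
    """Skew-Hermitian A with expm(A) == U, via the spectral theorem (U is normal)."""
    w, V = np.linalg.eig(U)
    return V @ np.diag(1j * np.angle(w)) @ np.linalg.inv(V)

def hadamard(n):
    """n-qubit Hadamard transform as a 2^n x 2^n matrix."""
    H1 = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
    H = np.array([[1.0]])
    for _ in range(n):
        H = np.kron(H, H1)
    return H

def qft(n):
    """n-qubit quantum Fourier transform matrix."""
    N = 2 ** n
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

def interpolated_transform(theta, n):
    """U(theta): Hadamard transform at theta=0, QFT at theta=1, unitary in between."""
    A = unitary_log(hadamard(n))
    B = unitary_log(qft(n))
    return expm((1 - theta) * A + theta * B)

n = 2
U0 = interpolated_transform(0.0, n)   # recovers the Hadamard transform
U1 = interpolated_transform(1.0, n)   # recovers the QFT
Uh = interpolated_transform(0.5, n)   # an in-between transform, still unitary
```

Because any real combination of skew-Hermitian generators is itself skew-Hermitian, U(theta) stays unitary for every theta, so a single continuous parameter sweeps through a family of valid quantum transforms.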
The approach builds on our previous work, General Transform (GT), published in May 2025, which demonstrated how to learn optimal combinations of multiple transforms in classical machine learning. This study extends that concept into quantum circuits.
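The GT idea as summarized here, learning how to weight several fixed transforms, can be sketched in a few lines of numpy; this toy example (our own construction, not GT's actual formulation) learns a single mixing weight between the identity and the DFT by gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 8, 100
F = np.exp(2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)  # DFT
I = np.eye(N)

def mix(alpha):
    # learnable combination of two fixed transforms
    return (1 - alpha) * I + alpha * F

alpha_true = 0.3                      # hidden ground-truth mixture
X = rng.normal(size=(N, M))           # input signals (columns)
Y = mix(alpha_true) @ X               # observed transformed signals

alpha, lr = 0.9, 0.05
for _ in range(500):
    R = mix(alpha) @ X - Y                               # residual
    grad = np.real(np.sum(np.conj(F @ X - X) * R)) / M   # d/dalpha of 0.5*||R||^2 / M
    alpha -= lr * grad
# alpha converges to alpha_true, since the loss is quadratic in alpha
```

The loss is exactly quadratic in the mixing weight, so plain gradient descent recovers the hidden mixture; with more transforms, the same idea becomes a small vector of learnable weights.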