Quantum computers are expected to become the next generation of computers across many fields. Unlike classical computers, which store information as binary digits 0 and 1, quantum computers exploit quantum superposition: a single qubit can hold the probabilities of 0 and 1 as continuous quantities. Consequently, existing algorithms are not necessarily efficient when executed on quantum computers. GPU-based quantum simulators, such as cuQuantum, have recently been released to support the development of quantum algorithms. This paper focuses on quantum image processing, examines how far quantum image processing can be efficiently described today, and verifies on GPUs that multiple image-processing operations can be described using cuQuantum.
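The notion of a qubit holding the probabilities of 0 and 1 as continuous quantities can be made concrete with a minimal NumPy sketch (plain NumPy, not cuQuantum itself; the amplitude values are hypothetical, chosen only for illustration):

```python
import numpy as np

# A single qubit is a normalized complex vector a|0> + b|1>,
# with |a|^2 + |b|^2 = 1. Measurement yields 0 or 1 with
# continuous probabilities |a|^2 and |b|^2 (Born rule),
# rather than a fixed classical bit.
state = np.array([np.sqrt(0.25), np.sqrt(0.75)], dtype=complex)

probs = np.abs(state) ** 2           # measurement probabilities
assert np.isclose(probs.sum(), 1.0)  # normalization check
print(probs)                         # probabilities 0.25 and 0.75
```

A GPU simulator such as cuQuantum scales this same state-vector representation to many qubits, where the vector has 2^n complex amplitudes.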