Solving linear systems of equations is essential for many problems in science and technology, including problems in machine learning. Existing quantum algorithms have demonstrated the potential for large speedups, but the required quantum resources are not immediately available on near-term quantum devices. In this work, we study near-term quantum algorithms for linear systems of equations, with a focus on the two-norm and Tikhonov regression settings. We investigate the use of variational algorithms and analyze their optimization landscapes. There exist types of linear systems for which variational algorithms designed to avoid barren plateaus, such as properly initialized imaginary time evolution and adiabatic-inspired optimization, suffer from a different plateau problem. To circumvent this issue, we design near-term algorithms based on a core idea: the classical combination of variational quantum states (CQS). We exhibit several provable guarantees for these algorithms, supported by the representation of the linear system on a so-called ansatz tree. The CQS approach and the ansatz tree also admit the systematic application of heuristic approaches, including a gradient-based search. We have conducted numerical experiments solving linear systems as large as 2^300 x 2^300 by considering cases where we can simulate the quantum algorithm efficiently on a classical computer. Our methods may provide benefits for solving linear systems within the reach of near-term quantum devices.
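The core of the CQS idea can be illustrated classically: approximating the solution of A x = b as a combination of a few ansatz states reduces the optimization to a small least-squares problem over the combination coefficients. The sketch below is our own minimal construction (random vectors stand in for circuit-prepared states, and all variable names are ours); on a quantum device the small matrices would instead be estimated from Hadamard-test measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 64, 6                      # system size, number of ansatz states

A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
b /= np.linalg.norm(b)

# Ansatz states: random normalized vectors stand in for circuit outputs.
V = rng.standard_normal((n, m))
V /= np.linalg.norm(V, axis=0)

# Minimizing ||A V alpha - b||^2 over alpha is only an m x m problem.
AV = A @ V
M = AV.conj().T @ AV              # Gram matrix of <x_i| A^dag A |x_j>
q = AV.conj().T @ b               # overlaps <x_i| A^dag |b>
alpha = np.linalg.solve(M, q)     # optimal combination coefficients

x = V @ alpha
residual = np.linalg.norm(A @ x - b)
```

The residual can never exceed ||b|| (alpha = 0 is always feasible), and it shrinks as the ansatz states are grown, e.g. along the ansatz tree described above.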
MaxCut is an NP-hard combinatorial optimization problem in graph theory. Quantum approximate optimization algorithms (QAOAs) offer new methods for solving such problems, which are potentially better than classical schemes. However, the requirement of N qubits for solving N-vertex graphs in QAOA, coupled with the large-N trainability issue due to barren plateaus, poses a substantial challenge for noisy intermediate-scale quantum (NISQ) devices. Recently, a hybrid quantum-classical algorithm has been proposed for solving semidefinite programs (SDPs), named the NISQ SDP Solver (NSS). In this paper, we study the performance of the NSS for solving the MaxCut problem via the introduction of a near-term quantum algorithm and execution of experimental simulations. Since the MaxCut problem admits a relaxation into an SDP formulation, the SDP can be solved using the NSS, with a subsequent classical post-processing step converting the hybrid density matrix into a MaxCut solution. After performing these steps, the near-term quantum algorithm also inherits the advantages of the NSS. We analyze the algorithm as compared to QAOAs and find that the depth of the quantum circuit is independent of the number of edges in the graph. Our algorithm requires log N qubits and has O(1) circuit depth. This implies that it uses exponentially fewer qubits compared to QAOAs while also requiring a significantly reduced circuit depth to solve the MaxCut problem. To demonstrate the effectiveness of the NSS, we focused on 3-regular graphs, 9-regular graphs, and ER graphs. Numerical results indicate that the quantum algorithm achieves an approximation ratio comparable to classical methods by solving the original high-dimensional problem within a lower-dimensional ansatz space. Analysis under various initial states shows that the algorithm exhibits remarkably stable performance.
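The pipeline the abstract describes (SDP relaxation, then classical post-processing into a cut) can be sketched classically. The snippet below is our own illustration, not the NSS itself: a crude feasible point of the MaxCut SDP (a PSD matrix with unit diagonal, built from unit vectors) is rounded to a cut by the standard random-hyperplane step; the NSS would replace the solver stage with its hybrid quantum-classical routine.

```python
import numpy as np

rng = np.random.default_rng(1)

# A small graph given by its edge list (a 4-cycle).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]

# Stand-in for the SDP solution: unit vectors u_i give X = U U^T,
# which is PSD with unit diagonal. A real solver would choose U to
# maximize sum over edges of (1 - X_ij) / 2.
U = rng.standard_normal((n, 3))
U /= np.linalg.norm(U, axis=1, keepdims=True)
X = U @ U.T

# Random-hyperplane rounding: sign of the projection assigns each
# vertex to one side of the cut.
r = rng.standard_normal(3)
assignment = np.sign(U @ r)

cut = sum(assignment[i] != assignment[j] for i, j in edges)
```

With an actual SDP optimum, this rounding carries the familiar 0.878 approximation guarantee; here it only demonstrates the data flow from matrix solution to cut.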
ISBN (Print): 9798331541378
There is an evident need to develop fast and accurate methods for materials and chemical simulations, as ab initio quantum chemistry methods are not scalable. On the one hand, the machine learning (ML) literature offers an appealing approach in which ML force fields trained on a small number of ab initio computations can accurately predict properties of new structures at a reduced computational cost. On the other hand, quantum computing algorithms are being developed with the potential to reduce the scaling of classical computing methods for theoretical chemistry. In this work, we bring together quantum computing and the MACE ML framework to introduce the QUACE hybrid algorithm. In the classical MACE architecture, the central operation is a tensor contraction that requires the manipulation of high-dimensional data and is the bottleneck step of the method. This tensor operation is well suited to be performed on a quantum circuit. We implement a quantum algorithm that performs tensor contraction, thus reducing the load of classical processing and enabling improved scaling of the overall algorithm with system size. Although present NISQ quantum devices are not powerful enough to outperform classical methods, this work aims to demonstrate the potential of quantum computing for tensor contractions and its application to molecular simulations. QUACE could be a practical method for running molecular dynamics on near-term noisy quantum devices, where quantum noise would be harnessed as a source of stochasticity in the dynamics.
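As a point of reference for the bottleneck being offloaded, the kind of multi-index tensor contraction at the heart of such architectures looks as follows classically. The shapes and index names below are illustrative choices of ours, not MACE's actual layout; QUACE's proposal is to perform an operation of this form on a quantum circuit.

```python
import numpy as np

rng = np.random.default_rng(2)

# Node features A[i, k1, k2] and coupling weights W[k1, k2, k3]:
A = rng.standard_normal((8, 4, 4))
W = rng.standard_normal((4, 4, 4))

# Contract the shared indices k1, k2 in a single einsum call;
# cost grows with the product of all index ranges involved.
B = np.einsum('ijk,jkl->il', A, W)   # result B[i, k3]
```

Because the contraction cost multiplies across index ranges, it dominates the runtime as feature dimensions grow, which is what motivates moving it off the classical processor.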
We introduce a family of variational quantum algorithms, which we coin quantum iterative power algorithms (QIPAs), and demonstrate their capabilities as applied to global-optimization numerical experiments. Specifically, we demonstrate the QIPA based on a double exponential oracle as applied to ground-state optimization of the H2 molecule, search for the transmon qubit ground state, and biprime factorization. Our results indicate that QIPA outperforms quantum imaginary time evolution (QITE) and requires a polynomial number of queries to reach convergence even with exponentially small overlap between an initial quantum state and the final desired quantum state, under some circumstances. We analytically show that there exists an exponential amplitude amplification at every step of the variational quantum algorithm, provided the initial wavefunction has non-vanishing probability with the desired state and that the unique maximum of the oracle is given by lambda_1 > 0, while all other oracle values share a common smaller value. These results support QIPAs as promising near-term quantum computing algorithms. Such approaches could facilitate identification of reaction pathways and transition states in chemical physics, as well as optimization in a broad range of machine learning applications. The method also provides a general framework for adaptation of a class of classical optimization algorithms to quantum computers to further broaden the range of algorithms amenable to implementation on current noisy intermediate-scale quantum computers.
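The amplification mechanism described above has a simple classical toy analog (our own construction, not the paper's quantum implementation): repeatedly applying a double exponential oracle f(H) = exp(exp(-H)) to a state amplifies the component with the lowest energy fastest, since that component receives the largest oracle value at every iteration.

```python
import numpy as np

rng = np.random.default_rng(3)

energies = np.array([0.0, 1.0, 1.5, 2.0])  # eigenvalues of a diagonal toy H
psi = np.ones(4) / 2.0                     # uniform start, nonzero ground overlap

oracle = np.exp(np.exp(-energies))         # double exponential oracle values
for _ in range(10):
    psi = oracle * psi                     # apply f(H) in the eigenbasis
    psi /= np.linalg.norm(psi)             # renormalize, as in a QITE-like step
```

After a handful of iterations essentially all of the weight sits on the ground-state component; the per-step amplification ratio is governed by the gap between the largest oracle value and the rest, matching the role of lambda_1 in the analysis.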
In order to support near-term applications of quantum computing, a new compute paradigm has emerged, the quantum-classical cloud, in which quantum computers (QPUs) work in tandem with classical computers (CPUs) via a shared cloud infrastructure. In this work, we enumerate the architectural requirements of a quantum-classical cloud platform and present a framework for benchmarking its runtime performance. In addition, we walk through two platform-level enhancements, parametric compilation and active qubit reset, that specifically optimize a quantum-classical architecture to support variational hybrid algorithms, the most promising applications of near-term quantum hardware. Finally, we show that integrating these two features into the Rigetti Quantum Cloud Services platform results in considerable improvements to the latencies that govern algorithm runtime.
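The benefit of parametric compilation comes from moving the expensive translation step out of the optimization loop. The toy sketch below is ours (the real feature lives in the platform's compiler, not in user code): the circuit template is "compiled" once into a closure, and each optimizer iteration only re-binds numeric parameters.

```python
import math

def compile_template(gates):
    """Expensive step, done once per circuit shape.

    Returns a closure that binds a numeric value to every
    placeholder parameter named 'theta' in the gate list.
    """
    def bound(theta):
        # Cheap per-iteration step: substitute the parameter value.
        return [(name, theta if arg == 'theta' else arg)
                for name, arg in gates]
    return bound

template = compile_template([('RX', 'theta'), ('CZ', (0, 1)), ('RX', 'theta')])

# The variational loop re-binds instead of recompiling:
programs = [template(step * math.pi / 4) for step in range(3)]
```

In a variational algorithm with thousands of iterations, amortizing compilation this way removes a fixed per-iteration latency, which is the effect the abstract reports for the platform.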