
A wheel-shaped muon detector that is part of an ATLAS particle detector upgrade at CERN. A new study applies “unfolding,” an error-correction technique used for particle detectors, to the problem of noise in quantum computing. Credit: Julien Marius Ordan/CERN
‘Unfolding’ techniques used to improve the accuracy of particle detector data can also improve the readout of quantum states from a quantum computer.
Borrowing a page from high-energy physics and astronomy textbooks, a team of physicists and computer scientists at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has successfully adapted and applied a common error-reduction technique to the field of quantum computing.
In the world of subatomic particles and giant particle detectors, and distant galaxies and giant telescopes, scientists have learned to live, and to work, with uncertainty. They are often trying to tease out ultra-rare particle interactions from a massive tangle of other particle interactions and background “noise” that can complicate their hunt, or trying to filter out the effects of atmospheric distortions and interstellar dust to improve the resolution of astronomical imaging.
Also, inherent problems with detectors, such as their ability to record all particle interactions or to exactly measure particles’ energies, can result in data getting misread by the electronics they are connected to, so scientists need to design complex filters, in the form of computer algorithms, to reduce the margin of error and return the most accurate results.
The problems of noise and physical defects, and the need for error-correction and error-mitigation algorithms, which reduce the frequency and severity of errors, are also common in the fledgling field of quantum computing, and a study published in the journal npj Quantum Information found that there appear to be some common solutions, too.
Ben Nachman, a Berkeley Lab physicist who is involved with particle physics experiments at CERN as a member of Berkeley Lab’s ATLAS group, saw the quantum-computing connection while working on a particle physics calculation with Christian Bauer, a Berkeley Lab theoretical physicist who is a co-author of the study. ATLAS is one of the four giant particle detectors at CERN’s Large Hadron Collider, the largest and most powerful particle collider in the world.
“At ATLAS, we often have to ‘unfold,’ or correct for detector effects,” said Nachman, the study’s lead author. “People have been developing this technique for years.”
In experiments at the LHC, particles called protons collide at a rate of about 1 billion times per second. To deal with this extremely busy, “noisy” environment and intrinsic problems related to the energy resolution and other factors associated with detectors, physicists use error-correcting “unfolding” techniques and other filters to winnow down this particle jumble to the most useful, accurate data.
“We noticed that current quantum computers are very noisy, too,” Nachman said, so finding a way to reduce this noise and minimize errors – error mitigation – is key to advancing quantum computing. “One kind of error is related to the actual operations you do, and one relates to reading out the state of the quantum computer,” he noted – the first kind is known as a gate error, and the latter is called a readout error.

These charts show the connection between sorted high-energy physics measurements related to particle scattering – known as differential cross-section measurements (left) – and repeated measurements of outputs from quantum computers (right). These similarities provide an opportunity to apply similar error-mitigation techniques to data from both fields. Credit: Berkeley Lab; npj Quantum Inf 6, 84 (2020), DOI: 10.1038/s41534-020-00309-7
The latest study focuses on a technique to reduce readout errors, called “iterative Bayesian unfolding” (IBU), which is familiar to the high-energy physics community. The study compares the effectiveness of this approach to other error-correction and mitigation techniques. The IBU method is based on Bayes’ theorem, which provides a mathematical way to find the probability of an event occurring when there are other conditions related to this event that are already known.
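To make the idea concrete, here is a minimal NumPy sketch of the IBU update, written for this article rather than taken from the paper. Given a response matrix whose entry (j, i) is the probability of measuring bit string j when the true state is i, Bayes’ theorem turns a prior guess at the true distribution into a posterior, and the procedure repeats with that posterior as the new prior:

```python
import numpy as np

def iterative_bayesian_unfolding(measured, response, n_iter=10):
    """Illustrative sketch of IBU readout-error mitigation (not the paper's code).

    measured : observed counts per bit string, shape (n,)
    response : response[j, i] = probability of measuring string j
               when the true state is i (columns sum to 1)
    n_iter   : number of Bayesian updates; acts as a regularizer
    """
    n = len(measured)
    truth = np.full(n, measured.sum() / n)  # start from a flat prior
    for _ in range(n_iter):
        # Expected measured counts under the current guess of the truth.
        predicted = response @ truth + 1e-12
        # Bayes' theorem: posterior[j, i] = P(true state i | measured j).
        posterior = (response * truth) / predicted[:, np.newaxis]
        # Redistribute the observed counts back to the true states.
        truth = posterior.T @ measured
    return truth
```

The flat starting prior and the fixed iteration count are illustrative choices; in practice, stopping after a modest number of iterations regularizes the estimate so that statistical noise is not amplified.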
Nachman noted that this technique can be applied to the quantum analog of classical computers, known as universal gate-based quantum computers.
In quantum computing, which relies on quantum bits, or qubits, to carry information, the fragile state known as quantum superposition is difficult to maintain and can decay over time, causing a qubit to display a zero instead of a one – this is a common example of a readout error.
Superposition means that a quantum bit can represent a zero, a one, or both quantities at the same time. This enables unique computing capabilities not possible in conventional computing, which relies on bits representing either a one or a zero, but not both at once. Another source of readout error in quantum computers is simply a faulty measurement of a qubit’s state due to the architecture of the computer.
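Framed this way, readout noise plays the same role as detector smearing in a physics measurement: a response matrix scrambles the true counts into the observed ones. A toy single-qubit illustration, with made-up error rates:

```python
import numpy as np

# Hypothetical single-qubit readout error rates (illustrative values only).
p10 = 0.08  # probability a true |1> is read out as 0 (e.g., decay during readout)
p01 = 0.02  # probability a true |0> is read out as 1

# Response matrix R[measured, true].
R = np.array([[1.0 - p01, p10],
              [p01,       1.0 - p10]])

true_counts = np.array([0.0, 1000.0])  # 1,000 shots, qubit always prepared in |1>
measured = R @ true_counts
print(measured)  # ~[80, 920]: about 8% of the ones are misread as zeros
```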
In the study, researchers simulated a quantum computer to compare the performance of three different error-correction (or error-mitigation or unfolding) techniques. They found that the IBU method is more robust in a very noisy, error-prone environment, and slightly outperformed the other two in the presence of more common noise patterns. Its performance was compared to an error-correction method called Ignis that is part of a collection of open-source quantum-computing software development tools built for IBM’s quantum computers, and a very basic form of unfolding known as the matrix inversion method.
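The matrix inversion baseline simply applies the inverse of the response matrix to the measured counts. A one-line sketch, reusing R and measured from the toy example above:

```python
# Matrix inversion unfolding: solve R @ truth = measured directly.
# Exact when R is well known, but it can amplify statistical fluctuations
# and even return negative "counts" on very noisy data, which is one reason
# the iterative Bayesian approach can be more robust.
truth_estimate = np.linalg.solve(R, measured)
print(truth_estimate)  # recovers ~[0, 1000] in this noiseless toy example
```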
The researchers used the simulated quantum-computing environment to produce more than 1,000 pseudo-experiments, and they found that the results for the IBU method were the closest to predictions. The noise models used for this analysis were measured on a 20-qubit quantum computer called IBM Q Johannesburg.
“We took a very common technique from high-energy physics, and applied it to quantum computing, and it worked really well – as it should,” Nachman said. There was a steep learning curve. “I had to learn all sorts of things about quantum computing to be sure I knew how to translate this and to implement it on a quantum computer.”
He said he was also very fortunate to find collaborators for the study with expertise in quantum computing at Berkeley Lab, including Bert de Jong, who leads a DOE Office of Advanced Scientific Computing Research Quantum Algorithms Team and an Accelerated Research for Quantum Computing project in Berkeley Lab’s Computational Research Division.
“It’s exciting to see how the wealth of knowledge the high-energy physics community has developed to get the most out of noisy experiments can be used to get more out of noisy quantum computers,” de Jong said.
The simulated and real quantum computers used in the study varied in size from five to 20 qubits, and the technique should be scalable to larger systems, Nachman said. But the error-correction and error-mitigation techniques that the researchers tested will require more computing resources as the size of quantum computers increases, so Nachman said the team is focused on how to make the methods more manageable for quantum computers with larger qubit arrays.
Nachman, Bauer, and de Jong also participated in an earlier study that proposes a way to reduce gate errors, which are the other major source of quantum-computing errors. They believe that error correction and error mitigation in quantum computing may ultimately require a mix-and-match approach – using a combination of several techniques.
“It’s an exciting time,” Nachman said, as the field of quantum computing is still young and there is plenty of room for innovation. “People have at least gotten the message about these types of approaches, and there is still room for progress.” He noted that quantum computing provided a “push to think about problems in a new way,” adding, “It has opened up new science possibilities.”
Reference: “Unfolding quantum computer readout noise” by Benjamin Nachman, Miroslav Urbanek, Wibe A. de Jong and Christian W. Bauer, 25 September 2020, npj Quantum Information.
DOI: 10.1038/s41534-020-00309-7
The Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility at Oak Ridge National Laboratory, provided the researchers with access to quantum-computing resources at IBM, including the IBM Quantum Experience and Q Hub Network.
Miroslav Urbanek in Berkeley Lab’s Computational Research Division also participated in the study, which was supported by the U.S. DOE’s Office of Science and the Aspen Center for Physics.