Machine Learning Helps Tune, Characterize Quantum Dots Quickly
Using a machine-learning approach, scientists from the Universities of Oxford, Basel, and Lancaster are automating the process of characterizing and tuning individual semiconductor quantum dots (QDs) for use as qubits. The approach could cut both the number of measurements and the measurement time by roughly a factor of four compared with conventional data acquisition.
Semiconductor QDs are not identical and must be characterized individually. When several QDs are combined to scale a device up to a large number of qubits, this tuning process can become enormously time-consuming.
Artistic illustration of the potential landscape defined by voltages applied to nanostructures in order to trap single electrons in a QD. Applying voltages to the various nanostructures within the trap keeps the electrons under control and, among other things, allows scientists to set how many electrons enter a QD from a reservoir. For each QD, the applied voltages must be tuned carefully to achieve optimum conditions, as even small changes in voltage affect the electrons. Courtesy of Department of Physics, University of Basel.
First, the scientists trained the machine with data on the current flowing through the QD at different voltages. The algorithm selects the most informative measurements to perform next by combining information theory with a probabilistic deep-generative model that can generate full-resolution reconstructions from scattered partial measurements, similar to facial recognition technology. The system then performs these measurements and repeats the process until effective characterization is achieved according to predefined criteria and the QD can be used as a qubit.
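The measure-reconstruct-select loop described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the toy "current map," the smoothing-based ensemble standing in for the probabilistic deep-generative model, and the variance-based acquisition and stopping criteria are all assumptions made for the example.

```python
# Sketch of an informative-measurement loop for mapping current through a QD.
# All function names and the toy generative model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def true_current_map(n=32):
    """Toy stand-in for the current through a QD over an n x n voltage grid."""
    v = np.linspace(-1, 1, n)
    V1, V2 = np.meshgrid(v, v)
    return np.exp(-((V1 + V2) ** 2) * 8) * (1 + 0.3 * np.sin(6 * V1))

def reconstruct_ensemble(values, mask, n_samples=16, iters=50):
    """Crude ensemble reconstruction: diffuse measured values into the
    unmeasured region from different random initializations. The spread
    of the ensemble plays the role of the model's uncertainty."""
    recons = []
    for _ in range(n_samples):
        est = rng.normal(0.5, 0.2, size=mask.shape)
        est[mask] = values[mask]
        for _ in range(iters):
            # Simple neighbor-average smoothing, then re-impose the data.
            est = 0.25 * (np.roll(est, 1, 0) + np.roll(est, -1, 0)
                          + np.roll(est, 1, 1) + np.roll(est, -1, 1))
            est[mask] = values[mask]
        recons.append(est)
    return np.stack(recons)

def acquisition_loop(budget=200, n=32):
    truth = true_current_map(n)
    mask = np.zeros((n, n), dtype=bool)
    values = np.zeros((n, n))
    # Start from a coarse set of seed measurements.
    seeds = rng.choice(n * n, size=20, replace=False)
    mask.flat[seeds] = True
    values[mask] = truth[mask]
    for _ in range(budget):
        recons = reconstruct_ensemble(values, mask)
        uncertainty = recons.std(axis=0)
        uncertainty[mask] = -np.inf  # never re-measure a known point
        idx = np.unravel_index(np.argmax(uncertainty), uncertainty.shape)
        mask[idx] = True             # "perform" the most informative measurement
        values[idx] = truth[idx]
        if recons.std(axis=0)[~mask].mean() < 0.01:  # predefined stopping criterion
            break
    return values, mask

if __name__ == "__main__":
    values, mask = acquisition_loop()
    print(f"measured {mask.sum()} of {mask.size} pixels")
```

In this sketch, the ensemble variance serves as a proxy for the information gained by a candidate measurement; the published work instead uses a trained deep-generative model and an information-theoretic acquisition rule, but the overall loop structure is the same.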
For two different current map configurations, the researchers demonstrated that the algorithm could outperform standard grid scan techniques, reducing the number of measurements required by a factor of up to four and the measurement time by a factor of 3.7.
“For the first time, we’ve applied machine learning to perform efficient measurements in gallium arsenide quantum dots, thereby allowing for the characterization of large arrays of quantum devices,” said Natalia Ares, a professor at the University of Oxford.
“The next step at our laboratory is to apply the software to semiconductor quantum dots made of other materials that are better suited to the development of a quantum computer,” said Dominik Zumbühl, a professor at the University of Basel.
The work by this team could open the way for learning-based automated measurement of quantum devices and ultimately support the building of large-scale qubit architectures.
The research was published in npj Quantum Information (https://doi.org/10.1038/s41534-019-0193-4).
Published: September 2019