Entanglement unlocks scaling for quantum machine learning


LOS ALAMOS, NM — The field of machine learning on quantum computers has been boosted by new research removing a potential barrier to the practical implementation of quantum neural networks. While theorists had previously believed that an exponentially large training set would be needed to train a quantum neural network, the quantum No-Free-Lunch theorem developed by Los Alamos National Laboratory shows that quantum entanglement eliminates this exponential overhead.

“Our work proves that both big data and big entanglement are valuable in quantum machine learning. Even better, entanglement leads to scalability, which removes the roadblock of exponentially increasing the size of the data in order to learn it,” said Andrew Sornborger, a computer scientist at Los Alamos and co-author of the paper published in Physical Review Letters. “The theorem gives us hope that quantum neural networks are on track to the goal of quantum speedup, where they will eventually outperform their counterparts on classical computers.”

The classical No-Free-Lunch theorem states that any machine learning algorithm is as good as, but no better than, any other when its performance is averaged over all possible functions relating data to their labels. A direct consequence of this theorem that showcases the power of data in classical machine learning is that the more data one has, the better the average performance. Thus, data is the currency of machine learning that ultimately limits performance.
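To make the averaging argument concrete, here is a small illustrative sketch (not taken from the paper or this press release): it enumerates every possible labeling of a toy four-point domain and shows that two very different learners end up with the same average error on the points they never saw during training.

```python
# Toy illustration of the classical No-Free-Lunch theorem: averaged over ALL
# Boolean labelings of a small input domain, any two learners have identical
# off-training-set error, no matter how differently they guess.
import itertools

DOMAIN = list(range(4))          # four possible inputs
TRAIN = [0, 1]                   # inputs whose labels the learner gets to see
TEST = [x for x in DOMAIN if x not in TRAIN]

def learner_constant_zero(train_labels, x):
    # Ignores the training data on unseen points and always predicts 0.
    return 0

def learner_majority(train_labels, x):
    # Predicts the majority label seen during training (ties go to 1).
    return 1 if sum(train_labels.values()) * 2 >= len(train_labels) else 0

def avg_test_error(learner):
    errors = []
    # Enumerate every possible labeling function f: DOMAIN -> {0, 1}.
    for labels in itertools.product([0, 1], repeat=len(DOMAIN)):
        f = dict(zip(DOMAIN, labels))
        train_labels = {x: f[x] for x in TRAIN}
        wrong = sum(learner(train_labels, x) != f[x] for x in TEST)
        errors.append(wrong / len(TEST))
    return sum(errors) / len(errors)

print(avg_test_error(learner_constant_zero))  # 0.5
print(avg_test_error(learner_majority))       # 0.5
```

Both printed averages come out to 0.5: once performance is averaged over every possible labeling, no learner beats another, which is exactly why more data is the only way to improve average performance in the classical setting.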

Los Alamos’ new No-Free-Lunch Theorem shows that in the quantum regime, entanglement is also a currency, and one that can be exchanged for data to reduce data requirements.

Using a Rigetti quantum computer, the team entangled the quantum dataset with a reference system to verify the new theorem.

“We demonstrated on quantum hardware that we could indeed violate the standard No-Free-Lunch theorem using entanglement, while our new formulation of the theorem stood up to experimental tests,” said Kunal Sharma, first author of the paper.

“Our theorem suggests that entanglement should be considered a valuable resource in quantum machine learning, alongside big data,” said Los Alamos physicist and lead author Patrick Coles. “Classical neural networks depend only on big data.”

Entanglement describes the state of a system of atomic-scale particles that cannot be fully described independently or separately. Entanglement is a key component of quantum computing.
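For a concrete picture, the standard textbook example (general quantum-information background, not specific to this work) is the two-qubit Bell state, which cannot be written as a product of two independent single-qubit states:

```latex
% A maximally entangled two-qubit (Bell) state: measuring either qubit alone
% gives a random outcome, yet the two outcomes are perfectly correlated.
\[
  \lvert \Phi^{+} \rangle
    = \frac{1}{\sqrt{2}} \bigl( \lvert 00 \rangle + \lvert 11 \rangle \bigr)
    \neq \lvert \psi \rangle_{A} \otimes \lvert \phi \rangle_{B}
  \quad \text{for any single-qubit states } \lvert \psi \rangle_{A},\ \lvert \phi \rangle_{B}.
\]
```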

– This press release was originally published on the Los Alamos National Laboratory website
