Solving ‘barren plateaus’ is the key to quantum machine learning
Many machine learning algorithms on quantum computers suffer from the dreaded "barren plateau" of unsolvability, where the optimization landscape becomes so flat that the algorithm's cost gradient vanishes and the training run hits a dead end. Until now, this challenge had been relatively unstudied. Rigorous theoretical work has now established theorems that guarantee whether a given quantum machine learning algorithm will remain trainable as it scales up to larger computers.
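To make the "vanishing gradient" picture concrete, the minimal sketch below (not from the article, and using an illustrative layered circuit of RY rotations and CZ gates) simulates random parametrized circuits in NumPy and estimates how the variance of a cost gradient shrinks as the qubit count grows. The exponential decay of that variance is the barren-plateau signature the new theorems characterize.

```python
# Minimal numerical sketch of a barren plateau (illustrative assumption:
# layered ansatz of RY rotations + CZ chain, cost = <Z> on the first qubit).
import numpy as np

rng = np.random.default_rng(0)

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def layer_unitary(thetas, n):
    """One layer: RY rotation on every qubit followed by a chain of CZ gates."""
    u = np.array([[1.0]])
    for q in range(n):
        u = np.kron(u, ry(thetas[q]))
    # CZ between neighbouring qubits (diagonal in the computational basis)
    dim = 2 ** n
    cz = np.ones(dim)
    for q in range(n - 1):
        for b in range(dim):
            if (b >> (n - 1 - q)) & 1 and (b >> (n - 2 - q)) & 1:
                cz[b] *= -1
    return cz[:, None] * u  # diag(cz) @ (tensor product of RY gates)

def cost(params, n, layers):
    """Expectation value of Z on the first qubit after the circuit."""
    state = np.zeros(2 ** n)
    state[0] = 1.0
    for l in range(layers):
        state = layer_unitary(params[l], n) @ state
    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2 ** (n - 1)))
    return float(state @ z0 @ state)

def grad_first_param(params, n, layers):
    """Parameter-shift gradient with respect to the very first angle."""
    shift = np.pi / 2
    plus, minus = params.copy(), params.copy()
    plus[0, 0] += shift
    minus[0, 0] -= shift
    return 0.5 * (cost(plus, n, layers) - cost(minus, n, layers))

for n in (2, 4, 6, 8):
    layers, samples = 2 * n, 200  # depth grows with width, as in typical ansatze
    grads = [grad_first_param(rng.uniform(0, 2 * np.pi, (layers, n)), n, layers)
             for _ in range(samples)]
    print(f"{n} qubits: Var[dC/dtheta] ~ {np.var(grads):.2e}")
```

Run as a plain script, it prints the sampled gradient variance for 2, 4, 6 and 8 qubits; the steady drop with qubit count is what makes training stall on a barren plateau.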