The post Quantum computers: are they here? appeared first on ANASTASIA MARCHENKOVA.
This morning, IBM announced that their 20-qubit processor is now available, with a 50-qubit prototype (a real device, not a simulator) that has been successfully measured and tested and will be available in their next-generation systems.
The 5-qubit system was impressive, as was the 16-qubit system. But one of the most impressive things has been the rate at which IBM has progressed.
The ability to reliably operate several working quantum systems and put them online was not possible just a few years ago. Now, we can scale IBM processors up to 50 qubits due to tremendous feats of science and engineering.
– Dario Gil, vice president of AI and IBM Q
But putting a quantum computer online made this not just a theoretical field anymore. Opening up such hardware to the public was a huge shift. Students could use a quantum computer in their education, which meant that not only were physicists familiar with the field using a quantum computer, but that IBM was gathering insights from specialists in chemistry, biology, and other diverse fields.
Thanks to this incredible resource that IBM offers, I have students run actual quantum algorithms on a real quantum computer as part of their assignments! This drives home the point that this is a real technology, not just a pipe dream. What once seemed like an impossible future is now something they can use from their dorm rooms. Now, our enrollments are skyrocketing, drawing excitement from top students from a very wide range of disciplines.
– Andrew Houck, professor of electrical engineering, Princeton University
Why is this all such a big deal? The 50-qubit threshold is where we might see “quantum supremacy”, where quantum computers outperform classical computers. Hitting the 50-qubit threshold is a game-changer.
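To make the threshold concrete, here is a rough back-of-the-envelope sketch (my own illustration, not a figure from IBM) of why roughly 50 qubits strains brute-force classical simulation: a simulator must store 2^n complex amplitudes for n qubits.

```python
# Memory needed to hold a full n-qubit statevector on a classical machine,
# assuming one 16-byte complex double per basis-state amplitude.
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")
# 30 qubits fits on a single workstation (16 GiB);
# 50 qubits needs ~16 million GiB, beyond any classical machine.
```

Every added qubit doubles the memory requirement, which is why progress from 20 to 50 qubits matters so much.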
See the press release here: http://www-03.ibm.com/press/us/en/pressrelease/53374.wss
The post Post-Quantum Cryptography at Google appeared first on ANASTASIA MARCHENKOVA.
It’s time to explore options for quantum-safe algorithms beyond theoretical implementations. Google has launched “CECPQ1”, a post-quantum key exchange algorithm layered on top of the standard ECC algorithm, live on Google Canary!
Research began with the “New Hope” algorithm developed by Alkim, Ducas, Pöppelmann, and Schwabe, building upon Microsoft’s work by Bos, Costello, Naehrig, and Stebila. Bos et al. implemented the lattice-based “Ring Learning With Errors” (RLWE) problem in the TLS protocol. Their thorough paper covers implementation and integration into TLS, as well as performance tests. The later “New Hope” paper focuses on optimizing and building upon that work, speeding up computation by 10x or more. All of this demonstrates that post-quantum cryptography is already practical.
Google’s experiment uses an LWE-based key exchange (LWE being a less specialized problem than RLWE). While LWE requires much larger key sizes, Google chose it based on potential security concerns about the extra structure in RLWE. Though this algorithm is slower than ECDH, and nearly an order of magnitude slower than RLWE (New Hope), it’s now officially in use!
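The hybrid idea can be sketched in a few lines. This is a toy illustration only (the function and stand-in secrets are mine, not Google’s code): both shared secrets feed into a single key derivation, so the session stays secure as long as either exchange remains unbroken.

```python
import hashlib

def hybrid_session_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    # Hash the concatenation of both shared secrets; an attacker must
    # break BOTH the classical and the post-quantum exchange to recover it.
    return hashlib.sha256(classical_secret + pq_secret).digest()

# Stand-in byte strings; in CECPQ1 the secrets come from the ECC
# exchange and from the post-quantum exchange respectively.
key = hybrid_session_key(b"classical-shared-secret", b"pq-shared-secret")
print(key.hex())  # 32-byte session key
```

This belt-and-suspenders construction is what makes deploying an experimental post-quantum algorithm safe: even if the new algorithm turns out to be weak, security falls back to the classical exchange.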
Google does not intend for this particular algorithm to be the end-all standard for post-quantum cryptography. Retiring RSA and ECC is necessary; however, the current candidates are not the only options for post-quantum cryptography. While there are conferences focused on the new cryptography standards and algorithm exploration, quantum computers, and even the quantum “mindset,” haven’t been around long enough for us to have thoroughly exhausted all potential attacks or algorithms.
Transitioning to new standards will be its own headache. With query complexity breakthroughs even in key quantum algorithms, these recommendations are just initial experiments. The key is to have easily upgradeable systems, without painful transition costs.
Here are additional results from the Google experiments, as of September 9, 2016: PDF
The post Quantum Technology News – Issue #11 appeared first on ANASTASIA MARCHENKOVA.





The post What’s the difference between quantum annealing and universal gate quantum computers? appeared first on ANASTASIA MARCHENKOVA.
D-Wave, the most famous quantum annealer, and universal gate quantum computers are not competitors. While they rely on the same underlying concepts, they are useful for different tasks and different sorts of problems, and they suffer from different challenges in design and manufacturing.
The D-Wave machine is a quantum annealer running adiabatic quantum computing algorithms. This is great for optimizing solutions to problems by quickly searching over a space and finding a minimum (or “solution”). The latest announcement from Google states that the D-Wave machine is more than 10⁸ times faster than simulated annealing running on a single core. However, Selby’s classical algorithm still performs better than the D-Wave quantum computer, so there’s a long way to go for D-Wave.
But quantum annealing works best on problems with many potential solutions, where a “good enough” solution or local minimum is acceptable, and that makes applications like faster flight possible. D-Wave could speed up research on better aerospace materials that shield against radiation or stand up to heat, or model the flow over a wing, which Airbus is counting on to speed R&D.
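For intuition, here is classical simulated annealing (the algorithm D-Wave is benchmarked against) on a hypothetical one-dimensional energy landscape of my own invention. Quantum annealing tackles the same “find a good minimum” task, but escapes local minima by quantum tunneling rather than thermal hops.

```python
import math
import random

def energy(x: float) -> float:
    # Toy landscape: a quadratic bowl riddled with sinusoidal local minima.
    return x * x + 3.0 * math.sin(5.0 * x) + 3.0

def simulated_anneal(steps: int = 20000) -> float:
    rng = random.Random(1)
    x = rng.uniform(-10.0, 10.0)
    for i in range(steps):
        t = 5.0 * (1.0 - i / steps) + 1e-9      # cool the system gradually
        candidate = x + rng.gauss(0.0, 0.5)
        delta = energy(candidate) - energy(x)
        # Always accept downhill moves; accept uphill with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = candidate
    return x

x = simulated_anneal()
print(f"settled at x = {x:.2f} with energy {energy(x):.2f}")
```

The anneal does not guarantee the global minimum, only a low-lying one, which is exactly the “good enough” behavior described above.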
However, quantum annealing will never be able to run Shor’s algorithm, which breaks common forms of modern cryptography used to protect our bank information, logins, and all web communication.
Universal gate quantum computing is much broader. A universal gate quantum computing system relies on building reliable qubits where basic quantum circuit operations, similar to the classical operations we all know, can be put together to create any sequence, running increasingly complex algorithms. Algorithms like Shor’s (to break RSA cryptography) and Grover’s (faster search) as well as the approximately 50 other quantum algorithms will also be able to run on a universal quantum computer.
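As a concrete (if tiny) sketch of the gate model, the pure-Python snippet below (my own illustration, not any vendor’s API) applies two basic gates, a Hadamard and then a CNOT, to a two-qubit statevector, turning |00⟩ into the entangled Bell state (|00⟩ + |11⟩)/√2:

```python
import math

def matvec(m, v):
    # Apply a gate (matrix) to a statevector.
    return [sum(m[i][j] * v[j] for j in range(len(v))) for i in range(len(m))]

s = math.sqrt(0.5)
# Hadamard on qubit 0, identity on qubit 1 (basis order |00>, |01>, |10>, |11>).
H0 = [[s, 0, s, 0],
      [0, s, 0, s],
      [s, 0, -s, 0],
      [0, s, 0, -s]]
# CNOT: qubit 0 controls, qubit 1 flips.
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

state = [1.0, 0.0, 0.0, 0.0]                # start in |00>
state = matvec(CNOT, matvec(H0, state))     # run the two-gate circuit
print([round(a, 3) for a in state])         # -> [0.707, 0.0, 0.0, 0.707]
```

Composing longer sequences of gates like these is exactly how algorithms such as Shor’s and Grover’s are expressed on a universal machine.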
This means that a universal quantum computer can be used for many more problems than a quantum annealer, but it comes with its own challenges and a different design. The quantum annealer, like D-Wave’s, is becoming a great standard for proof of concept, but designing universal quantum computing chips for various applications, and making sure that qubits are properly manufactured, will be the tipping point for the quantum computing industry.
It all comes down to designing the chips. For universal gate quantum computing, the challenge is being able to design, build, and test chips efficiently in order to improve coherence (the length of time information is stored and can be manipulated) and qubit reliability.
The all-silicon chip breakthroughs mean that standard microfabrication facilities can be used to create quantum processor units, which leads toward cheaper and less specialized facilities for qubit manufacturing. Anyone* will be able to design a universal gate quantum computing chip, specialized for their purposes.
The post Quantum Technology News – Issue #10 appeared first on ANASTASIA MARCHENKOVA.
Therefore, applications of all aspects of quantum communication and quantum computing have jumped, with many companies setting up research facilities and think tanks within industry and academia to explore the applications quantum computing may have for their industry, like Airbus, below.
Additionally, as a parallel to the race for quantum computing, we are seeing niches related to quantum technology beginning commercialization. China has not been a front-runner in the quantum computing race (the US and Australia are the big players), but it has been investing resources in all aspects of quantum communications.





The post Top 3 Quantum Myths and Misconceptions appeared first on ANASTASIA MARCHENKOVA.
No. That’s not how that works. Where did this explanation come from?
More than fifty quantum algorithms have been discovered. Each quantum algorithm works differently, but none of them work by checking all possibilities at once. If that were true, all quantum algorithms would be the same, and quantum computers could speed up any problem, which is not the case.
The most common time you will hear this myth is when breaking RSA encryption on a quantum computer is discussed. This is how it really works:
Shor’s algorithm, run on a quantum computer, doesn’t try each prime factor directly. There is one quantum step that makes finding the prime factors much easier: it concentrates on finding the period of a function which contains the RSA key, and then the greatest common divisor is computed classically. (Interested in reading more? Here’s an article I wrote about how Shor’s algorithm is run on a quantum computer, in case you have one in your back pocket.)
This is why doubling the size of the RSA key will not make cracking RSA encryption that much more difficult for a quantum computer to solve. It doesn’t need to test more primes, as it would if the algorithm was just testing more possibilities — it uses the same quantum Fourier transform to find the period of the function containing the RSA key.
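The classical bookkeeping around that one quantum step is simple enough to sketch. In the toy code below (my own illustration), the period finding is done by brute force purely for demonstration; that is the step a quantum computer accelerates with the quantum Fourier transform. Everything else is the classical post-processing described above:

```python
import math

def find_period(a: int, N: int) -> int:
    # Period r of f(x) = a^x mod N; brute force stands in for the
    # quantum Fourier transform step. Assumes gcd(a, N) == 1.
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_period(a: int, N: int):
    r = find_period(a, N)
    if r % 2 == 1:
        return None                     # odd period: retry with another a
    y = pow(a, r // 2, N)               # a^(r/2) mod N
    p = math.gcd(y - 1, N)              # classical gcd recovers a factor
    if 1 < p < N:
        return p, N // p
    return None

print(factor_from_period(7, 15))        # -> (3, 5)
```

Note that nothing here enumerates prime candidates: once the period is known, the factors fall out of a single gcd computation, which is why larger keys don’t mean “more guesses.”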
The general scientific consensus is that no, faster-than-light communication is not possible. This myth came about through a simple but misleading thought experiment explaining entanglement:
You have two lights. One can flash red, the other can flash blue. You put one in a box, and the other in another box. You send ONE of the boxes to Pluto. When you open your box on Earth and trigger the light and see that the light is flashing red, you KNOW that the box on Pluto has the blue light.
This seems to imply “instantaneous” and “faster than the speed of light” communication.
Here’s the issue: no information is actually passed between the lights. The lights are ‘collapsed’ before the ‘measurement’ of opening the box. We were meddling in the initial state preparation by ‘knowing’ that one light would be red and the other blue, collapsing them before they were sent. We interfered in the original creation of the ‘entanglement’ by declaring that one light should be red and the other blue.
We can apply this same metaphor to real qubits. Measuring one qubit and immediately knowing the state of the other implies we knew all along how they were entangled, just as we knew how the lights were entangled.
In real quantum entanglement, the qubits exist in a superposition of 0 and 1, and only upon measurement does each qubit collapse into a definite 0 or 1. They remain in that superposition until they are measured, even when far apart. At first glance, this does seem to be information transfer.
However, if we don’t mess with the initial state preparation, we will receive the measurement information instantaneously, but we won’t be able to interpret it. Are the qubits going to correlate and be both polarized up, both polarized down, or opposite polarizations (assuming our qubits here are photons)? You don’t know what the information means without the classical channel — which is limited by the speed of light.
Now you see the fallacy in the light experiment, where we know how the lights will be entangled beforehand. Since we interfered in the initial state of the lights, we get misled into believing there is faster than the speed of light information transfer, when actually we knew the entanglement all along.
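The point about needing a classical channel can be made concrete with a toy simulation (my own illustration; nothing here models real quantum mechanics). Each side’s record looks like pure noise on its own; the perfect correlation only appears when the two records are compared, and that comparison requires a classical channel limited by the speed of light:

```python
import random

def measure_bell_pairs(n: int, seed: int = 0):
    # Toy stand-in for measuring n entangled pairs: each outcome is a
    # fair coin flip, but the two sides' outcomes are perfectly correlated.
    rng = random.Random(seed)
    alice = [rng.randint(0, 1) for _ in range(n)]
    bob = list(alice)        # correlated outcomes; no message is sent
    return alice, bob

alice, bob = measure_bell_pairs(10)
print(alice)                 # random bits: useless to Alice by themselves
print(alice == bob)          # -> True, but only visible after comparing
```

Alice cannot encode a message in her list, because she doesn’t choose the outcomes; she can only discover the correlation after the classical comparison.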
Read more at these Wiki articles: EPR Paradox and No-communication theorem, and try not to be overwhelmed!
How do you define quantum computer? This is an interesting question because the race is on to build the first “quantum computer”. What kind of quantum computer? How many qubits? What application? The reliability?
D-Wave is a quantum annealer, which is very different from a universal quantum computer. It’s great for optimizing solutions to problems by quickly searching over a space and finding a minimum (or “solution”) by running adiabatic quantum algorithms. However, quantum annealing will never be able to run Shor’s algorithm, which breaks common forms of modern cryptography, like RSA and ECC.
A universal quantum computer is meant to be general purpose, where basic quantum circuit operations, similar to the classical circuit operations like AND and OR, can be put together to create any sequence. This includes Shor’s algorithm and Grover’s algorithm — and can be used for a larger set of problems than a quantum annealer.
Which type of quantum computer is a true quantum computer?
Which algorithm will convince us of quantum speedup? Shor’s seems to be ‘Schrodinger’s Killer App’ to ‘prove’ quantum computers are real. However, there is a minimum number of qubits required just to run Shor’s algorithm, and a thousand times more are required to show a speedup.
How large a prime number must it break to be the ‘first’ quantum computer? In what amount of time?
What if you don’t care about factorization at all, and care about optimization?
I think we will have many ‘first’ quantum computers.
The post Why the NSA moving away from Suite B cryptography due to quantum computers makes total sense appeared first on ANASTASIA MARCHENKOVA.
One of the comments I most often hear is “Well, Snowden released documents in 2013 showing that the NSA has not had much progress on their quantum computer”, used as a justification why we shouldn’t worry about quantum computing now.
While this statement about the Snowden files is true, the last 2 years have been a storm of real, practical results, as well as funding poured into both companies and academic research in quantum computing. We know the tipping point of quantum computing research happened after the Snowden files were released.
Publicly driving the battle for universal quantum computing are Google and IBM.
IBM has had a quantum computing research group for over 20 years at the Watson center in New York, and works on theoretical work as well as practical results in all aspects of quantum computing. In April 2014, IBM announced a critical milestone with their 4-qubit chip — detecting both types of possible errors at the same time:
And IARPA, in December 2015, infused IBM with additional funding through the LogiQ program:
Google hired John Martinis and his research group in late 2014, focusing entirely on building a scalable, fault-tolerant quantum computing chip. In March 2015, they announced their success in not only building 9 qubits but also error-correcting them: http://www.nature.com/nature/journal/v519/n7541/full/nature14270.html
We are seeing Moore’s law for quantum computing, but even faster.
Both companies have stated that 10 years is a reasonable timeline for functional quantum computing.
While Google and IBM have been very public with their plans, plenty of other companies are involved in various aspects of quantum computing:
Intel announced $50 million in funding for quantum computing development:
Alibaba plans to get 30 qubits working by 2020:
Australian researchers released results demonstrating the first quantum logic gate in silicon
Northrop Grumman has an internal team working on quantum computing.
Lockheed houses a D-Wave computer and, in March 2014, partnered with University of Maryland quantum computing professors to work on non-silicon-based quantum computers.
Microsoft has a quantum computing group, StationQ. They are working on a different approach, dealing with the software side of quantum computing and taking a “full-stack” approach. Recently, they released the LIQUi|> platform, the culmination of 3 years of hard work by the team. Right now, this platform simulates up to 30 qubits, but the approach could allow Microsoft to plug into quantum hardware and run real qubits.
Various companies are popping up in all aspects of quantum technology, from quantum key distribution (ID Quantique, MagiQ, etc.) to companies working on simulating qubits (Anyon Systems).
While D-Wave is not a universal quantum computer and can only be used for a small class of problems, D-Wave was one of the first companies to bring widespread interest to the field. D-Wave has sold 3 quantum computers so far: to Google, Lockheed, and Los Alamos National Lab (and the NSA?).
In December 2015, Google released a “watershed” quantum computing announcement about their 100-million-fold improvement on a specialized problem engineered to show the power of D-Wave. (Read the articles below, and the independent analysis of the speedup here by Dr. Scott Aaronson.)
However, the D-Wave machine cannot use Shor’s algorithm to break cryptography. Still, they were the first player in the commercial quantum computing field, and they accelerated interest in quantum computing applications.
In light of all these advances, the NSA preparing the move to post-quantum cryptography makes sense. And with all this quantum computing talk, the research into post-quantum cryptography, as well as quantum cryptography, has accelerated:
SECOQC has been active since 2004 (http://www.secoqc.net/), with 11 million euros invested in developing secure quantum communication protocols, and in late 2014 GCHQ announced funding for post-quantum cryptography research.
Right now, there is no standard for encryption against a quantum computing attack, but there are known quantum-resistant algorithms. We might be late on the traditional timeline for establishing quantum security standards, but that just means we need to start now, and the NSA moving forward on this makes complete sense.
The post Quantum Technology News – Issue #9 appeared first on ANASTASIA MARCHENKOVA.
Independent analyses by Dr. Scott Aaronson show that while the results indicate D-Wave is becoming more mature and probably exhibits quantum effects, the technology suffers from being built on older quantum computing techniques that we now know are not optimal. Additionally, Selby’s algorithm (a classical algorithm on a classical computer) still outperforms D-Wave.
In the cryptography community, even though the D-Wave computer cannot break modern cryptography, the fact that the NSA is preparing to move to quantum-resistant algorithms shows that there is concern over either the acceleration of quantum computing research or problems with existing ECC algorithms (or both).
Australia continues to fund quantum computing research and announce breakthroughs weekly, this time a quantum computer code that allows for the beginning of a full-stack solution to controlling and programming a quantum computer, built on top of last month’s breakthrough, where UNSW showed the first logic gate in a silicon chip.






The post Quantum Technology News – Issue #8 appeared first on ANASTASIA MARCHENKOVA.
Additionally, the Microsoft quantum simulator has been released!
Watch this space for more articles and reviews of Microsoft’s Language-Integrated Quantum Operations (LIQUi|>) simulator. We are working on testing the system on some use cases and will evaluate the power of the simulator not only for academic research, but also for practical, commercially available quantum computing. I’ve included a link to the Microsoft GitHub for you to download.






The post Quantum Technology News – Issue #7 appeared first on ANASTASIA MARCHENKOVA.
The NSA’s decision to move toward quantum-safe cryptography seems to be a response to this rapid increase in funding and research.




