
Posted by Dan Akister on 09-Jun-2022 13:05:24

Quantum computing – where are we?


Quantum computing is the next step in technological evolution, and one likely to reshape our entire future. We’ve discussed it previously on the blog, but given that some time has passed – where are we now with its development?

Power consumption
One of our favourite topics! Quantum computing as it stands needs an inordinate amount of power to run a quantum setup – not so much for the calculation itself as for everything around it. Superconducting qubits must be cooled to within a few hundredths of a degree of absolute zero, and the lasers and microwave pulses that implement quantum gates demand extreme precision, all of which consumes power around the clock. We’re currently in the realms of it being almost physically impractical to keep scaling without redeveloping how power is delivered and distributed across these setups. Requiring so much power, in a configuration that needs so much scientific theory to make quantum computing possible, restricts continued development to organisations with huge resources.
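To give a feel for the cooling side of that bill, here’s a rough back-of-the-envelope sketch in Python. It uses only the idealised Carnot limit, and the 20 mK operating temperature and 1 mW heat load are illustrative assumptions rather than figures from any particular machine; real dilution refrigerators fall far short of this ideal.

```python
# A rough, illustrative calculation of why keeping qubits cold dominates
# the power bill. It uses the idealised Carnot limit only; real dilution
# refrigerators are orders of magnitude less efficient, and the 20 mK
# temperature and 1 mW heat load below are assumptions for illustration.
T_HOT = 300.0   # room temperature, in kelvin
T_COLD = 0.02   # superconducting-qubit operating temperature, ~20 mK

# Carnot coefficient of performance: heat removed per unit of work input.
cop = T_COLD / (T_HOT - T_COLD)

heat_load_w = 1e-3  # assume 1 mW of heat leaking into the coldest stage
min_wall_power_w = heat_load_w / cop
print(f"Carnot-limited wall power: {min_wall_power_w:.1f} W per mW of heat")
# Prints ~15.0 W per mW even at the theoretical optimum; every control
# line, amplifier and laser that adds heat multiplies this cost.
```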

Technological advancement
The more qubits the better. Research groups leapfrog one another by designing and manufacturing chips with ever more qubits, allowing more random sampling, more calculations per second and easier access to the data read out. What we see here is companies like Google and IBM consistently outdoing each other with newer, more capable and more efficient chips. Google’s Sycamore processor, developed in 2019, has 53 qubits. IBM currently leads the race: its Eagle processor, unveiled in late 2021, has 127 qubits, and the company has promised a processor with over 1,000 qubits by the end of 2023, which would dramatically expand what researchers can test and allow for much more advancement in the field.
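To put the qubit race in perspective, here’s a tiny sketch in plain Python (no quantum SDK assumed): describing n qubits exactly takes 2^n complex amplitudes, which is why each extra qubit counts and why these machines quickly escape brute-force classical simulation.

```python
# A toy illustration (plain Python, no quantum SDK assumed) of why each
# extra qubit matters: describing n qubits exactly takes 2**n complex
# amplitudes, so the state space doubles with every qubit added and
# brute-force classical simulation falls over around the 50-qubit mark.

def state_vector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits exactly."""
    return 2 ** n_qubits

# 53 qubits ~ Sycamore; 1,000+ is IBM's stated end-of-2023 target.
for n in (10, 30, 53, 127, 1000):
    amplitudes = state_vector_size(n)
    gib = amplitudes * 16 / 2**30  # 16 bytes per complex128 amplitude
    print(f"{n:>4} qubits -> {amplitudes:.3e} amplitudes (~{gib:.3e} GiB)")
```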

Data interpretation
One of the biggest challenges facing quantum computing is that the data coming out is almost unintelligible, or requires even more processing to interpret than it took to produce in the first place. A quantum processor doesn’t hand back a neat answer: each run yields a string of measured bits sampled from a distribution over an astronomical number of possible outcomes, so the raw output means nothing to the human eye until it has been translated. The volume is so large, and interpreting it demands such deep understanding, that the programs used to make sense of it need a great deal of development in their own right. Information has been “teleported” across quantum nodes, which allows for quantum key exchanges – helpful for moving information quickly to wherever it will be interpreted – but as things stand, interpretation remains the biggest challenge standing between quantum computing and the mainstream.
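For the curious, the teleportation step can be illustrated with a few dozen lines of classical simulation. The sketch below is a minimal NumPy toy, assuming a three-qubit state vector and standard textbook gates rather than any real quantum hardware or vendor SDK: Alice sends an unknown single-qubit state to Bob using a shared entangled pair and two classical bits.

```python
# A minimal NumPy state-vector sketch of one-qubit teleportation, the
# protocol behind the node-to-node "teleporting" mentioned above. It is
# an illustrative toy (qubit 0 = Alice's unknown state, qubits 1 and 2 =
# a shared Bell pair), not any vendor's implementation.
import numpy as np

rng = np.random.default_rng()

def apply_gate(state, gate, qubit, n=3):
    """Apply a single-qubit gate to `qubit` of an n-qubit state vector."""
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, gate if q == qubit else np.eye(2))
    return op @ state

def apply_cnot(state, control, target, n=3):
    """Apply a CNOT by permuting basis-state amplitudes directly."""
    out = np.zeros_like(state)
    for i in range(2 ** n):
        bits = [(i >> (n - 1 - q)) & 1 for q in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - q) for q, b in enumerate(bits))
        out[j] = state[i]
    return out

def measure(state, qubit, n=3):
    """Measure one qubit, collapse the state, return (outcome, new state)."""
    probs = np.zeros(2)
    for i in range(2 ** n):
        probs[(i >> (n - 1 - qubit)) & 1] += abs(state[i]) ** 2
    outcome = rng.choice(2, p=probs)
    for i in range(2 ** n):
        if ((i >> (n - 1 - qubit)) & 1) != outcome:
            state[i] = 0
    return outcome, state / np.linalg.norm(state)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.array([[1.0, 0.0], [0.0, -1.0]])

psi = np.array([0.6, 0.8])  # the unknown state Alice wants to send
state = np.kron(psi, np.kron([1.0, 0.0], [1.0, 0.0])).astype(complex)

# Entangle qubits 1 (Alice) and 2 (Bob) into a Bell pair.
state = apply_gate(state, H, 1)
state = apply_cnot(state, 1, 2)

# Alice's Bell-basis measurement on qubits 0 and 1.
state = apply_cnot(state, 0, 1)
state = apply_gate(state, H, 0)
m0, state = measure(state, 0)
m1, state = measure(state, 1)

# Bob fixes up his qubit using Alice's two classical bits.
if m1:
    state = apply_gate(state, X, 2)
if m0:
    state = apply_gate(state, Z, 2)

# Bob's qubit (qubit 2) now carries |psi> exactly.
bob = state.reshape(2, 2, 2)[m0, m1, :]
print("sent:    ", psi)
print("received:", np.round(bob.real, 6))
```

Note that the corrections depend on two classical measurement bits, which is why an ordinary channel is still needed alongside the entangled pair.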

Topics: new technology, quantum computing


Written by Dan Akister
