Tech Talk: What’s old is new again – quantum computing explained

Quantum computing is the next frontier of machines that think not in bytes but in qubits, the quantum version of the classical binary bit. (Image courtesy flickr.com)

Something that’s been around for decades isn’t usually considered cutting edge, but quantum computing is clearly the next step in the evolution of our daily digital lives.

Typically, the chip, or brain, in your smartphone is about a centimeter square. It has a small section in the middle made up of around 300 million transistors, with connections spreading out like fingers to talk to the screen, the camera, the battery and more, writes Parmy Olson for Forbes.com.



But imagine a chip with no transistors at all, and instead a small chamber that controls the processes and energy levels inside atoms.

This is quantum computing, the next frontier of machines that think not in bytes but in powerful qubits.
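For readers curious what "thinking in qubits" means, the standard picture is that a qubit can hold a blend, or superposition, of the classical values 0 and 1 at once. The toy Python sketch below illustrates that idea; it is a simplified textbook model, not a description of any particular company's hardware.

```python
import math

# Toy model of a single qubit: its state is a pair of amplitudes
# (alpha, beta), one for the classical value 0 and one for 1,
# normalized so that |alpha|^2 + |beta|^2 = 1.

def qubit(alpha, beta):
    """Return a normalized qubit state from two raw amplitudes."""
    norm = math.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
    return (alpha / norm, beta / norm)

def measure_probabilities(state):
    """Probabilities of reading 0 or 1 when the qubit is measured."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

# An equal superposition: until measured, the qubit is "both" 0 and 1,
# and a measurement returns each value with 50% probability.
plus = qubit(1, 1)
p0, p1 = measure_probabilities(plus)
```

Unlike a classical bit, which is always definitely 0 or definitely 1, the amplitudes let n qubits represent 2^n possibilities at once, which is where quantum computing's promised speedups come from.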

It sounds cutting-edge, but scientists have been studying the theory of quantum computing for 30 years, and some say the first mainstream applications are just around the corner.

M Squared’s expertise in ultra-precise lasers means it can already make chips and arrays with 5 qubits on them.

To read the complete story, click here.

