World's First Complete Design Of A Silicon Quantum Computer Chip Unveiled


Researchers at the University of New South Wales (UNSW) have come up with a new chip architecture that uses standard silicon semiconductor technology (the same kind used in conventional processors) to perform quantum calculations. The result is the first complete design for a silicon quantum computer chip, demonstrating that the power of this technology can be unlocked using the same type of conventional manufacturing components.

Practical quantum computing has been in the news all year, with significant advances at the theoretical level. Of course, putting the theory into practice is the hardest part of the process, and engineers and researchers have been running up against this wall for years.

Quantum computing: a bit of theory

Right now, the “normal” chips in technological products (such as PCs and smartphones) store information as binary bits: ones or zeros. The system works quite well, but it naturally limits the amount of data that can be processed at once.

Qubits, on the other hand, can be in state 1, state 0, or a superposition of both at the same time, which is what gives quantum computers their overwhelming computing power. When calculations are made using qubits, the number of possibilities represented grows exponentially.

Two qubits can exist simultaneously in all four two-bit combinations (00, 01, 10 and 11). With three qubits, you can represent all eight three-bit combinations (000, 001, 010, 011, 100, 101, 110, 111), again all at the same time. Now imagine 40 qubits: they span more than a trillion binary combinations, and operations can act on each of those numbers separately and simultaneously. That is an enormous amount of processing power, applied in parallel. This, explained a bit roughly, is quantum computing.
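The counting above is easy to check for yourself: n qubits span 2^n basis states. A minimal Python sketch (purely illustrative; the function name is ours, not part of the UNSW work):

```python
def basis_states(n_qubits: int) -> int:
    """Number of classical basis states an n-qubit register spans."""
    return 2 ** n_qubits

# Two qubits cover the four two-bit patterns 00, 01, 10, 11.
print(basis_states(2))   # 4
# Three qubits cover all eight three-bit patterns.
print(basis_states(3))   # 8
# Forty qubits already span more than a trillion states.
print(basis_states(40))  # 1099511627776
```

Each added qubit doubles the number of states, which is why even a modest qubit count quickly outgrows anything a classical register can enumerate one value at a time.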

Of course, the tricky thing about quantum computing is that it exploits a peculiarity of reality: particles exist in what physicists call a “fog of possibilities” until they interact with a system that defines their properties. This fog of possibilities has enormously useful mathematical characteristics if you know how to manage it and, above all, if you know exactly what you are looking for.

While traditional computing is binary, represented by ones and zeros, quantum computing allows complex layers of states to be created to represent the enormous spectrum of possibilities it offers. In other words, it is extremely complicated to represent.

The problem is that this fog of possibilities is delicate for precisely that reason. A qubit cannot be “measured” directly, at least not in the strict sense, and it tends to “collapse” when you try. Hundreds of thousands of qubits are needed for any calculation to be worthwhile without unwanted collapses ruining it. To ensure that unstable qubits do not introduce errors, a very robust error correction code must be organized, which further complicates every kind of calculation.
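The collapse on measurement can be mimicked with a toy single-qubit simulation. This is a classical sketch under our own simplifying assumptions (real amplitudes, no phases), not anything from the UNSW design:

```python
import math
import random

def measure(amplitudes):
    """Toy measurement of one qubit given (alpha, beta) amplitudes.

    The outcome 0 is drawn with probability |alpha|^2 and 1 with
    probability |beta|^2; the state then collapses to the matching
    basis vector, losing the original superposition.
    """
    alpha, beta = amplitudes
    p_zero = abs(alpha) ** 2
    outcome = 0 if random.random() < p_zero else 1
    collapsed = (1.0, 0.0) if outcome == 0 else (0.0, 1.0)
    return outcome, collapsed

# Equal superposition: 0 and 1 are equally likely until measured.
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))
bit, state = measure(plus)
print(bit, state)
```

Note that after the call the returned state is a plain basis vector: the information held in the superposition is gone, which is exactly why unwanted collapses are errors that a quantum error correction code has to guard against.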

The first complete quantum processor

As we said at the beginning, the UNSW researchers have managed to create a complete quantum processor design using conventional materials. It is the first time this has been achieved, after many previous unsuccessful attempts, so we are talking about a genuine milestone.

So, what do you think about this? Simply share all your views and thoughts in the comment section below.


