To date, conventional computers have processed information as bits (the smallest unit of information they work with, representing a 1 or a 0) using electrical pulses. The next big challenge is sending these signals at the speed of light using photons, which promises a dramatic increase in performance, giving the devices we use every day the power of a supercomputer.

Photonics and electronics have lived side by side in the processing and transmission of information for years. With the use of fibre optics, photons have been used for sending and receiving information, while the main role of electrons is carrying out logical and computational operations. All signs indicate that this division of labour will remain as is in the near future. "There are fundamental reasons for this," explains Rajesh Menon, a computer engineer at the University of Utah, in an interview with OpenMind magazine. "Less energy is needed to transmit information in the form of photons, while the waves associated with electrons are smaller," he says. In other words, light's greater speed comes with a larger wavelength, and therefore larger components.

Although the dream of a fully operational microchip with optical components is getting closer to becoming a reality, some significant barriers still need to be broken through first. After many years of experience and an enormous amount of resources devoted to electronics research, the transistors that make up microprocessors are already nanoscale. But photonics needs further exploration. "We need a better understanding of how light behaves on that scale," says Menon.

However, photonics definitely has a place in the race to create a quantum computer with stable processing capacity. This is what physicist Jeremy O'Brien has been working on with his company PsiQuantum, a five-year-old start-up that has just raised $215 million to create a prototype that uses the power of light.


Quantum computers: the holy grail

"As we began working on this architecture, it appeared that our machine would have to be the size of the Sierra Nevada mountain range," O'Brien explained in Bloomberg magazine. But after a series of research advances, they believe the first computer will be the size of an office conference room and will have one million qubits. To get an idea of the power this represents, a register of around 266 qubits can already encode more simultaneous states than there are atoms in the observable universe.
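The arithmetic behind that comparison is simple: a register of n qubits spans 2^n basis states, and the observable universe is commonly estimated to contain roughly 10^80 atoms. A minimal sketch (the function name and the atom-count constant are illustrative assumptions, not figures from the article):

```python
import math

def basis_states(n_qubits: int) -> int:
    """Number of basis states an n-qubit register can represent."""
    return 2 ** n_qubits

# Rough common estimate of atoms in the observable universe
ATOMS_IN_UNIVERSE = 10 ** 80

# Smallest register whose state space exceeds that count
n = math.ceil(math.log2(ATOMS_IN_UNIVERSE))
print(n)                                      # 266
print(basis_states(n) > ATOMS_IN_UNIVERSE)    # True
```

By the same measure, PsiQuantum's target of one million qubits corresponds to a state space of 2^1,000,000 — a number with over 300,000 digits.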

Right now, the whole computing world is immersed in the revolution that is the jump from bits to qubits. Big companies like Google and IBM, along with others like PsiQuantum, are locked in a frenetic competition to be the first to build a computer using this technology (something that has already been achieved) and to make it stable. The latter is the biggest challenge because, until now, qubits have proved too fragile: their state is disturbed by the slightest external interference.

The idea behind all of this is to go from a processing system with just two states (0 and 1) to one where something can be both 0 and 1 at the same time, or part of each. This is the superposition principle, which occurs at the subatomic level and makes it possible to process or store different states simultaneously.

In statements made to the BBC, physicist Alejandro Pozas-Kerstjens of the Institute of Photonic Sciences (ICFO) in Barcelona said that "it's a revolution similar to the one that led to the first computers being developed". It has countless applications in real life, "from things that are very important right now, such as designing medications or their prototypes, to choosing the optimal driving route to spend less on petrol," says Pozas-Kerstjens.