Scientists drew inspiration from the body’s power plants to create a stunningly energy-efficient supercomputer.
As we reported earlier, a team of scientists from McGill University in Montreal has laid out a process for creating a ‘biological supercomputer’ using the very same mechanisms the cells in our body use to operate. According to a report from RT, the development could lead to stunning improvements in energy efficiency compared to traditional supercomputers.
The biggest problem with present-day supercomputers is their insatiable appetite for energy – many larger systems even require their own power plant to achieve a reasonable level of energy efficiency. The heat generated by most supercomputers is a problem in itself, often requiring a built-in cooling system that consumes even more energy.
In light of this problem, the McGill team came up with a clever idea. Drawing inspiration from biological processes, they figured out that they could use adenosine triphosphate (ATP), one of the key drivers of energy transfer in the body’s cells.
Scientists developed a computer chip that measures just 1.5 cm, with a complex network of channels that can execute staggeringly complex calculations like a traditional supercomputer would. Instead of electrons propelled by an electrical charge, however, short protein strings, referred to in the study as biological agents, carry messages around the chip – and it’s all powered by ATP.
The new chip generates a tiny fraction of the heat an electrical circuit would, meaning that the model biological supercomputer would require significantly less energy than its traditional counterparts. The development makes high-level computing significantly more sustainable, and could usher in a new era of clean computers.
The computer, which is still in its early phases, was able to solve a complex classical mathematical problem using parallel computing, the same method employed by today’s supercomputers. While this feat is a huge accomplishment in itself, the scientists say that there is still much work to be done before the biological supercomputer can take on conventional supercomputers in other areas.
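For context, the class of problem the chip tackled can be sketched in software. In the chip, each agent making a series of go-straight-or-turn choices at split junctions traces one candidate solution, and all agents explore the network simultaneously. The sketch below mimics that idea for a subset-sum-style task by enumerating every possible sequence of include/skip choices; the specific input values are illustrative, not taken from the study.

```python
from itertools import product

def subset_sums(values):
    """Enumerate every path through a hypothetical junction network.

    Each 'agent' makes a binary choice per element: add its value to
    the running total (1) or skip it (0). The set of distinct exit
    totals is the set of achievable subset sums.
    """
    sums = set()
    for choices in product([0, 1], repeat=len(values)):
        sums.add(sum(v for v, c in zip(values, choices) if c))
    return sums

# Illustrative instance: which totals can be formed from {2, 5, 9}?
print(sorted(subset_sums([2, 5, 9])))
# -> [0, 2, 5, 7, 9, 11, 14, 16]
```

The software version checks the 2^n choice sequences one after another; the appeal of the biological chip is that its many protein agents explore those paths in parallel, at a far lower energy cost per operation.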
A press release from McGill describing the details of the study can be found here.