These waves of economic development are driven by inventions, themselves typically driven by technological breakthroughs.
One of the dominating features of the economy over the last fifty years has been Moore's law, which has led to exponential growth in computing power and an exponential drop in its cost.
What is Moore's law?
Today’s reason for the continuation of Moore’s law
Graphics Processing Units (GPUs) are today's reason for the continuation of Moore's law. GPUs, while first introduced for the fast computation of graphics in computer games, allow massively parallel processing of problems on thousands of cores (current Nvidia GPUs have up to 6,000 cores). GPU producers reach this high number by reducing the computational flexibility of single cores. GPUs are therefore often seen as accelerators: developers still rely on CPUs for complex tasks, but whenever an algorithm requires a high number of repetitive operations on a regular dataset, the CPU assigns the work to the GPU.
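The accelerator pattern above can be sketched in Python: an identical arithmetic operation applied independently to every element of a regular dataset is exactly the shape of work a CPU hands off to a GPU. NumPy below runs on the CPU, but GPU array libraries such as CuPy expose the same vectorized interface; the function names here are illustrative, not from any particular framework.

```python
import numpy as np

# A "regular dataset": one million independent values.
data = np.arange(1_000_000, dtype=np.float64)

def scale_loop(xs):
    # Sequential version: one core processes the values one by one.
    return [x * 1.5 + 10.0 for x in xs]

def scale_vectorized(xs):
    # Data-parallel version: the same operation on every element at once.
    # On a GPU, thousands of cores would each handle a slice of the array.
    return xs * 1.5 + 10.0

result = scale_vectorized(data)
# Both versions compute the same values; only the execution model differs.
```

With CuPy, swapping `import numpy as np` for `import cupy as np` would run the vectorized version on a GPU unchanged, which is why this kind of regular, repetitive workload benefits so directly from the accelerator model.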
We see that the algorithms that benefit most from this type of hybrid hardware are also the ones showing the most impressive results: neural networks in machine learning and the proof-of-work problems used in blockchains.
In parallel, the industry has started to develop specialized chips for designated fields of application: Tensor Processing Units (TPUs), developed by Google, exist for the sole purpose of accelerating machine learning algorithms, and Nvidia ships comparable tensor cores in its GPUs. Apple follows a hybrid approach in the chip architecture currently used in iPhones: energy-efficient cores are used whenever the smartphone is idle and only tracks data; when it is in use, high-performance cores are activated. In recent supercomputer clusters, hundreds of thousands of high-performance, energy-hungry processors run parallel algorithms to reach performance in the exaflop range.
Its impact on the future
The developments we describe are the ones we will most likely see continue in the future.
While transistor sizes are still being reduced, the cycles have slowed (the 2 nm barrier is expected to be reached in 2025), and Moore's law has changed from a general advancement of chips into a matter of performance improvement through specialization.
As an emerging technology, quantum computers (QCs) will also be applied to a specific set of problems. While quantum computers today are mostly an abstract concept (current efforts to build an actual quantum computer focus on finding a physically stable implementation), physics and theoretical computer science already allow a peek into the future:
In quantum physics, states become a matter of probability, which allows a quantum bit to be in a superposition of its two basis states (superposition principle), rather than in exactly one of two states like a classical bit. With suitable algorithms (Shor's algorithm, Grover's algorithm), the quantum bits can be driven towards representing the most probable solution to a problem. Whether the result of a quantum computation is the actual solution remains uncertain, so it always has to be verified by a classical computer.
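The superposition principle can be stated compactly: a single qubit's state is a weighted combination of the two basis states, and measuring it yields each basis state with a probability given by the squared amplitude:

```latex
\[
  \lvert \psi \rangle \;=\; \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1,
\]
\[
  P(\text{measure } 0) = \lvert \alpha \rvert^{2},
  \qquad
  P(\text{measure } 1) = \lvert \beta \rvert^{2}.
\]
```

Algorithms like Grover's work by repeatedly shifting amplitude towards the basis state that encodes the solution, so that the final measurement returns it with high probability.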
With these properties, quantum computers are perfectly suited for problems where classical computers can do no better than guessing the solution. We will see that quantum computers change encryption as we know it, since today an encryption key is considered safe precisely when it can only be guessed. On the other hand, for search problems as they occur in databases or in machine learning, we will see a large performance improvement.
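The quadratic advantage for search can be made concrete with a small classical simulation of the arithmetic behind Grover's algorithm. This NumPy sketch simulates the full state vector (which is only feasible for a handful of qubits); the function name and parameters are illustrative, not from any quantum library.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's search over 2**n_qubits entries on a state vector."""
    n = 2 ** n_qubits
    # Start in the uniform superposition: every entry equally likely.
    state = np.full(n, 1.0 / np.sqrt(n))
    # The optimal number of iterations is about (pi/4) * sqrt(N).
    iterations = int(np.floor(np.pi / 4 * np.sqrt(n)))
    for _ in range(iterations):
        # Oracle: flip the sign of the amplitude of the marked state.
        state[marked] *= -1.0
        # Diffusion: reflect every amplitude about the mean amplitude.
        state = 2 * state.mean() - state
    # Measurement probabilities are the squared amplitudes.
    return int(np.argmax(state ** 2)), iterations

guess, steps = grover_search(10, marked=421)
print(guess, steps)  # → 421 25
```

For a search space of 2^10 = 1,024 entries, about 25 Grover iterations suffice, whereas classical guessing needs on the order of 512 attempts on average; this sqrt(N) versus N gap is the quadratic speedup for search problems.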