Universality and Computation

Universality is a concept introduced by David Deutsch to describe how an incremental improvement in a system of knowledge or technology can suddenly produce a large jump in reach, making it a universal system in the relevant domain.

Deutsch speculates that to appreciate universality when it occurs, one must either value abstract knowledge for its own sake or expect it to yield unforeseeable benefits someday.  During the Enlightenment, societies began to value progress as both desirable and attainable, and thus to value universality itself.  Before that period, most innovations were aimed at solving parochial problems and contained arbitrary limitations.
 
Movable-type printing is an example of a jump to universality.  Before Gutenberg's printing press, printing a document required specialized objects: whole-page printing plates made for that specific document.  Movable type allowed any document to be printed with the same set of objects (individual letter blocks), so any idea could be transferred to paper, reproduced quickly, and spread widely.

The most recent significant jump to universality has come with computers.  Although Charles Babbage and Ada Lovelace conceived of the first universal computer, the mechanical Analytical Engine, in the 1830s, it would take more than a century for anyone to realize its potential.  Alan Turing developed the definitive theory of universal classical computation in 1936.
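
What makes a computer universal in Turing's sense is that the machine being run is just data: one fixed program can execute any other machine from a description of its rules.  The Python sketch below is an illustration only, not anything from Turing or Deutsch; the rule format is made up for this example.

```python
def run(rules, tape, state="start", max_steps=10_000):
    """A fixed, general-purpose interpreter: executes ANY Turing machine
    described by `rules`, a dict mapping (state, symbol) to
    (symbol_to_write, head_move, next_state).  Universality means this one
    function never changes; only the data describing the machine does."""
    cells = dict(enumerate(tape))  # sparse tape; "_" is the blank symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, "_")
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip("_")

# One particular machine, expressed purely as data: increment a binary number.
increment = {
    ("start", "0"): ("0", "R", "start"),  # scan right over the digits
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "add"),    # past the end: turn around and add 1
    ("add", "1"): ("0", "L", "add"),      # 1 + 1 = 0, carry continues left
    ("add", "0"): ("1", "L", "halt"),     # 0 + 1 = 1, done
    ("add", "_"): ("1", "L", "halt"),     # carried past the leftmost digit
}

print(run(increment, "1011"))  # -> 1100  (binary 11 + 1 = 12)
```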


Even during the Second World War, when the first universal computers were built, they were used only to solve parochial problems such as code breaking and calculating artillery trajectories.  It wasn't until after the war, when the Americans adapted ENIAC to a diverse range of other problems, that the jump to universality in computation occurred as Turing's theory had outlined.  Today, with miniaturization and the invention of programmable microprocessors, universal computers have become ubiquitous.

An intriguing facet of universality in computation is that it can occur only in digital computers, because of the need for error correction.  In an analog system, the small errors introduced at each step accumulate until the computation wanders off its intended path; a digital system, by contrast, snaps every value back to one of a discrete set of symbols at each step, discarding small errors before they can compound.  Error correction is essential in processes of potentially unlimited length, so the jump to universality can occur reliably only in digital systems.  As Deutsch states: "Error-correction is the beginning of infinity".
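
A small Python sketch can make the contrast concrete (an illustration only; the noise level and the two-symbol alphabet are made-up parameters).  The "analog" value carries every step's tiny error forward, while the "digital" value is corrected back to the nearest allowed symbol after each step.

```python
import random

random.seed(0)

STEPS = 10_000
NOISE = 0.01          # per-step error, small relative to the symbol spacing
LEVELS = [0.0, 1.0]   # the discrete symbols a digital system allows

def correct(x):
    """Digital error correction: snap a noisy value back to the nearest
    allowed symbol, discarding the error accumulated this step."""
    return min(LEVELS, key=lambda level: abs(x - level))

analog = 1.0   # noise is carried forward on every step
digital = 1.0  # noise is corrected away on every step

for _ in range(STEPS):
    noise = random.uniform(-NOISE, NOISE)
    analog += noise                      # errors accumulate: a random walk
    digital = correct(digital + noise)   # error removed before it compounds

print(f"analog after {STEPS:,} steps:  {analog:.3f}")   # has drifted from 1.0
print(f"digital after {STEPS:,} steps: {digital:.3f}")  # still exactly 1.000
```

As long as each step's error stays below half the spacing between symbols, the digital value is restored exactly; the analog value, with nothing to restore it to, drifts ever further from its starting point.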

The universality of computation follows from Alan Turing's widely accepted theory and led Deutsch to a stronger, physical form of the Church-Turing thesis, now known as the Church-Turing-Deutsch principle.  It states that "a universal computing device can simulate every physical process".  This is a central principle of Deutsch's work that, if correct, implies at a fundamental level that all of physical reality must be comprehensible. [1]

This has intriguing implications for the origin of life and the mysterious jump to universality enabled by the genetic code.
 
