This is just incredibly cool as well as important. The authors show artificial life spontaneously emerging from a soup of random code, under a range of conditions.
Now we’re in a position to ask: In a universe capable of computation, how often will life arise? Clearly, it happened here. Was it a miracle, an inevitability, or somewhere in between? A few collaborators and I set out to explore this question in late 2023.
Our first experiments used an esoteric programming language called (apologies) Brainfuck. While not as minimal as SUBLEQ, Brainfuck is both very simple and very similar to the original Turing machine. Like a Turing machine, it involves a read/write head that can step left or right along a tape.
In our version, which we call “bff,” there’s a “soup” containing thousands of tapes, each of which includes both code and data. The tapes are of fixed length—64 bytes—and start off filled with random bytes. Then, they interact at random, over and over. In an interaction, two randomly selected tapes are stuck end to end, creating a 128-byte-long string, and this combined tape is run, potentially modifying itself. The 64-byte-long halves are then pulled back apart and dropped back into the soup. Once in a while, a byte value is randomized, as cosmic rays do to DNA.
Since bff has only seven instructions, represented by the characters “< > + - , [ ]”, and there are 256 possible byte values, following random initialization only about 2.7 percent (7/256) of the bytes in a given tape will contain valid instructions; any non-instructions are skipped over. Thus, at first, not much comes of interactions between tapes. Once in a while, a valid instruction will modify a byte, and this modification will persist in the soup. On average, though, only a couple of computational operations take place per interaction, and usually they have no effect. In other words, while computation is possible in this toy universe, very little of it actually takes place. When a byte is altered, it’s likely due to random mutation, and even when it’s caused by the execution of a valid instruction, the alteration is arbitrary and purposeless.
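To make the setup concrete, here is a minimal Python sketch of one soup interaction. It is a sketch under assumptions, not the paper’s implementation: I use a simplified single-head interpreter over the seven characters above, treat “,” as a no-op (bff’s actual semantics differ in details), and halt on unmatched brackets. The names `run_tape` and `soup_epoch` are mine.

```python
import random

def matching_bracket(tape, ip):
    """Scan for the bracket matching tape[ip]; returns len(tape) if unmatched."""
    depth, step = 0, 1 if tape[ip] == ord('[') else -1
    j = ip
    while 0 <= j < len(tape):
        if tape[j] == ord('['):
            depth += 1
        elif tape[j] == ord(']'):
            depth -= 1
        if depth == 0:
            return j
        j += step
    return len(tape)  # unmatched bracket: halt execution

def run_tape(tape, max_steps=10_000):
    """Execute a (self-modifying) tape; returns the number of steps taken."""
    ip = head = steps = 0
    while ip < len(tape) and steps < max_steps:
        op = tape[ip]
        if op == ord('<'):
            head = (head - 1) % len(tape)
        elif op == ord('>'):
            head = (head + 1) % len(tape)
        elif op == ord('+'):
            tape[head] = (tape[head] + 1) % 256
        elif op == ord('-'):
            tape[head] = (tape[head] - 1) % 256
        elif op == ord('['):
            if tape[head] == 0:
                ip = matching_bracket(tape, ip)  # jump past the loop
        elif op == ord(']'):
            if tape[head] != 0:
                ip = matching_bracket(tape, ip)  # jump back to loop start
        # ',' (a no-op in this sketch) and the other 249 byte values
        # are simply skipped over, as described above
        ip += 1
        steps += 1
    return steps

def soup_epoch(soup, mutation_prob=0.001):
    """One interaction: concatenate two random tapes, run, split, mutate."""
    a, b = random.sample(range(len(soup)), 2)
    combined = bytearray(soup[a]) + bytearray(soup[b])  # 128-byte combined tape
    run_tape(combined)              # the combined tape may rewrite itself
    soup[a], soup[b] = combined[:64], combined[64:]
    if random.random() < mutation_prob:  # the occasional "cosmic ray"
        t = random.choice(soup)
        t[random.randrange(64)] = random.randrange(256)
```

Note that code and data share one tape here, so a running program can overwrite its own instructions; that self-modification is what lets replicators emerge at all.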
But after a few million interactions, something magical happens: The tapes begin to reproduce. As they spawn copies of themselves and each other, randomness gives way to complex order. The amount of computation taking place in each interaction skyrockets, since—remember—reproduction requires computation. Two of Brainfuck’s seven instructions, “[” and “]”, are dedicated to conditional branching and define loops in the code; reproduction requires at least one such loop (“copy bytes until done”), causing the number of instructions executed in an interaction to climb into the hundreds, at minimum.
The code is no longer random, but obviously purposive, in the sense that its function can be analyzed and reverse-engineered. An unlucky mutation can break it, rendering it unable to reproduce. Over time, the code evolves clever strategies to increase its robustness to such damage. This emergence of function and purpose is just like what we see in organic life at every scale; it’s why, for instance, we’re able to talk about the function of the circulatory system, a kidney, or a mitochondrion, and how they can “fail”—even though nobody designed these systems.
We reproduced our basic result with a variety of other programming languages and environments. In one especially beautiful visualization, my colleague Alex Mordvintsev created a two-dimensional bff-like environment where each of a 200×200 array of “pixels” contains a tape, and interactions occur only between neighboring tapes on the grid. The tapes are interpreted as instructions for the iconic Zilog Z80 microprocessor, launched in 1976 and used in many 8-bit computers over the years (including the Sinclair ZX Spectrum, Osborne 1, and TRS-80). Here, too, complex replicators soon emerge out of the random interactions, evolving and spreading across the grid in successive waves.
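For flavor, here is one way neighborhood-limited pairing on such a grid might be sampled; the four-cell neighborhood and toroidal wrap-around are my assumptions for illustration, not details taken from the actual visualization:

```python
import random

def random_neighbor_pair(n=200):
    """Pick a random cell and one of its four neighbors on an n-by-n grid.

    Assumes a von Neumann neighborhood with toroidal wrap-around; the
    visualization's actual pairing scheme may differ.
    """
    x, y = random.randrange(n), random.randrange(n)
    dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    return (x, y), ((x + dx) % n, (y + dy) % n)
```

Restricting interactions to neighbors is what produces the wave-like spatial dynamics: a successful replicator can only spread cell by cell across the grid, rather than jumping anywhere in the soup.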
Their main output is a paper, but I strongly recommend starting with the lead author’s more generally accessible article on the work: “In the Beginning, There Was Computation.”
(hat tip to Peter Watts)