The Origins of the Digital Universe
“It is possible to invent a single machine which can be used to compute any computable sequence,” twenty-four-year-old Alan Turing announced in 1936. In Turing’s Cathedral, George Dyson focuses on a small group of men and women, led by John von Neumann at the Institute for Advanced Study in Princeton, New Jersey, who built one of the first computers to realize Alan Turing’s vision of a Universal Machine. Their work would break the distinction between numbers that mean things and numbers that do things—and our universe would never be the same.
Using five kilobytes of memory (the amount allocated to displaying the cursor on a computer desktop of today), they achieved unprecedented success in both weather prediction and nuclear weapons design, while tackling, in their spare time, problems ranging from the evolution of viruses to the evolution of stars.
Dyson’s account, both historic and prophetic, sheds important new light on how the digital universe exploded in the aftermath of World War II. The proliferation of both codes and machines was paralleled by two historic developments: the decoding of self-replicating sequences in biology and the invention of the hydrogen bomb. It’s no coincidence that the most destructive and the most constructive of human inventions appeared at exactly the same time.
How did code take over the world? In retracing how Alan Turing’s one-dimensional model became John von Neumann’s two-dimensional implementation, Turing’s Cathedral offers a series of provocative suggestions as to where the digital universe, now fully three-dimensional, may be heading next.
There are some really wild analogies in this book that I think will stay with me for the rest of my life: the idea that computer code could be the means by which aliens travel to Earth; the idea that code is using humans to self-replicate; the concept that computers are used first to model events and eventually to run them (e.g., computer trading); and finally, the idea that humanity might outsource its own governance to computers, completing what appears to be the inevitable domination of code and computers over humans. Yes, there are the obligatory history lessons and detailed personal portraits of the main characters, but the author really shines, and captivated this reader, with the computer-code-as-"selfish gene" meme and the wild implications that concept may have for humanity. Thanks for a great read, George.
I just finished chapter 2. It was appalling!
For no apparent reason it diverged into a drab history lesson, unrelated to the book's topic as far as the reader can tell.
We had been led down the path of understanding Turing and von Neumann, and all of a sudden we are in some boring sidetrack of a dissertation interesting only to the author. If the rest of the book is somehow related to this history, then give us a clue, for heaven's sake!