Episode Transcript
[00:00:07] Speaker A: This is Aaron. This is Michael, and you're listening to.
[00:00:11] Speaker B: The Nathans and Roncast. Roncat.
[00:00:15] Speaker C: Far sea croons itself to rest. Well, actually, you heard about Michael Roncat.
[00:00:20] Speaker A: Michael Roncat.
[00:00:21] Speaker C: Michael Roncat does approve of these podcasts, by the way. I got word from my friend Dave that his cat approved a pale cloud turn.
[00:00:30] Speaker A: Well, this is a really poignant story that we're going to tell today. This is the story of the late, great Alan Turing.
And, boy, what's there to say? I wrote this song based off of an apology from former British Prime Minister Gordon Brown. He was the prime minister of the United Kingdom back then; this was maybe twelve years ago. It was the formal apology that he made because Alan Turing, who was a war hero and the father of computer science, was persecuted because he was gay. And he may have single-handedly ended World War II with the technology that he created to intercept the German
[00:01:30] Speaker D: Code.
[00:01:32] Speaker A: That the ships were using to communicate with each other. This song is the story of Alan Turing's life and the great things that he did, and then his downfall and eventual suicide because he was persecuted for being gay.
[00:01:52] Speaker C: So this song really will break your heart, because it's something that we humans do, as governments or as individuals or as institutions. There are always bad choices, and sometimes people who have saved lives will be punished unfairly. And because of that, it's a simple message.
It's an apology. It's just a straight apology. And it's like: you deserve better. We want you to know that you deserve better. And we're just going to play a little snippet of the song. Some sections are just, you know, stark. We have guitar, we have some cello, we have some vocals, some harmony.
[00:02:42] Speaker A: It's probably the most sparse song on the record.
[00:02:46] Speaker C: Yeah.
And the cello is perfect for it because it emotes that thing that cellos do.
[00:02:55] Speaker D: Yeah.
[00:02:56] Speaker A: So let's play it.
I'm really excited that our guest today is George Dyson. He's the author of the book Turing's Cathedral, which is a fascinating look at how Alan Turing's idea for a mechanical computer ended up being realized. And he wrote this book in large part as a chronicle of the Institute for Advanced Study in Princeton, which is not Princeton University. It's a smaller organization that is kind of off on its own; you have to kind of know it's there to go looking for it. But some of the greatest thinkers in history came through the Institute for Advanced Study. It's a terrific book, and I'm really looking forward to you hearing this interview with the author, George Dyson.
All right, good morning. We are here with George Dyson, author of the fantastic book Turing's Cathedral, written and published in... was it 2012?
[00:04:35] Speaker D: 2012, yes.
[00:04:36] Speaker A: I've really enjoyed what I've read of it. It's a wonderful history of how the first Turing machine was realized. What did Alan Turing do, and what did he not do, in terms of the creation of the first universal computing machine?
[00:04:53] Speaker D: That's a deep question, because people have been thinking about computing for a long time, hundreds of years, in fact. But Alan Turing came along. I mean, he just was the right person at the right time. The way I view the history of computing is there's an Old Testament and there's a New Testament. There's Old Testament prophets like Leibniz and Hobbes in the 16th and 17th centuries, and then there were the New Testament prophets. So the Old Testament prophets sort of developed the logic, and the New Testament prophets developed the machines. And Alan Turing was the guy who spanned both those worlds. So he came into it as a pure logician, a very abstract mathematician, interested in sort of proving, or actually disproving, a mathematical point called the Entscheidungsproblem, or the decision problem. The question of whether there is any formal, systematic way that you could build a machine that would be able to look at a formula written in a mathematical language and tell you whether it's true or false. And I'm sort of garbling that, but that was a very deep, unresolved question. And Alan Turing, being absolutely young and not knowing what was impossible or how things were supposed to be done, came up with just an extraordinarily original way of disproving this hypothesis. And he did it by inventing an imaginary machine that could only do the simplest of things, but it could do them for as long as it took.
So, effectively, it was a machine that could do anything. If you could tell it what you wanted it to do, it could do it. And even that machine that could do anything could not answer the decision problem. But that was an abstract paper in logical mathematics.
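To make that idea concrete for readers, here is a minimal sketch in Python (our illustration, not Turing's own formalism) of such a machine: it can only read a symbol, write a symbol, move one cell, and change state, yet with the right rule table it can carry out any computation you can describe to it.

```python
# A minimal Turing machine simulator (illustrative sketch, not Turing's notation).
# The machine repeatedly reads the symbol under the head, consults a rule table,
# writes a symbol, moves one cell left or right, and changes state -- nothing more.

def run_turing_machine(rules, tape, state="start", blank="_", max_steps=10_000):
    """rules maps (state, symbol) -> (new_state, symbol_to_write, move), move in {-1, 0, +1}."""
    cells = dict(enumerate(tape))                 # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, cells[head], move = rules[(state, symbol)]
        head += move
    low, high = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(low, high + 1)).strip(blank)

# Example rule table: add one to a binary number (head starts at the leftmost digit).
increment = {
    ("start", "0"): ("start", "0", +1),   # scan right, looking for the end of the number
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),   # fell off the right end; go back and carry
    ("carry", "1"): ("carry", "0", -1),   # 1 plus a carry is 0, keep carrying left
    ("carry", "0"): ("halt",  "1",  0),   # absorb the carry and stop
    ("carry", "_"): ("halt",  "1",  0),   # carried past the left edge
}

print(run_turing_machine(increment, "1011"))  # binary 11 + 1 -> "1100"
```

Swap in a different rule table and the same loop computes something else entirely, which is the sense in which the one machine "could do anything" - and, as George says, even this universal machine cannot settle the decision problem.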
He came to Princeton right after that. Actually, the paper was published while he was here as a graduate student, and he complained to his mother that only two people replied to the paper, sort of wrote to him and said they wanted a copy.
So it seemed to be going nowhere, and he came to Princeton to actually work on something else deeper.
But then suddenly, that was the right paper at the right time, because of what happened in World War II with needing to break German encrypted messages. A lot of these questions, which seemed very abstract and logical, became important. Could you start building machines that could look at strings of codes and make sense of them? Again, I'm very much simplifying what happened, but just sort of through accidents of history, what Turing did in the abstract became very important in a concrete way.
[00:07:39] Speaker A: I'm a little confused on the point of when Turing built that machine that they showed in the movie that we saw at the museum in New York to crack the code, the Enigma machine code. Did that actually help lead to the creation of the first computer, or was that kind of an offshoot, kind of a spin off?
[00:08:00] Speaker D: No, I mean, that's all sort of mixed up in the Hollywood sense. You're talking about the mainstream film.
[00:08:07] Speaker A: Yeah, it was called the Bombe. Bombe, yeah, the Bombe.
[00:08:10] Speaker D: Okay, so the Bombe was actually invented and built by mathematicians in Poland, who, of course, were invaded by the Germans long before the war came to England. And so that was a reasonably simple machine. If you're trying to reverse engineer an Enigma machine, which has a sequence of rotors, and the rotors have electrical contacts, it's like a combination lock sort of in reverse - a combination lock for strings of code. And the Bombe was just enormous. In fact, IBM helped build them when they were sort of mass-produced. So it was really an information processing machine that ran through enormous, astronomical numbers of possible combinations, and it sort of worked with the people. The people would give it clues, and then it would help amplify the guesses of the people. So that really had very little to do with modern computing. There was a later machine that was influenced by Turing's ideas called Colossus, which came in much later in the war, because the Germans realized their codes were being cracked, so they developed more complicated digital codes - this sort of arms race between the people encrypting the messages and the people trying to decrypt them. So the Bombe was very early in the war, and Colossus was much later. I don't understand why they didn't use Colossus in the film, because it used vacuum tubes; it was electronic. The Bombe was strictly mechanical, just wheels and rotors and motors, whereas the Colossus actually had punched paper tape and a very crude form of electronic memory to, again, sort of match up these different sequences and see where there was an overlap between them.
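As a rough illustration of what "running through astronomical numbers of combinations" with human-supplied clues means, here is a toy sketch in Python. It uses a simple position-dependent shift cipher as a stand-in (a real Enigma has rotors, a plugboard, and a reflector, and the real Bombe exploited cleverer logical contradictions), but the shape of the search is the same: a guessed word, a "crib," rules out every key setting that could not have produced the intercepted ciphertext.

```python
# Toy crib-based key search (illustrative only; the cipher below is a simple
# rotating-shift cipher, not an Enigma, and the example names are ours).

ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encipher(plaintext, setting):
    """Shift each letter by (setting + position): a crude stand-in for stepping rotors."""
    return "".join(
        ALPHABET[(ALPHABET.index(ch) + setting + i) % 26]
        for i, ch in enumerate(plaintext)
    )

def candidate_settings(ciphertext, crib, crib_position):
    """Return every key setting consistent with the crib appearing at crib_position."""
    observed = ciphertext[crib_position:crib_position + len(crib)]
    matches = []
    for setting in range(26):                          # the whole key space of this toy cipher
        # The cipher has already "stepped" crib_position times by the time the crib starts.
        if encipher(crib, setting + crib_position) == observed:
            matches.append(setting)
    return matches

# The codebreakers guess that "WETTER" (German for weather) appears at position 6.
intercepted = encipher("ATTACKWETTERREPORT", 17)       # the enemy's secret setting is 17
print(candidate_settings(intercepted, "WETTER", 6))    # -> [17]
```

The settings that survive then go back to the people, which is the "amplifying the guesses" loop George describes; the Bombe did this electromechanically, across a vastly larger key space.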
Dozens of books have been written about that.
The adventure, the sort of day-by-day of it: the Germans were changing the codes every day, and the British had 24 hours to try and crack it before the code was changed. And what the movie got wrong, or didn't really portray, was that this was an army of people. It was enormous numbers of people. I mean, they sort of placed Turing into their pigeonhole of the lone, non-social genius working alone. But everybody loved Alan Turing. He had a great sense of humor. He worked well with people. The women all loved him, and most of the work was being done by women.
And it was a huge social group. It was like Los Alamos. That's why, for all the people from both Los Alamos and Bletchley, it was the happiest time of their lives, because they were working in this very cohesive group. And nobody turned against Alan Turing until much, much later.
[00:11:12] Speaker A: How do we get from there - does the Colossus machine actually connect to the efforts to make the first, I'm calling it non-military, computer?
Of course there were military ties, pretty strong military ties, and maybe we can get to that in a moment. But where does it go from the Colossus?
[00:11:35] Speaker D: Yes, the Colossus is strongly connected, but in a very unfortunately indirect way.
Colossus - the first prototype of it was immediately successful and started cracking these more difficult codes. So by the end of the war, I think there were ten; it was replicated ten times. So it was equivalent to what you would now call a server farm, these ten Colossi. By the end of the war, it was a pretty massive operation and had broken ground on much of the technology that would be needed to do real computers. I mean, it was very high speed gating and switching and using vacuum tubes for storing bits, things we take absolutely for granted today. And the tragedy was the same as with Turing himself: decisions were made at the very highest levels that all this was secret and it would stay secret, and the Colossus machines were destroyed, and no one was ever allowed to talk about it. So it was silenced. You could make an argument that, well, if there was going to be another war in ten years, it would be important not to have revealed how these machines were built and how the codes were cracked. But there was no cost analysis done of what it would do to the British economy if we opened this technology up.
This is effectively why the computer industry developed more freely in America than in England. You can actually find in von Neumann's letters that many of the people who worked on Colossus - people like Andrew Booth and so on - would come from England and say, well, I can't really tell you exactly how to do this, but here is sort of how you might do it. And it's clear they're talking about stuff that was done with vacuum tubes during the war, and that sort of knowledge is reappearing. But it's not like at the end of Los Alamos, where there was this document called the Smyth Report that really explained what could be explained in a non-classified way about atomic weapons. And I think that was a very good thing. The same should have been done for the Colossus project: to say, this is what we learned about electronics, and not just the hardware side, but the probabilistic logic and what now is an enormous field of sort of Bayesian network theory, which has driven a lot of the real advances in AI and so on. That was developed by Turing and Jack Good and those people during the war, but they later could only publish very academically cleansed versions of what had been done.
[00:14:23] Speaker A: Alan Turing envisioned both computers and artificial intelligence just kind of off the top of his head.
That's pretty amazing.
There's the Turing machine. There's the Turing test, right? What's the Turing test?
[00:14:42] Speaker D: The problem is when something gets named after you, it's usually very different. It just gets shifted. Like, Darwin was not really a Darwinist, and von Neumann didn't believe in von Neumann machines. And so the Turing test is taken as being a test for artificial intelligence: that if you can carry on an English-language conversation with the machine and you cannot tell whether you're speaking with a machine or a human being, that machine has passed the test. But I think Alan Turing himself - I mean, this was a sort of offhand reference in one of his papers. He wouldn't have stuck with it. But when something gets named, like a disease or something, then you're stuck with it forever.
And I think Turing would be sort of disappointed that something so trivial was named after him. He did all these deep things.
[00:15:34] Speaker A: We've talked a little bit about von Neumann. Who was von Neumann?
[00:15:38] Speaker D: Well, von Neumann is another just enormous character who has become larger than life.
Credit is sort of like a black hole. And once you become of the stature of von Neumann, things that just got near you, you get credit for, because people think, oh, that's so brilliant. Von Neumann must have done it. And it's not always to John von Neumann. Yeah. So to be very clear to your critics who will complain, I mean, von Neumann got credit for a lot of other people's work, but he did enough of his own work that he certainly deserves his own credit, but he was very much the sort of orchestra conductor type who could get other people to. Just by waving a little finger, he could get somebody to do something that would turn out. He could see how this would be important in the end. And he also had the. Because his father was a wealthy banker in Budapest, he grew up with that sense of wealth and power that he was attracted to it. So he had the ability to get sort of unlimited funding from the governments, and everybody instantly believed whatever he said was next would be important. And that was, in the computing world, that was extremely important, because all these ideas were floating around. But he was the guy who kind of put them all together and made this project in Princeton happen. But he himself was always very explicit that this was fundamentally Turing's idea, that they were realizing this sort of vision of Alan Turing's, but doing it in a concrete way.
[00:17:12] Speaker A: Once von Neumann was getting started with this - there was already a computer at this point, but it wasn't a Turing machine.
[00:17:24] Speaker D: Well, again, that's the subject of endless debate. I mean, on the British side, Colossus was absolutely an electronic digital computer. That's what it was. But they had an immediate problem of breaking these codes. And certainly there were people working on it who could easily and willingly have turned it into something different, but sort of progress was stopped. And on the American side, we had a different problem, which was the army. We were producing enormous numbers of artillery weapons that at that time had to be calibrated with firing tables, which took an enormous amount of calculation.
So the army, at the Aberdeen Proving Ground - whose business was developing weapons - also developed a very groundbreaking electronic computer called the ENIAC, the Electronic Numerical Integrator and Computer. It was actually developed at the Moore School in Philadelphia, also from very early in the war, and was grinding away doing these day-to-day ballistics calculations. And of course, von Neumann, being so involved in everything, knew about this. And he had a different problem at Los Alamos, the problem of calculating implosions and neutrons, another deep mathematical problem that he needed a computer for.
So he jumped on the ENIAC: how could we use this for our Los Alamos problems?
[00:18:56] Speaker A: Just to be clear, was this before or after Nagasaki?
[00:19:01] Speaker D: This was before.
[00:19:03] Speaker A: So this was actually in the effort to develop the atomic bomb?
[00:19:07] Speaker D: Well, the interesting thing was that they already had the atomic bomb, but they were already thinking about the next thing.
They were actually thinking about hydrogen bombs, about how to use fission bombs to create a secondary explosion. And there the sort of numerical hydrodynamics became just beyond what could be done by hand.
And again, all that history is very convoluted. There's sort of a third generation of books being written now that are really resolving that in a clear way, because the good thing about secret projects is that the documents get saved. It's very hard to throw secret stuff out. It may get lost, but it often doesn't get destroyed. And so the whole thing was very much a mixed-up genealogy. There was no clear "Turing invented this, von Neumann invented that." There was constant sort of hybrid cross-pollination of ideas. People from Los Alamos going to England, people from England coming over - of course, there was a whole British contingent at Los Alamos.
And computing was threaded all through it. I mean, everything they did required it. Of course, they were using punch card machines mainly. But once the existence of the ENIAC became known, it was irresistible to von Neumann to sort of turn this machine into something else, to use it as a programmable computer. Now, the people who built and invented the ENIAC, they have a very strong, clear case that they had thought of that already. And given the time and the resources that von Neumann had, they would have done that, too. But it happened to be von Neumann who sort of had the authority to come in and say, you know, we're taking over this machine and going to use it in a different way.
[00:20:47] Speaker A: And who had all the money to give von Neumann the resources to realize this?
[00:20:53] Speaker D: Things were so different in those days. It was primarily, of course, the army that held the big budget, and the machine was built by the army. But the Navy had the Office of Naval Research, and they were the groundbreaking leaders of everything - the ONR, which actually transitioned directly into what we now call the National Science Foundation. The ONR did so much of the looking at deep scientific questions that actually did have military interest. Like, how would you predict the waves if you were going to try and invade Europe? It really depends on what the waves are like the day that you invade, and how do you pick the day that is least likely to have high waves? Of course, physicists and oceanographers love to work on those kinds of problems, but it was ONR who hired those people. And ONR was actually led by a woman, Mina Rees, and she could just snap her fingers. If the word came from the ONR that the ENIAC was needed for two weeks for von Neumann to run this problem, that's what would happen. You know, we're living in a very interesting time where we are right now losing the very last of the people who were there during World War II, at Bletchley Park or at Los Alamos. There's almost none left, and I was lucky enough to come in right when I could still talk to people who had been there. Wow. One of the people I interviewed told me about that.
I asked that question: how did you get the budget for this huge computing project at the Institute in Princeton, where, you know, the budget was six times what the normal budget for mathematics in a whole year was? And they said, oh, we just had a meeting up in the boardroom. They had a card table up in the boardroom, and somebody from the Army would be there, and somebody from ONR would be there, and it was like a poker game. The guy from the Army would say, well, I'll put in X. They'd say, well, I'm good for two X. And somebody else would say, well, okay, I'll put in ten. They had their budget, like, in ten minutes. And that was it.
[00:22:52] Speaker A: Wow.
You know, when we were making the video for this song, we went to the Spygate Museum in New York, and I want to take a moment to give a shout-out to a really interesting place in New York City, where we actually saw the Enigma machine - examples of it in the museum. It was really fun.
[00:23:19] Speaker C: I was blown away by the presentation, the technology, even just the little badge that you would scan to put your stuff in the locker. It all fit together. They had these games for anyone who wanted to do a little more and kind of try out their spycraft skills. So something for the whole family, and well worth the money that you would pay.
There are not a lot of high-dollar things that I would say are absolutely worth it, but the Spycraft Museum has it all. So if you're in New York for any reason, please venture out to the Spycraft Museum.
[00:24:02] Speaker A: Spygate.
[00:24:03] Speaker C: Oh, Spygate Museum. Look, I really know.
[00:24:09] Speaker A: They didn't pay us to say this, so we could say anything we want.
[00:24:13] Speaker D: Yeah.
[00:24:13] Speaker C: So go to the Spygate Museum, where you can get your spycraft on.
[00:24:17] Speaker A: This is basically a thank you note. Thank you to the folks at the Spygate for being so kind to let us in.
[00:24:24] Speaker C: Well, and our apologies for this cello player who didn't know the name of the museum after all.
[00:24:29] Speaker A: Sorry, Spygate.
[00:24:30] Speaker C: Yeah, sorry. I knew the name at one point. And as songwriters, we mutate things through time.
[00:24:35] Speaker A: But check out the video. It's really cool.
[00:24:39] Speaker C: Back to our.
[00:24:40] Speaker A: Back to the interview.
[00:24:44] Speaker C: What I'm kind of getting the sense of is that in your writing, you approach kind of the story that's not told. The reason I'm thinking about this is I was looking on Amazon at your book, and it was interesting to see one of the reviews saying that it was, you know, well researched, but they were expecting the story of Alan Turing and World War II, and they in fact got a different story than they expected. And as a songwriter, you want to surprise people. You want to say something that someone hasn't heard. So in all your science-based writing and historical writing, what are some of the surprises you found, maybe with Turing's Cathedral, that you hadn't known? That was probably the highlight of your research, because it must be kind of fun to go through everything and discover these things that you're going to help us as readers understand.
[00:25:37] Speaker D: Yes, it's like field archaeology. You're just digging around and looking for fossils, and suddenly there's the thigh bone of a Tyrannosaurus right in the middle of a field of mud. But what you said about albums is very true. That's a very good analogy, because if you have an album, you have one song on that album, and the album maybe is titled after that song, but that doesn't mean the whole album is twelve versions of that song. So, yeah, that was interesting - how a book has to have a title. And of course, to me, Turing's Cathedral has levels of meaning: the whole field of computing is like a cathedral where everybody works on one little piece. There's no architect of the whole thing, but the foundation of it was Turing's. But the things I found out - yeah, it was always discovering people. I always had an interest in the lower-level people, the secretaries and the engineers: how did you actually get this thing done? At the IAS in Princeton, there was no machine shop or anything like that. How did you actually go in and do that? Memorable things I discovered - probably the most extraordinary was that the first algorithms that were run on the ENIAC were a code called Monte Carlo, which was another just brilliant innovation that didn't come from von Neumann; it came from Stan Ulam, who actually had this idea while he had viral encephalitis in his brain and was lying in the hospital, dying, and the doctor told him to stop thinking. So he figured, okay, I'll do something mindless, I'll play solitaire. And while playing solitaire, he invented this absolutely new, pioneering way of sort of statistically computing these intractable problems. And it was named Monte Carlo after the casino. And then I became very interested in - really, I think, to me - the character who brings Turing's Cathedral to life: Klari von Neumann. That's von Neumann's second wife, Klari, and she just somehow fell in love with Monte Carlo. And through, again, just accidents of history, she turned out to be very good at it. So she actually ran a lot of those early codes; those Monte Carlo codes were run by Klari. And then her stepdaughter, Marina, invited me to Michigan and said, well, there's this one filing cabinet in the basement next to the water heater, and it didn't go to the Library of Congress, but you should come look at it. It was all the letters between Klari and John von Neumann, primarily love letters, the whole history of their relationship. They both were married to other people when they sort of fell in love, but they had known each other back in Hungary in the good old days, before the war. And then reading those letters - Klari's journals are in there, too. It turns out she had known Johnny in Hungary. Of course, he was already a famous, brilliant young mathematician, and she was a national champion figure skater at 14. So their families had known each other. And then her second husband was an addicted gambler, a really pathological gambler. They were in Monte Carlo and he was gambling all their money away, and she was drinking at the bar, and Johnny von Neumann was there and had run out of money because he thought he had a system.
He was wealthy, but not really sort of gambling wealthy, and he thought he could beat the house, and he had lost, and he was out of money, and he met Klari in the bar in Monte Carlo. That's how they met. So if you find something like that - you could imagine things for a million years, and it would still seem unimaginably coincidental: the Monte Carlo code is really the code that changed the world, and the two people who would make it happen ended up meeting in Monte Carlo. And then there's this extraordinary sequence of love letters back and forth, where she writes to him in a sealed envelope inside another envelope to his secretary, saying, please personally give this to Dr. Professor von Neumann. Those letters went back and forth, and those letters were all saved; they are in the library now. Why has that not... well, actually, it is. There's a novel coming out in October that has bits of that in it, by Labatut, L-A-B-A-T-U-T. It'll be out in October - in fact, I remember the date it's coming out, October 23. Maniac is the title.
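For listeners who want a feel for what a Monte Carlo calculation actually is, here is a minimal sketch in Python. The real Los Alamos codes followed random neutron histories through bomb geometries; the estimate-pi-by-random-darts example below is just the standard textbook illustration of the same idea (our example, not a problem Klari von Neumann ran): when a quantity is too hard to compute directly, sample at random and average.

```python
# Minimal Monte Carlo sketch: estimate pi by random sampling (textbook example,
# standing in for the far harder neutron problems run at Los Alamos).

import random

def estimate_pi(num_samples: int, seed: int = 0) -> float:
    """Throw random points into the unit square; the fraction that lands inside
    the quarter circle of radius 1 approaches pi/4 as the sample count grows."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(num_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / num_samples

for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10,} samples -> pi is roughly {estimate_pi(n):.4f}")

# The error shrinks roughly like 1/sqrt(n): more samples, better answer,
# which is exactly why the method cried out for a machine like the ENIAC.
```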
[00:29:55] Speaker A: That was waiting to happen.
[00:29:57] Speaker C: I think Aaron sent you the song.
[00:30:00] Speaker D: Yes, I looked at it.
[00:30:02] Speaker C: So the song really pulls at your heart. It builds and builds, and then it just breaks your heart, because - oh, my. And so, for anyone who doesn't know that aspect of it - when you listened to it, what was your impression or take, as one who knows a lot about it already? How did it affect you?
[00:30:22] Speaker D: Yeah, well, I just took it for sort of what it is. I mean, this is a tragic ballad of the classic non-compassion of the bureaucracy versus the compassion of human beings. You're not going to find any one person who set out to destroy Alan Turing, but just the combined forces of the law and the systems and stuff. And he survived that for so long. He was very good at it - you sort of have to be. It's like a kitten that grows up having to survive people kicking it and stuff. I mean, you get good at being resilient, but at a certain point, it was too much. And then I think the other tragedy - but that's sort of a little bit under the surface of your song - to me, it's an equal tragedy how, when Turing's ideas made a whole lot of people a whole lot of money, then, of course, everybody wanted to take credit for it.
[00:31:14] Speaker C: Yeah.
[00:31:16] Speaker D: Now he's on the 50 pound banknote.
[00:31:19] Speaker A: Too late.
[00:31:20] Speaker D: Yeah, too late. Exactly. Okay.
Why couldn't you have done that a little sooner?
[00:31:25] Speaker A: And that is the tragedy that leads to an interesting dynamic. That was it - the ENIAC at the University of Pennsylvania, they patented that, right?
[00:31:37] Speaker D: Yes. That became the largest lawsuit in history.
[00:31:42] Speaker A: I didn't know.
I mean, that kind of prevented it from being a jumping off point, in a way, for others to copy it. Whereas the one made in Princeton at the Institute for Advanced Study, I think that was open source, right?
[00:31:58] Speaker D: Yeah. That's very complicated. I mean, actually, that's where von Neumann was, I think, rather unscrupulous and wrong, because he promised all his engineers that they would get patents, you know, because these guys gave up big careers to come work for him. And the original promise was that they would take out patents, and that never happened. And there are complicated reasons for that, but there are some reasonably unethical footnotes, where he was actually being paid as a consultant by IBM. So all the ideas were going to IBM. So there was some bitterness. I mean, everyone's very polite about it, and the engineers who wanted to, of course, all later got jobs at IBM. But in a way, the ENIAC people - their company was Univac - and von Neumann, he always gravitated to the centers of power, and he was very much on the IBM side. And there's sort of some unfavorable history there that people are still bitter about. But the interesting thing about the patent problems was that, as a historian, when everything's going well, nobody records anything, and it's lost to history. Like, very few people write about a happy day at Los Alamos or something. But when something goes wrong, then it's all well documented. So this huge lawsuit over the ENIAC patent was just a gift to history, because there were depositions from all the people who had any involvement in computing at that time.
I can't remember exactly what it was - something like 160,000,000 pages of evidence in that trial. And that became the core of this large collection at the University of Minnesota that became the Charles Babbage Institute.
Even people from Britain were brought in to testify as to what had happened. So in that sense, it was a great thing. They had that dispute.
[00:33:41] Speaker A: So what was it about the computer that von Neumann made in Princeton that made it the template for the computers that we're talking on right now?
[00:33:50] Speaker D: Well, that's what the whole of Turing's Cathedral is sort of about - why do I believe that? Because everybody takes it for granted. We live in this digital universe, where everything is sort of in this numerical matrix. And now it's enormous, right? I mean, we're doing audio over an Internet, billions and billions of bits per minute going back and forth. But every single bit still has a numerical address in this mathematically defined space, which at the beginning was just an idea in the domain of pure logicians. Imagine that you have a chessboard - what can you do on this chessboard? And now we live in it. But it still is fundamentally a two-dimensional address matrix. And I became obsessed with the question of where that actually began. Sort of like with life itself: what's the last common ancestor? You may have different branches, but some branches die out. And if you look at how it all really goes back, it does go back to that machine at the Institute that had a 32 x 32 x 40 bit matrix of memory. So that's what we now would call about 5,000 bytes - 40,960 bits. So it's like a tenth of a second of bad MP3 audio.
But in that little matrix, they ran these Monte Carlo codes, and they did all these things. They did all these crazy things with it. And of course, it was so successful, it was immediately copied. Even before it was finished, there were copies of it that were finished sooner, because the engineer, Julian Bigelow, von Neumann's engineer, was trying to get it perfect, and other people were saying, I've got a problem, I just want to get my problem done. And they got their machines finished first. So it exploded from there. And I believe that almost every computer today can be traced back to that original 5 KB. It is also so interesting to be living in a time where it's like being present at the origins of life or something like that, actually seeing the beginning of something, the people working on it. They all believed it would change the world. They were true believers. We've lived long enough to see that that was true. Quite an amazing thing. And, of course, it goes back to Turing - the proof. You always need documents as proof. Historians argued for a very long time; you can make an argument that the people at the Institute didn't care about Turing, that they didn't need Turing to build that machine. But if you go to the library - you can actually walk over there, go to the Institute library - down there in the lower stacks, they have all the journals that people never look at anymore, and there are the journals of the London Mathematical Society, and they're all in very crisp, untouched green covers, except there's one volume, and it's the volume from 1936, which is where Turing's paper on the universal machine was published. If you take out that volume, the binding is completely disintegrated. And it's clear that people had read that paper like 500 times, because we didn't have PDFs or scanners.
Engineers were going to the library and looking at that paper. And to me, that's the proof. I mean, obviously those guys at the Institute read that paper. And Bigelow - several of the engineers - told me, yeah, the second day I was at work, they told me to go read that paper.
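A quick back-of-the-envelope check of the numbers George quotes (our arithmetic, not a passage from the book): the IAS machine's memory was 40 planes of 32 x 32 bits, one bit of each 40-bit word per plane.

```python
# Back-of-the-envelope arithmetic for the IAS machine's memory as George describes it.

bits  = 32 * 32 * 40        # 40,960 bits in total
words = 32 * 32             # 1,024 words, each 40 bits wide
kib   = bits / 8 / 1024     # exactly 5 KiB -- the "about 5,000 bytes"

print(bits, words, kib)     # 40960 1024 5.0

# At a 320 kbit/s MP3 bitrate, 40,960 bits is roughly 0.13 seconds of audio --
# in the ballpark of the "tenth of a second of bad MP3" comparison.
```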
[00:37:21] Speaker A: And, I mean, a lot of this stuff is in that paper, On Computable Numbers. I've tried to read it, and it clearly was revolutionary, but I just couldn't understand it.
[00:37:34] Speaker D: Yeah, my dad said he read it when it came out, because he was just a pure mathematician at that time. And he said, I thought it was a brilliant piece of mathematical work, but I didn't expect it would ever have any effect on the real world.
[00:37:48] Speaker A: Tell us about your father.
[00:37:50] Speaker D: He sort of dodged that bullet, and it's interesting. I have all his letters now. He was younger, but he was a brilliant math student at Cambridge during the war, and they actually tried to send him to Bletchley. At that time there was a guy, also famous now, C. P. Snow, who was in charge of sort of finding positions for scientists, bringing them into the war effort. In World War I, the policy was everybody's equal: you can be a mathematical genius, but we're going to send you to fire mortars at the front like everybody else. And that was very stupid; they realized that they lost a whole generation of scientists. So in World War II, they tried to put the scientists into positions where they could help. And my father didn't want to go to Bletchley. I think it was a little late - I think it would have been different if it had been at the beginning. For him, it was 1943 before he was old enough. They gave him 18 months of college, and then it was off to work in the war, and that was all the college he had. But he didn't want to go to Bletchley because he wanted to be more involved. So he went to work for the RAF in what we now would call operations research. I think by then, Bletchley was just too big - it was like 10,000 people - and he felt he would just be a minor part of this machine.
[00:39:06] Speaker A: This is Freeman Dyson. He ended up being an important...
[00:39:12] Speaker D: Yeah, he ended up a physicist rather than a mathematician, who, at the end of the war, decided to come to America and go into that. He could be more useful to physics.
He tried to solve a deep mathematical problem and failed. It was his test case. If I can solve this problem, I will be a mathematician. If I can't solve it, I better just go be a physicist.
[00:39:34] Speaker A: That's your original connection to the Institute?
[00:39:37] Speaker D: Yes. So Oppenheimer, who was the director, just by coincidence, invited my mother and father at the same time. They happened to be brought together there in the fall of 1948. My mother actually had a PhD. Freeman did not. And that's where they met.
So it's directly thanks to Oppenheimer that I ended up there.
[00:40:06] Speaker A: It's interesting - I know that we're running out of time here, but it's really interesting to me how much of the origins of the computers that we work on today have their roots in the military. You just wouldn't know it looking at the MacBook Pro that I'm talking to you on, or the iPhone in my pocket, or even the smart television downstairs. But are there some ethical questions kind of bound up in some of the devices we take for granted today?
[00:40:39] Speaker D: Yes. I mean, there are deep ethical questions, and I've always put it as sort of a thought experiment. The classic view is that the Los Alamos bomb project was a deal with the devil that the scientists got. For the scientists, it was absolutely a dream. You could do all the science you want, you could work on whatever, you didn't have to teach, you didn't have to grade papers.
We don't care what kind of degree you have. You can just do pure science. But there's a catch, and that's the deal with the devil: you've got to build this bomb. And the assumption has always been that the bomb was the work of the devil and that the devil wanted the bomb. And I've always said, well, you can't be so sure, because they didn't just build the bomb, they built computers. The modern computer was as much a product of Los Alamos as the bomb was.
And we haven't had a thermonuclear war. The thermonuclear war never happened. If the devil had wanted the bomb, wouldn't the devil have used it? But the world has been absolutely taken over by computers, so you've got to be careful - maybe the deal was that the devil wanted the computers. And that, I believe, is the responsibility of people who work in technology: keep in the back of your mind that this absolutely amazing thing that changed the world and brought us the Mac and everything that we love can also bring us the work of the devil. Just keep that in mind and watch out for it.
[00:42:15] Speaker A: Michael, I think that's our next song.
[00:42:18] Speaker C: You never know. We write every February especially - we go into album-writing month.
A lot of that makes the next album.
[00:42:28] Speaker A: How do you feel Turing's legacy has been?
How has it changed in the last 10 or 20 years? And do you feel like Turing's Cathedral was truly realized?
[00:42:43] Speaker D: Well, yes, in the sense that this digital universe that he imagined, and the powers of digital computing, were of course completely realized. But unrealized in the sense that we've remained stuck in that one vision. It's like the world is still running on his undergraduate thesis. For instance, when he came to Princeton, he came to work on something quite different: sort of non-predictable, non-logical machines. He called them oracle machines. He already, I think, was bored with this purely predictable digital computing - that would never be the way to true intelligence. And so if Turing were alive, I think he'd be running around saying, you guys are nuts. I mean, to think that you're going to get real AI out of these defined formal codes. It's going to come from somewhere else. And in that sense, we haven't at all got there yet. We're still stuck, absolutely hypnotized by the power that this one certain species of machine had, while sort of disregarding all the rest, and the way that the world of nature works, which is what was of interest to Alan at the end of his life. And it's such a tragedy that he died so young, before he had a chance to see any of that start to happen.
[00:43:58] Speaker A: Did you ever hear about his reaction to the computers that were created during his lifetime?
[00:44:06] Speaker D: I mean, he was deeply involved.
You know, he ended up at Manchester, which was very much a center of actually doing things.
It was sort of the MIT of Britain, and he worked on some of the early commercial computers, the Ferranti Mark I and such. He was involved in design, so he was definitely a very hands-on guy. And it's just a tragedy he couldn't have lived long enough. Like when Danny Hillis started one of the really far-thinking early American ventures - let's actually build something different in hardware; the company was called Thinking Machines - one of the first things he did was hire Richard Feynman. He just called him up: will you come work for us? And Feynman said, sure, if you actually give me a job, I'll come work. And if Alan Turing could have remained alive to be hired at a place like that, the world would be very different.
[00:45:01] Speaker A: Anything else you'd like to talk about before we wrap up?
[00:45:03] Speaker D: No. Thanks for having me. And I have to thank all the teachers in Princeton who suffered me, as I was a very difficult student, very unappreciative of a lot of the help people tried to give me. And it was a very interesting place to grow up. And my hero was Julian Bigelow - I remember going to his house, and he had an airplane engine taken apart in his living room.
And there's that other side of it: Princeton gets too much credit as this great ivory tower of academic, theoretical work, and not enough credit for all the hands-on engineers who make that stuff actually possible.
[00:45:46] Speaker A: Boy. Well, next time you're in town, I hope you'll stop by the office.
We certainly have enough people doing some great things today. George Dyson, thank you for helping us learn a little bit more about Alan Turing and Johnny von Neumann and speaking to us from your home in Washington state.
[00:46:06] Speaker D: Thank you.
[00:46:08] Speaker C: Thank you for having us.
[00:46:10] Speaker D: Thank you.
[00:46:10] Speaker A: It's been very illuminating. I appreciate it.
[00:46:22] Speaker C: Well, that was a lot of fun.
I still feel bad about the Spycraft Spygate incident.
[00:46:28] Speaker A: Well, it'll be... we'll call it a scandal, and we'll call it Spygate.
[00:46:33] Speaker C: That's right.
Spygate.
Hey, you know, you used to be near that area when you were in school.
[00:46:40] Speaker A: That's right. I was in Washington, DC, near the Watergate.
Oh, boy. That goes in a completely different direction.
This was a lovely episode, if I say so myself.
[00:46:53] Speaker C: And I hope you learned a little bit - that we dove and delved deeper than the movie about Alan Turing, or anything you may have read. I hope we learned something more, because I sure did. So, until then, we want to play you the song.
[00:47:09] Speaker A: Here's the song.
[00:47:23] Speaker B: You had vision, you had smarts, but we were careless with your heart.
You cracked the code, you had the dream.
The answers lay in your machine.
The algorithm you'd feed in, a device that could do the work of a thousand men.
Oh, the thanks that you were due.
But instead we tried to program you and we made you pay the price for simply living out your life.
A nation ought to rise and fall together.
But maybe now I understand the many ways to be a man.
We're sorry, Alan. You deserved much better.
The Germans sent their messages at sea, scrambled by the Enigma machine.
They rained their vengeance on the Royal Marines.
Till you found their boats' positions out at sea.
With a program you designed, you won a world war with your mind. Your device decoded every letter.
You were the hero of the day. Till we found out you were gay.
We're sorry, Alan. You deserved much better.
Sake pills you took to bed.
That's where we found that poisoned apple by your head.
We made sure that you were held apart.
But you let us know you'd never change your heart.
Now as I reach out across time on a computer you designed, I know your contributions we could never measure.
I can't make it okay.
[00:50:38] Speaker B: All I can do is say your name.
I'm sorry, Alan. You deserved much better.
I'm sorry, Alan. You deserved much better.
[00:51:00] Speaker C: That's cool.
I hope you had your tissues out, because I had mine. And really, it just breaks my heart every time we hear the song. It doesn't fail to break my heart. And that apology, when we sing it in harmony - it's absolutely genuine.
[00:51:20] Speaker A: It's connecting with audiences, and that's the best thing.
Word of wisdom.
[00:51:27] Speaker C: Word of wisdom.
[00:51:29] Speaker D: Do you have one algorithm.
[00:51:33] Speaker C: Number.
[00:51:37] Speaker A: You've been listening to the Nathans and Roncast.
[00:51:40] Speaker C: Brought to you by Michael Roncat and.
[00:51:45] Speaker A: Aaron M. Nathans, Bachelor of...
[00:51:49] Speaker C: That's. That's my favorite outro. Ever.
[00:51:52] Speaker D: Okay.
[00:51:52] Speaker C: Have a good day, everyone.
[00:51:53] Speaker A: Peace.
[00:52:08] Speaker C: Itself to rest.