
Alan Turing in three words

by Michael Saler

Before the Second World War, computers wore clothes and did their work with ink-stained fingers. The term referred to human beings engaged in mathematical calculations, who were slow and at times erroneous. It was the war that introduced the first modern electronic computing machines. They were immensely faster and more accurate than their soon-to-be-redundant human counterparts, and facilitated the invention of nuclear weapons. The decades following the war have been dominated by these two technologies. It is tempting to think of them in apocalyptic terms: the bomb threatens to end human existence, and the artificial intelligence of computers threatens to challenge or even change human nature. Recent histories charting the intertwined origins of the nuclear age and the “digital universe” provoke the queasy feeling that our species is positioned precariously between atomic night and transhuman dawn. Ironically – and reassuringly – the principal instigators of this new era, such as Alan Turing and John von Neumann, showed themselves to be human, all too human, their fallibilities and resiliencies restoring a more grounded perspective about our future. The triumph of a Dr Strangelove or a HAL 9000 remains a possibility, but either scenario pales before the lived reality of their flesh-and-blood progenitors. As the following histories suggest, truth is often stranger than science fiction.

Alan Turing, 1946

B. Jack Copeland’s new biography of the father of modern computing opens with an Alan Turing Test: “Three words to sum up Alan Turing?”. It’s a challenging question for such a multifaceted man. No doubt Watson, the polymath computer that won at Jeopardy! in 2011, could generate apt terms, but so, too, could more ordinary folk, especially given the wide publicity the English mathematician has received during this centenary year of his birth. Yet Turing’s justified fame was unthinkable a little more than a generation ago. The man who helped defeat the Nazis and create the digital age was known to mathematicians and computer scientists, but to few others. His crucial contributions to decrypting German codes during the war remained classified for decades after his death in 1954. And his signal concept of an all-purpose, stored-program computer – the model for our digital devices today – was often attributed to others, from Charles Babbage in the nineteenth century to John von Neumann in the twentieth. (With his unfinished “Analytical Engine”, Babbage was on the right track, but never posited the critical idea of storing programs in memory, enabling a single machine to execute multiple tasks – in effect becoming the “universal” machine residing on our desks and chirping in our pockets. Turing made this breakthrough, which in turn inspired von Neumann’s general architecture for electronic computers that became the industry standard.) Until his classified war work became public knowledge and the genealogy of modern computing was sorted out, Turing would not have been associated with Newton, Darwin or Einstein – a comparison drawn by Barack Obama in an address to the Houses of Parliament in 2011 – or considered among the “leading figures in the Allied victory over Hitler”, as Copeland counts him here. Picasso would have been an unlikely comparison also, but like the restless artist Turing was a fertile innovator, leaving new fields to sprout from his seedlings (computer science, artificial intelligence, mathematical biology), and making pioneering contributions to others (logic, cryptography, statistics).

Nor would the more unfortunate parallel have been made between Turing and another courageous scientist, Galileo. Both were persecuted for acting on beliefs anathema to the reigning social institutions of the times. Turing’s sexual relationship with another man, which he unapologetically acknowledged in 1952, was illegal under the 1885 Act against “gross indecency”, which was not repealed until 1967. Found guilty, he faced a draconian choice: either prison, which would have ended his career, or enforced chemical castration. He chose the latter, a year’s worth of oestrogen treatment, which affected his body and perhaps his mind. Even after he had paid his penalty, Turing had good reason to believe he was under police surveillance as a security risk, which dramatically constricted his personal life. In June 1954, he was found dead, an apparent suicide, just shy of his forty-second birthday. (There has been a recent drive to grant Turing a pardon, although a successful effort may take some time: Galileo did not receive absolution from the Roman Catholic Church until 1992.) Turing’s rehabilitation from over a quarter-century’s embarrassed silence was largely the result of Andrew Hodges’s superb biography, Alan Turing: The enigma (1983; reissued with a new introduction in 2012). Hodges examined available primary sources and interviewed surviving witnesses to elucidate Turing’s multiple dimensions. A mathematician himself, Hodges explained Turing’s intellectual accomplishments with insight, and situated them within their wider historical contexts. He also empathetically explored the centrality of Turing’s sexual identity to his thought and life in a persuasive rather than reductive way. Thus he made a convincing case that Turing’s teenage crush on a fellow schoolboy, Christopher Morcom, was an important catalyst for his lifelong preoccupation with the relationship between brain and mind. Morcom’s unexpected death at the age of eighteen was a shattering blow to Turing, who began to reflect on whether his friend’s consciousness might survive after death or whether it was simply a result of complex material processes and expired when life did. Hodges also linked the famous “Turing Test”, in which a computer attempts to pass as an intelligent human being, to Turing’s own dilemma as a gay man in a homophobic world. (Turing called his test the “imitation game”, and Hodges observed, “like any homosexual man, he was living an imitation game, not in the sense of conscious play acting, but by being accepted as a person that he was not”.) Hodges’s book was widely praised. But biographies in the early 1980s were usually less forthcoming about their subjects’ bedroom practices than they are today, and some reviewers were discomfited by Hodges’s insistence that Turing’s homosexuality was intrinsic to a fuller understanding of him. Writing in the New Yorker in 1986, Jeremy Bernstein complained about “Mr. Hodges’ polemical emphasis on Turing’s sexuality . . . The biography would have been an even more powerful statement . . . if it had left more unsaid”. For some, this was still the love that dared not speak its name, although Hodges was hardly writing pornography, let alone erotica, about someone who was not exactly a Lothario. Turing’s romantic longings, occasional partners and shy propositions were simply essential to comprehending his outlook and behaviour, to say nothing of his tragic downfall.

Hodges’s biography was not re-shelved in the closet, however. It was the basis for Hugh Whitemore’s well-received play Breaking the Code (1986), which in turn was filmed by the BBC in 1996. By then, Turing had entered the mainstream. Popular interest in his life had grown enormously, thanks to the declassification of wartime documents, the transformative effects computers were having on everyday life, and a sea change in attitudes towards sexuality. (The next notable biography, David Leavitt’s The Man Who Knew Too Much (2006), is arguably more polemical about Turing’s sexuality than was Hodges’s.) Which brings us back to the Alan Turing Test that opens Copeland’s new biography. If this were a Jeopardy! challenge, surely one of the three words to sum up Turing would be “gay”? Copeland’s answers, however, are “humour”, “isolation” and “courage”. Perhaps he is being coy, because he dangles another chance before us: “Three more words?” These, though, turn out to be “patriotic”, “unconventional” and “genius”. Copeland does not associate any of these terms with Turing’s sexuality. Indeed, Turing’s sexuality is not broached in this biography until it reaches 1938, when the twenty-six-year-old graduate student in mathematics at Princeton briefly plays the tourist: “He was probably surprised by the openness of New York’s gay bars and clubs. In Washington he visited the Senate”.

Minimizing this aspect of Turing’s life constricts Copeland’s understanding of him. In contrast to Hodges and Leavitt, he provides a strangely upbeat narrative. He illustrates Turing’s charming sense of humour, but the darker eddies have evaporated. There are little more than hints of a difficult childhood and adolescence: Copeland notes that because Turing’s father worked in the Indian Civil Service, his parents were often abroad, which consigned Alan to “the life of a near-orphan”. He attended public school at Sherborne, where he became devoted to mathematics (but not, here, to Morcom, who is never mentioned). After he arrived at King’s College, Cambridge in 1931, Turing’s life at times sounds like something out of the Boy’s Own Paper: “He was soon punching above the weight of any normal maths fresher”; his preferred food was “mutton chops and other good plain British fare”. Copeland finds that Turing fitted in well at King’s, but doesn’t explain why; Hodges persuasively argued that Turing’s comfort stemmed from the college’s Bloomsbury-inspired tolerance of homosexuality. Turing’s brilliance was recognized, and he was made a Fellow of King’s at the age of twenty-two. After a brief sojourn at Princeton, where he received his PhD, Turing returned to King’s in 1938 and “for the next sixteen short years . . . his career moved from crescendo to crescendo”.

Copeland must acknowledge the occasional basso profundo moment. He details several of the personality conflicts Turing had with co-workers, but attributes these to his eccentricity, rendering his homosexuality relatively unproblematic. He claims that Turing was not out to his colleagues at Bletchley Park during the war, but Hodges relates a sad incident in 1944 when Turing trustingly revealed his sexuality to a co-worker, who recoiled in disgust. (Hodges also found that Turing had mentioned suicide to a friend in the late 1930s, describing a method involving an apple and electric wiring; Turing had a fascination with the poisoned apple in Disney’s Snow White. These facts are not in Copeland.) Turing’s personal life could be harsh, Copeland admits, particularly when he was convicted of gross indecency and forced to take oestrogen. Yet in his account, Turing remains plucky and “seems to have borne it all cheerfully enough . . . . The whole thing was an episode to be got through”. After the year was up, “he was rid of the organo-therapy, and in the warm sunny spring of 1953 the skies were blue again”.

It is not surprising, then, that Copeland challenges one of the darkest aspects of the Turing legend: “The idea that Turing committed suicide is now deeply entrenched. It is time for a dispassionate assessment of the evidence”. But there is nothing really new in his analysis of the circumstances of Turing’s death by cyanide poisoning, which has puzzled observers from the moment that Turing’s housekeeper found his body tucked in bed, a partially eaten apple on his bedside table. Adjoining his bedroom was a small laboratory, in which a solution containing potassium cyanide had been left brewing in a pan. (He sometimes used the chemical for electroplating.) The inquest determined that he had committed suicide, but Turing’s mother Sara knew him to be sloppy, and always insisted that he had accidentally ingested the cyanide. It is possible that this master of cryptography deliberately coded his suicide so that his mother could read it as an accident. Leavitt speculated that Turing might have been murdered by the British secret service, because of his knowledge of state secrets. The nation was already on high alert against spies – especially Cambridge homosexual spies – following Guy Burgess and Donald Maclean’s defection to the USSR in 1951.

Copeland is right to argue that we may never know if Turing committed suicide. But that’s not the point. What rankles is his claim to be engaged in a “dispassionate assessment of the evidence”, when throughout the book he has chosen to disregard the bountiful evidence, however circumstantial, advanced by others for why Turing might have been very unhappy for a very long time. (In his forty-nine pages of notes, Copeland cites Hodges’s biography three times and ignores Leavitt’s.) By not openly acknowledging these earlier interpretations – if only to refute them – he leaves a reader new to Turing with a distorted perspective that diminishes Turing’s courage as well as his pain.

If Copeland’s biography is not always convincing about the man, it is more compelling about the work. Copeland has spent much of his career studying Turing, mining the archives and interviewing those who knew and collaborated with him. This account is intended for a general audience, and usefully relates Turing’s thought to current developments in computer science. There is some potted history, but Copeland provides lucid overviews of Turing’s contributions to mathematical logic, the decryption of German cyphers during the war, the development of the modern computer, artificial life and artificial intelligence.

Turing’s activities spanned so many domains because he had an unusual capacity to combine abstract reasoning with its practical application. He demonstrated this aptitude at the age of twenty-three when he solved an abstruse problem in mathematical logic, the “Decision Problem” proposed by David Hilbert in 1928. Hilbert wondered whether there was a mechanical, finite procedure to distinguish provable from unprovable mathematical statements within a formal system. Turing published a paper in 1936 describing a hypothetical machine that made precise the kind of mechanical procedure Hilbert had in mind. His hands-on, engineering approach was novel – logicians’ papers are freighted with abstract equations, but not imaginary machines – and the machine itself was as elegant in its simplicity as a haiku. Viewed from our present vantage, the machine has a central processing unit and uses an infinite strip of tape as its memory, holding a program as well as input and output data. It can execute any computable program by manipulating the symbols on the tape; in principle it could run all of our current software. Through an ingenious argument, Turing demonstrated that the machine would be unable to decide the validity of all mathematical statements, thereby answering Hilbert’s question in the negative and making a fundamental contribution to mathematics. And he had also invented a schematic for the universal, stored-program computer. For him this wasn’t a mere thought experiment: he yearned to build what others soon called the “Turing Machine”, but the technologies of speed and memory it required would not be developed until the war.
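The tape-and-table scheme is concrete enough to sketch in a few lines of code. What follows is a minimal simulator in Python: a finite transition table reads and writes symbols on an unbounded tape, exactly the mechanism described above. The example machine, which merely flips bits, is a hypothetical illustration, not Turing’s own 1936 notation.

```python
# A minimal Turing machine simulator. Transitions map
# (state, symbol) -> (symbol to write, head move, next state).
def run_turing_machine(transitions, tape, start_state, halt_state):
    tape = dict(enumerate(tape))        # sparse tape: position -> symbol
    head, state = 0, start_state
    while state != halt_state:
        symbol = tape.get(head, "_")    # "_" stands for a blank cell
        symbol_out, move, state = transitions[(state, symbol)]
        tape[head] = symbol_out
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# A toy machine that flips every bit, halting at the first blank.
flip = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip, "10110", "scan", "halt"))  # -> 01001_
```

The universal machine is simply a table like this one that reads another machine’s table off the tape and executes it – the stored program in embryo.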

Turing’s wartime work in cryptography at Bletchley Park also married theory and practicality. He developed algorithms to narrow the search for solutions to the Germans’ Enigma code and, building on the earlier work of Polish cryptographers in the 1930s, designed the “bombe”, a calculating machine that rapidly tested possible Enigma settings. Copeland shows that by late 1943, Turing’s machines were decrypting 84,000 messages each month; among other crucial results, the U-boat stranglehold on North Atlantic shipping was undone. For their highest-level communications, however, the Germans employed a more complex coding system which required a higher-powered machine. By 1944, Tommy Flowers had designed the Colossus, “the world’s first large-scale electronic digital computer”, to crack it. (Copeland scores many firsts for Turing and Britain in this book.) While effective for its purpose, Colossus lacked the ability to store programs in memory. Copeland relates that Flowers had been shown Turing’s 1936 paper but had not understood it.

But the electronic Colossus showed Turing the way forward to construct his universal machine. In 1945, he designed the “Automatic Computing Engine” (ACE) for the National Physical Laboratory, although bureaucratic delays impeded its construction. The world’s “first electronic universal stored-program computer”, nicknamed “the Baby”, was built in 1948 at the University of Manchester, and Turing moved there to direct the University’s new computer laboratory. “Turing was undoubtedly the first Hacker”, Copeland observes: Turing played with the Baby to explore the chemistry and mathematics of how cells differentiate themselves into structures and patterns. Copeland frames these experiments in “morphogenesis” as “the first appearance of the field now called simply Artificial Life”.
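Turing’s morphogenesis idea is that two chemicals, reacting with each other while diffusing at different rates, can break a uniform tissue into stable patterns. Here is a minimal numerical sketch using the Gray–Scott system, a standard modern reaction-diffusion model in the same family – emphatically not the equations of Turing’s 1952 paper, nor code he ran on the Baby; the parameters are conventional illustrative values.

```python
# One-dimensional Gray-Scott reaction-diffusion: a toy illustration of
# Turing-style pattern formation from a near-uniform starting state.
import numpy as np

n, steps = 200, 10000
du, dv = 0.16, 0.08          # diffusion rates: U spreads faster than V
feed, kill = 0.035, 0.060    # supply of U, removal of V

u = np.ones(n)               # chemical U, initially uniform
v = np.zeros(n)              # chemical V, initially absent
v[n // 2 - 5 : n // 2 + 5] = 0.5   # small central perturbation

def laplacian(x):
    # discrete second derivative on a ring (periodic boundary)
    return np.roll(x, 1) + np.roll(x, -1) - 2 * x

for _ in range(steps):
    uvv = u * v * v                          # the reaction U + 2V -> 3V
    u += du * laplacian(u) - uvv + feed * (1 - u)
    v += dv * laplacian(v) + uvv - (feed + kill) * v

# The uniform start should by now have broken into localized structure.
print(f"v ranges from {v.min():.3f} to {v.max():.3f}")
```

The essential point, which survives the change of model, is Turing’s: nothing in the rules singles out any location, yet structure emerges.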

Turing also pioneered the field of Artificial Intelligence (AI), which hitherto had been the domain of science fiction writers. (Intriguingly, he wrote an unpublished, autobiographical story about a scientist who specialized in “interstellar travel” and enjoyed “such crackpot problems”.) He believed that machines could be taught to learn, which at a minimum involved a combination of logical rules and open-ended searches. “What we want”, he announced in 1947, “is a machine that can learn from experience.” This is what contemporary search engines and other ubiquitous examples of AI do, however imperfectly.

For Turing, computers could be said to be thinking if their behaviour corresponded to our rule-of-thumb understandings of intelligent conduct. Hence his idea of the “imitation game”, in which an individual asks written questions of a human and a machine, neither of which he can see. If after a short interval the machine convinces the interrogator that it is a human, it has demonstrated for all practical purposes that it is “thinking” or “intelligent”. (Computers have yet to pass the Turing Test, but a fair number of human contestants have had the dubious honour of being judged unintelligent machines.) And while Turing is often thought to have equated minds with machines, Copeland suggests that he may have been more open-minded about the issue. Turing was less ambivalent in his conjecture that intelligent machines might eventually surpass humanity: “At some stage . . . we should have to expect the machines to take control”.
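The protocol of the imitation game is simple enough to set out in code. Below is a deliberately toy sketch: an interrogator’s written questions go to two unseen respondents, and a judge must name the machine. The respondents and the judge here are invented stand-ins, present only to make the structure of the test concrete; none of this comes from Copeland or from Turing’s paper.

```python
# A toy rendering of the imitation game's structure. The "machine",
# "human" and judge are hypothetical stand-ins for illustration.
import random

def machine(question):
    return "I would rather not say."               # canned, evasive reply

def human(question):
    return f"Honestly, {question.lower()} is a hard one."

def imitation_game(questions, judge):
    # Hide the two respondents behind the anonymous labels A and B.
    (name_a, reply_a), (name_b, reply_b) = random.sample(
        [("machine", machine), ("human", human)], 2)
    transcript = [(q, reply_a(q), reply_b(q)) for q in questions]
    guess = judge(transcript)                      # judge names "A" or "B"
    actual = "A" if name_a == "machine" else "B"
    return guess == actual                         # True: machine unmasked

def naive_judge(transcript):
    # Suspect whichever respondent only ever repeats itself.
    answers_a = {a for _, a, _ in transcript}
    return "A" if len(answers_a) == 1 else "B"

print(imitation_game(["What is courage?", "Do you dream?"], naive_judge))
```

A machine passes the test only when, over many such rounds, judges do no better than chance – Turing’s behavioural, rule-of-thumb criterion for “thinking”.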

This unsettling scenario might be pre-empted by a more apocalyptic one: nuclear Armageddon. George Dyson’s Turing’s Cathedral reveals how the computing torch was passed after 1945 from Turing (himself an excellent marathon runner) to John von Neumann (who preferred driving fast cars; Dyson observes that he bought “a new one at least once a year, whether he had wrecked the previous one or not”). While Turing may have been a victim of the Cold War, von Neumann was its champion, influencing the policy of Mutually Assured Destruction from his commanding positions at Princeton’s Institute for Advanced Study and the Atomic Energy Commission. Born to a wealthy family in Hungary in 1903, he experienced at first hand the Communist government that briefly came to power after the First World War, and became an ardent foe of Communism. He acknowledged that atomic weaponry was a “monster” but perceived it as the lesser of two evils, advocating a pre-emptive nuclear strike against the USSR. Hoping to create a hydrogen bomb, he needed a stored-program electronic computer, and in 1945 began to construct one at the Institute.

Dyson explores the efforts of von Neumann, his wife Klári, other scientists, and a team of raucous engineers to create the computer in the idyllic yet stuffy environs of Princeton. While at times his prose can be dense and unduly detailed, Dyson vividly evokes the conflicts between the genteel world of scholars and the more freewheeling culture of engineers. Many of the Institute’s Fellows objected initially to the project, because engineering was not a “liberal art”, and the computer merely a tool. (The Institute’s Director also complained that the engineers consumed more than their fair share of sugar at tea.) When they intuited that the computer was being used for top-secret weapons work after its completion in 1951, the Fellows objected even more: the project was ended, and nearly all engineers dismissed, a few years after von Neumann left the Institute in 1954.

Dyson provides a counter-narrative to this linkage of computers and destruction, one that charts the birth of the “digital universe”. He is inspired by the example of Nils Barricelli, who in the early 1950s used the Institute’s computer to see if randomly generated numbers might follow biological processes of evolution, simulating life. Dyson in turn speculates freely throughout about the analogies between biological and technological entities, genetic codes and computer algorithms. Might DNA be likened to a digital program, and might computer programs themselves evolve within our complex ecosystem of information technology and assume virtual life? Given that we currently rely on computers to comprehend and manipulate our own genetic code, Dyson worries that humanity could be in danger of being reconstituted by Turing machines: “Are we using digital computers to . . . better replicate our own genetic code, thereby optimizing human beings, or are digital computers optimizing our genetic code – and our way of thinking – so that we can better assist in replicating them?”
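Barricelli’s actual rules were idiosyncratic, but the spirit of his experiment survives in the modern genetic algorithm: random numbers subjected to heredity, mutation and selection. A toy sketch follows, with an arbitrary target genome standing in for “viability”; the fitness function and parameters are invented for illustration, not drawn from Barricelli or Dyson.

```python
# A toy genetic algorithm in the spirit of (not reproducing) Barricelli's
# numerical-evolution experiments: random bit-strings evolve under
# selection and mutation toward an arbitrary, invented fitness target.
import random

TARGET = [1] * 16  # hypothetical "fit" genome

def fitness(genome):
    # count positions matching the target
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # flip each bit with small probability
    return [1 - g if random.random() < rate else g for g in genome]

# Start from pure randomness, as Barricelli did.
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(50)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:25]                    # fitter half survives
    offspring = [mutate(random.choice(survivors)) for _ in range(25)]
    population = survivors + offspring

print(max(fitness(g) for g in population))         # usually reaches 16
```

Order emerges from randomness with almost embarrassing ease, which is precisely what licenses Dyson’s speculation about codes evolving inside the digital universe.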

Turing’s Cathedral devotes only one chapter to Turing. Yet as the title indicates, the atheist Turing remains central, having engendered not a God in the machine, but machines that are as gods, silent but omnipresent. Dyson testifies to signs and portents that we are already in thrall to this digital universe: “Facebook defines who we are, Amazon defines what we want, and Google defines what we think”.

It is all heady stuff, and Dyson knows it can sound loopy. He recounts how he explained some of his theories to von Neumann’s colleague Edward Teller, who advised Dyson that “instead of explaining this, which would be hard . . . you write a science-fiction book about it”. Undaunted, Dyson wrote his history, and the profoundly human drama of his principal characters tends to overshadow his more fanciful theories. Still, they remain thought-provoking, like the best science fiction. “No genuinely intelligent artificial intelligence would reveal itself to us”, he warns. Perhaps it is hiding in plain sight, wearing clothes.
