
Lecture: World War II and Cold War

War, whether hot or cold, causes rapid advancement in technology. That's a theme in history. Certainly World War II and the Cold War that followed provided both an impetus and a backdrop for technological development. It is also in this era that science and technology began working together most closely, creating our contemporary mindset.

Futurism

Following World War I, Europe and America responded with both introspection and a view toward the future. The introspection was deep, and much of it questioned technology as a human tool. The slaughter of the war had been astonishing, and the boys who did come back were damaged mentally and physically. More than a few people believed that Western Civilization was coming to an end. There was a reaction against the dehumanization of mechanized slaughter, a feeling that technology had gotten beyond people's ability to control it. Philosophically, there were two ways to go after the war.

Logical empiricism related most closely to science, because it saw a lack of scientific thinking as part of a larger social problem. Within the scientific community, the early 20th century had been a time of dislocation and change. The relativity theories created by Albert Einstein are a good example. Before the late 19th century, geometry was based on the work of Euclid and followed his axioms. Mathematicians then began to move away from Euclid's two-dimensional plane, discussing and calculating curved surfaces that violated Euclidean axioms. Einstein's view of space was similarly non-Euclidean and curved. And his theory of relativity meant that there were, in a sense, no constants in the universe - everything moved relative to everything else. This upended the Newtonian physics that every scientist had understood for over two centuries. It was time to build a new science based on rational principles, not just physical observation of the world. The logical empiricists (many of whom, like Einstein, were scientists) were in essence trying to revive Enlightenment principles of reason in the face of a seemingly irrational world. This philosophy saw "metaphysical" evils such as war, racial hatred, and rigid gender roles as manifestations of irrationality that should be defeated with reason.

The other way to go was toward existentialism, which emphasized moral responsibility. Technologies do not create evil - people do. Machine guns were manned; they did not decide to fire themselves. Jean-Paul Sartre, influenced by the German philosopher Heidegger, rejected science as a way of understanding human behavior. Human beings are more than just people situated in a context. They have the responsibility of free will independent of what is happening around them. Our innate freedom implies heavy responsibility for our actions. We choose to be who we are, and to do what we do. Although existentialism did not develop fully until after World War II (with that war's highly questionable acts of moral responsibility - in the death camps, for example), it began in the years right after World War I as a response to it.

Now, we know that some just rejected the introspective response. The United States, which had been involved in the war for only a little over a year, saw its economy rebound immediately, leading to the Roaring Twenties. The economy of France recovered much more slowly, since much of the damage to the land had occurred in France. Germany attempted to become a republic, but the Treaty of Versailles blamed it for the war and demanded reparations. Unable to pay, Germany accepted the French occupation of the Ruhr Valley, a major industrial area, in 1923 as partial reparations. The German economy collapsed without a base of production, and hyperinflation ensued. A dollar, worth about 4 marks before the war, was worth 4.3 trillion marks in 1923. People starved and sold their clothes, jewels and children to survive.

Artists throughout Europe and America responded with a modern vision of the world. Bright colors and geometric shapes were featured in paintings, and geometry was expressed differently in architecture. These "modernists" in general rejected traditional forms and ideas, including Enlightenment rationalism, religion, and Victorian society. They used techniques that drew deliberate attention to the technologies of the art itself. Brush strokes were featured, and works were created out of collages of various materials (like the work of Georges Braque, right). Modernism had begun before the war, at the end of the 19th century, but the responses to the war pushed it forward into a form of social commentary.

Another response to the horror of World War I was fascism. Developed in the early 1920s by Mussolini, fascism was a form of nationalism based on the idea of historical destiny through the state. As Mussolini himself wrote for the Italian Encyclopedia in 1932:

The foundation of Fascism is the conception of the State, its character, its duty, and its aim. Fascism conceives of the State as an absolute, in comparison with which all individuals or groups are relative, only to be conceived of in their relation to the State.

Italy felt itself abased by World War I, ignored at the peace negotiations that redrew boundaries for Europe and the Middle East, and crowded out of the imperialist economies of the late 19th century. Since Germany too was abased, and also would adopt fascism, I tend to see fascism as the politics of those who feel they have been wronged by history. Fascism glorifies war as a tool of state, and despises both democracy and socialism as undermining the true potential of the individual:

Fascism, now and always, believes in holiness and in heroism; that is to say, in actions influenced by no economic motive, direct or indirect.

In its adoption of a metaphysical stance and its placement of responsibility in the state, fascism rejected both philosophical trends that followed the war. Its visual expression was the art of Italian Futurism, which began just before the war and glorified violence, speed and technology. The Futurists' work celebrated industrialization and technology in the same self-conscious way that the modernists celebrated the technology of art itself.

Gino Severini, Plastic Synthesis of the Idea of War, 1915; Benedetta (Benedetta Cappa Marinetti), Lights + Sounds of a Night Train, ca. 1924; Umberto Boccioni, Unique Forms of Continuity in Space, 1913 (cast 1931), bronze

After the Stock Market Crash of 1929, the main European and American investors who had financed the rebuilding of Europe and the vast moneymaking structures of America became isolationist. Investment was pulled back into American companies, and many companies still went under. The technologies of the 1930s were primarily improvements on what had come before: frozen food using the new electric freezers, improvements in radio that allowed most homes to have a set and listen to programs, the long-playing vinyl phonograph record, reliably produced color film, Disney's first animated feature (Snow White) - all were built upon previous technologies. The television was invented in the 1930s, but wouldn't reach economies of scale till after World War II.

The War

You can read about World War II anywhere. The short version is that Mussolini's fascist Italy began expanding in the 1920s, and in Germany the republic failed; the fascist National Socialists (Nazis) came to power in 1933. Different countries dealt with economic depression in different ways. Some, like the United States, Britain and France, adopted socialized methods to deal with the crisis, using deficit spending to keep people employed and create necessary public works. But Germany and Italy, and the newly imperialist Japan, chose expansion to create economic growth. The Nazi leadership managed to combine pseudo-scientific ideas of race (exactly the sort of thing the logical empiricists objected to) with this expansion to justify a recreation of the German state, which they saw as destroyed by World War I's unfair treaty. They began a "reunification" project of German-speaking peoples, then continued expanding into eastern Europe.

The invasion of Poland in September 1939 brought Britain and France into the war. The Soviet Union, originally in a non-aggression pact with Germany, joined the Allies when Germany attacked it in 1941. As with the previous world war, this one brought extraordinary advances in military technology.

Detection technologies were crucial to finding and engaging (or avoiding) the enemy. Sonar (originally SOund Navigation And Ranging) had been patented right after the Titanic disaster, but it remained experimental until World War I, when German submarine warfare spurred more rapid development. By World War II, improved active sonar could "ping" nearby submarines to find their location. Radar (originally RAdio Detection And Ranging) was first patented in Britain by Robert Watson-Watt (a descendant of James Watt of steam engine fame) in 1935. Britain, as an island, was particularly vulnerable to air attack. Radar uses radio waves, broadcast through the air, to detect objects. Watson-Watt's system was able to detect planes up to 80 miles away, but its stations had to be based on the ground. During World War II, the invention of the cavity magnetron (right) made it possible to install microwave radar in the planes going up to meet the German air raids that hit civilian targets at night. The mass production of the magnetron in the U.S. made possible Allied superiority over the radar being used by the enemy. Nowadays we use the magnetron in microwave ovens.
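Both sonar and radar, for all their differences, rest on the same piece of arithmetic: time how long an echo takes to come back, multiply by the speed of the wave, and divide by two. Here is a minimal sketch of that calculation (my own illustration, not part of the lecture; the speeds are rounded approximations):

```python
# Minimal sketch (my illustration, approximate constants) of the range
# calculation behind both radar and sonar: an echo's round-trip time,
# times the wave's speed, gives twice the distance to the target.

SPEED_OF_LIGHT_M_S = 3.0e8       # radio waves (radar), rounded
SPEED_OF_SOUND_SEA_M_S = 1500.0  # sound in seawater (sonar), rounded

def echo_range_m(round_trip_seconds: float, wave_speed_m_s: float) -> float:
    """Distance to the target, given the echo's round-trip travel time."""
    return wave_speed_m_s * round_trip_seconds / 2

# A radar echo returning after 1 millisecond puts the plane about 150 km away;
# a sonar ping returning after 2 seconds puts the submarine about 1.5 km away.
print(echo_range_m(1e-3, SPEED_OF_LIGHT_M_S))    # 150000.0 (meters)
print(echo_range_m(2.0, SPEED_OF_SOUND_SEA_M_S)) # 1500.0 (meters)
```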

Encryption technologies were crucial to communications, since both telegraph lines and wireless transmissions could be intercepted by the enemy. Encryption goes back to ancient times, when secret messages were written on strips wrapped around a stick. The strip was brought to the recipient, and had to be wrapped around a stick of the same size to be understood. Julius Caesar apparently also wrote in code simply by shifting every letter a fixed number of places along the alphabet. During World War II, encryption was much more complex. German engineer Arthur Scherbius invented the famous Enigma machine at the end of World War I, and despite having the code broken by Polish cryptanalysts, improvements to its design through the 1930s made the code very difficult to break. It was a complex machine, requiring operators at both ends to set the many rotors and plugs the same way and start at the same letter. Some say the machine wasn't that good, and that's why the British were able to break the code. But it's more likely that mistakes and sloppiness in its use for unimportant communications made it possible to discover how to decode the important ones.
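To make the contrast concrete, here is a small sketch of the Caesar-style shift cipher described above (my own illustration; the shift amount and message are arbitrary examples):

```python
# Minimal sketch (my illustration) of a Caesar-style shift cipher: every
# letter is moved a fixed number of places along the alphabet, wrapping
# around from Z back to A. The shift value here is just an example.

def caesar_shift(text: str, shift: int) -> str:
    """Shift each letter by `shift` places; leave everything else alone."""
    out = []
    for ch in text.upper():
        if ch.isalpha():
            out.append(chr((ord(ch) - ord('A') + shift) % 26 + ord('A')))
        else:
            out.append(ch)
    return "".join(out)

message = "ATTACK AT DAWN"
encoded = caesar_shift(message, 3)    # "DWWDFN DW GDZQ"
decoded = caesar_shift(encoded, -3)   # back to "ATTACK AT DAWN"
print(encoded)
print(decoded)
```

Enigma, by contrast, changed its substitution with every keystroke as the rotors turned, which is why breaking it took machines rather than pencil and paper.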

Mathematician Alan Turing was instrumental in creating machines that could work on cracking the code - he is often given credit for creating the first electronic computing machines (see below). In 1939, the Polish Cipher Bureau passed on the information it had about Enigma, and Turing used that information to design the "Bombe", a machine whose switches and plug boards could be set to work through possible Enigma settings. The work was done at Bletchley Park in England.

How the code was cracked:

The clip goes a bit far in saying that the Germans didn't know that code books were being retrieved from captured submarines - in fact, they went to great lengths to set self-destruct sequences for the submarines in case of capture. In June 1944, the German submarine U-505 was captured, along with its code books, by a brave team of Americans. The submarine now resides at Chicago's Museum of Science and Industry, which has a good website on the story of the capture.

A book came out recently about working at Bletchley, and this BBC news story features some of the women who broke the code:

The Atom Bomb

As the war dragged on, there was fear that Germany was developing a superbomb based on nuclear fission. Beginning in 1935, three people worked together on the possibility of nuclear fission: Lise Meitner (Austrian physicist and the first woman in Germany to become a full professor of physics), Otto Hahn (German chemist who would later win the Nobel Prize) and Fritz Strassman (Hahn's assistant). They were not the first to try: substantial work had been done the year before by Italian physicist Enrico Fermi. Meitner provided the theoretical foundation after the team succeeded in turning uranium into barium, a nucleus roughly 100 nucleons lighter, recognizing the connection to Einstein's formula converting mass into energy. Although I do not have a deep understanding of nuclear physics, I am struck by the similarity between splitting the atom and alchemy - both have the goal of changing one natural substance into another.
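To get a rough sense of why Meitner's insight mattered, here is a back-of-the-envelope calculation of my own (the 0.2 atomic-mass-unit figure is a commonly cited approximation, not from the lecture): the tiny bit of mass that goes missing when one uranium nucleus splits, run through Einstein's E = mc², comes out to roughly 200 MeV - millions of times the energy released per atom in any chemical reaction.

```python
# Back-of-the-envelope sketch (my own, approximate figures) of mass-to-energy
# conversion in a single uranium fission, using Einstein's E = m * c**2.

C_M_S = 2.998e8              # speed of light, meters per second
AMU_KG = 1.6605e-27          # one atomic mass unit in kilograms
JOULES_PER_MEV = 1.602e-13   # conversion factor

# Roughly 0.2 atomic mass units of the original nucleus are "missing"
# after fission (a commonly cited approximation).
mass_defect_kg = 0.2 * AMU_KG

energy_joules = mass_defect_kg * C_M_S ** 2
energy_mev = energy_joules / JOULES_PER_MEV

print(f"about {energy_mev:.0f} MeV per fission")  # on the order of 200 MeV
```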

Although a converted Christian, Lise Meitner was born Jewish and had to escape to the Netherlands, with Hahn's help, in 1938. In 1939, the scientists published their findings, which caused a sensation and led Albert Einstein to write to President Franklin Roosevelt, concerned that the knowledge of how to make a bomb might find its way into Nazi hands. When FDR authorized what would become the Manhattan Project to develop the bomb in the U.S., Meitner was offered a job but refused, wanting nothing to do with making a bomb. Many of the scientists were, like her, refugees from Nazi Germany. When the U.S. entered the war in December 1941, the project expanded, ultimately employing 120,000 people at research locations around the country (though many were not told the goal of their research). By December 1942, the Chicago branch of the team, including Italian physicist Enrico Fermi, had achieved the first controlled nuclear chain reaction. Theoretical physicist J. Robert Oppenheimer led the effort at Los Alamos, New Mexico, which detonated the first atomic weapon at the test site in July 1945. Oppenheimer was astonished at the power of the bomb. The flash was visible for 200 miles and the mushroom cloud reached 40,000 feet into the air. He quoted the Bhagavad Gita in a way that resonates with me about a lot of our technologies:

We knew the world would not be the same. A few people laughed, a few people cried, most people were silent. I remembered the line from the Hindu scripture, the Bhagavad-Gita. Vishnu is trying to persuade the Prince that he should do his duty and to impress him takes on his multi-armed form and says, "Now, I am become Death, the destroyer of worlds." I suppose we all thought that one way or another.

Within weeks two versions of the bomb were on their way to Japan. One was dropped on Hiroshima and the other three days later on Nagasaki. Here's my own analysis, using primary source footage:

There was no return to normalcy after the war: it became immediately apparent that Soviet expansion into Europe was not going to end with the fighting. Having pushed as far as Berlin to end the war, the Soviet Union worked to put friendly communist parties into power in the nations of Eastern Europe. By 1946, Winston Churchill was claiming that an "Iron Curtain" was descending across Europe, separating the free West from the communist East. The techniques of spying, sonar, and many more technologies would be developed further as the United States and its allies vied with the Soviet Union and its allies, each trying to prevent the other side from gaining control over more countries. The Cold War would continue until the Berlin Wall, built in 1961 to keep Soviet-controlled East Germans from emigrating to the West, came down in 1989.

From Vacuum Tubes to Transistors

From the end of the 19th century until the middle of the 20th, the heart of what we would call electronics was the vacuum tube. When I was a kid, I used to go with my dad down to the drug store, carefully carrying vacuum tubes that had burned out from the radio or television. We'd test them in a big testing rack, and buy new ones as needed. What I didn't realize was that those tubes took power from the wall socket, used it to heat a filament, and amplified the radio signal so we could listen through the speaker.

A vacuum tube contains two or more electrodes (metal conductors) in a glass container from which all the air has been removed. Leads from the electrodes come out the bottom of the tube through an airtight seal, to be plugged into fixtures. Since the earliest tubes evolved from light bulbs (the same idea, using a filament), Thomas Edison was involved in their development, though most people agree he had no understanding of the science involved. In a vacuum tube, the heated filament acts as a cathode, emitting electrons and creating a current across the tube. English physicist John Fleming, who consulted for both Marconi and Edison, developed the first diode tubes; the triode (such as the 1906 tube shown at left) added a third electrode that made amplification possible. By acting as rectifiers, vacuum tubes could convert alternating current to direct current to provide appropriate power for radios and radio devices. By the early years of the 20th century, vacuum tubes were used in the development of the telephone, telegraph and radio - the triode tube provided the amplification needed for wireless communications during World War I. Bell Laboratories played a major role in the development of vacuum tubes, and even demonstrated early television in the late 1920s.
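To illustrate what "acting as a rectifier" means in practice, here is a tiny numerical sketch (my own, not from the lecture) of an idealized diode: it passes current only when the voltage is positive, so an alternating input comes out as one-way pulses that can then be smoothed into direct current.

```python
# Minimal sketch (my illustration) of half-wave rectification by an ideal
# diode tube: positive half-cycles of the AC input pass through, negative
# half-cycles are blocked, leaving one-way pulses.
import math

def half_wave_rectify(samples):
    """Pass positive values unchanged; block (zero out) negative ones."""
    return [v if v > 0 else 0.0 for v in samples]

# One cycle of a 60 Hz mains-style sine wave, sampled once per millisecond.
ac_input = [math.sin(2 * math.pi * 60 * t / 1000) for t in range(17)]
dc_pulses = half_wave_rectify(ac_input)

print([round(v, 2) for v in dc_pulses])
# First half of the cycle survives; the second half is zeroed out.
```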

Bell Labs scientists invented the transistor in 1947 by mounting two gold contacts onto a germanium plate (germanium being the semiconductor material that does the amplifying). This was the beginning of solid-state electronics, in which current flows through a solid material rather than through the vacuum inside a fragile glass tube. Transistors provided a cheaper, easier and less fragile way to amplify signals; they were more durable and used less electricity. During the 1950s, transistors got smaller and lasted longer than any vacuum tube.

This was important for computers. Vacuum tubes were used as switches in computers during the 1940s. The decrypting computer Colossus, designed by electrical engineer Tommy Flowers, had 1,600 tubes and was instrumental in breaking German codes during the war. ENIAC, completed for the US Army in 1946 and the first general-purpose electronic computer (though it still had to be rewired and reset for each new task), had about 17,000 tubes. Tubes would fail every couple of days and have to be located and replaced (though I'll bet the Army didn't have to bicycle down to the drug store to test them!). Though designed to work on artillery tables for ballistics, it was so powerful it did calculations for the development of the hydrogen bomb. By 1950, Alan Turing was exploring comparisons between the computer and the human brain, starting up the field of Artificial Intelligence.

Transistors increased the reliability of computers by replacing the tubes. The IBM 1401, introduced in 1959, was one of the first widely used transistorized computers:

Teenagers in the 1950s loved transistors too - they made possible the portable radio. Bringing music to the beach no longer meant you had to play guitar or lug along a phonograph.

Medicine and Genetics

One of the few benefits of war is that it leads to medical discoveries. After World War I, the huge discovery was antibiotics. Although researchers like Joseph Lister and Louis Pasteur had experimented with substances likely to kill bacteria, it was Alexander Fleming who first discovered that penicillin, derived from the Penicillium mould, could kill the bacteria behind deadly infections such as syphilis and pneumonia. This led to the creation of an array of antibiotic therapies once penicillin was mass produced in 1943. At the same time, Charles Drew at Columbia Medical Center had discovered that blood plasma (the liquid part of blood) could replace whole blood, which spoiled so quickly it was hard to use in wartime. His system made it possible to replace blood at the site of the battle, saving lives.

Other advances were based on previous knowledge. The pharmaceutical company Squibb, for example, created single-dose morphine tubes that could be injected by medics on the battlefield, knocking out the patient till they got him to surgery. To replace supplies of quinine cut off by Japanese occupation in the Pacific, Atabrine pills were introduced to prevent malaria, but it was hard to get soldiers to take them: they were very bitter and turned the skin yellow. Medical ultrasound, based on military sonar, was developed after the war, in the mid-1950s.

Immunizations also got a "shot in the arm" (I had to do that) during this era, although development of vaccines for diphtheria and other diseases had been steady since the early 20th century. Some vaccines, such as the one for tuberculosis in 1927, didn't prove effective. Others, such as the flu vaccine given to troops in 1942 to prevent a repeat of the 1918 influenza epidemic that came at the end of World War I, may have been more helpful. So was the typhoid vaccine of the 1950s, and the polio vaccine, which got off to a rough start with a contamination of some batches but was back on track by 1955. Polio had crippled children, left people hospitalized for years in "iron lungs" that breathed for them, and meant that President Roosevelt walked with crutches.

The government sponsored the polio vaccines, and from that time to now it has been involved in vaccination as a public health issue. Vaccine research moved from killer and disabling diseases to childhood diseases that were survivable and common: measles, mumps, rubella. The ever-increasing list of recommended vaccines, and the enforced nature of their administration, has caused some to turn away from vaccination. Surviving some of these milder diseases does confer natural immunity. Anti-vaccine groups who talk about side effects and problems are in the same category as people who objected to electricity - they are not Luddites, but are resisting the incursion into their private lives of technologies they do not approve of.

The science of genetics engages similar controversies, particularly as it has developed into genetic technologies. After the war, chemist Rosalind Franklin and others were working to discover the structure of DNA (deoxyribonucleic acid), the large molecule that contains the genetic code for an organism. Having learned x-ray techniques doing research in France, Franklin was hired to set up the x-ray crystallography unit at King's College in London. Although hired to work alongside Maurice Wilkins, who was using x-ray crystallography to examine DNA, she was treated like a lab assistant. She was able to take two clear photographs of DNA (right), and presented them at a talk attended by American biologist James Watson, who was working with English physicist Francis Crick on discovering the structure of DNA. Franklin hypothesized that the shape of DNA was a helix. Between 1951 and 1953, Watson and Crick realized the shape was not only a helix but a double helix. They and Wilkins got the Nobel Prize in 1962. Not Franklin.

Understanding the structure of DNA, and theorizing that it carried the genetic code, was a major step forward in biochemistry and would create the modern fields of molecular biology and genetics.

Conclusions

You'll notice there's a lot more science in this lecture. Historians James E. McClellan III and Harold Dorn posit that the development of the atomic bomb marks a turning point in the relationship between science and technology. It revealed the "practical potential of cutting-edge science" and established a relationship between government-funded research and the development of technology. They claim it is at this point that we begin to think of technology, not as tinkering or the creation of objects that do things, but as applied science. I think that may indeed apply to the United States (President Eisenhower's warning about a "military-industrial complex" controlling the future is apt), but I'm not so sure about Europe. Besides, in both places there are independent companies (like Bell Labs here) working on technologies with people who have degrees in various fields other than science. And I'm always suspicious of any theory insisting that a huge change has occurred. In my experience, almost all changes are incremental. Perhaps the extraordinary ethical and moral implications of the atomic bomb set it apart, and so it is assumed that its development is a "revolution". Maybe.

Sources

James E. McClellan III and Harold Dorn, Science and Technology in World History: An Introduction (2nd ed., Johns Hopkins University Press, 2006)