
Lecture: Contemporary Times

In my closing lecture, I focus on communication and relationships, among ourselves and with our technologies. From publishing from our own desktops to texting with friends to accessing the web, we have many daily interactions with both mobile and home-based technologies. We also have conscious discomforts with the newer technologies. We understand the telephone, but what about a continual connectedness that doesn't give us a moment to ourselves? I've lived through all of this, so I'm sensitive to the extraordinary benefits of our age (and sometimes it's like the apocryphal story of the audience diving under their chairs at the first movie). Without the web and Wikipedia, my students would have to buy overpriced textbooks, and I wouldn't be teaching online, which I love. At the same time, I don't like cell towers (even when they're supposed to look like trees) and wind turbines ruining the landscape, or my students looking at their phones while I'm lecturing. In contemporary times, we must learn to balance what we want to keep from the past with the opportunities of the future.

Word processing

I owned an early Apple computer, the IIe. For me, it was a word processor. I began on a manual typewriter when I was young, then used an electric, then a "golf ball", then a line-editing electronic typewriter (I rented one to type my thesis). All of these were extraordinary technological advances.

Smith Corona, IBM Selectric, Swintec typewriter, Apple IIe

To me, the point was editing and erasing. On a manual typewriter, before the invention of correcting fluids, you could correct by typing over the errant letter with a piece of plastic coated in chalk. On some models they began to build a white chalk strip in at the bottom of the ribbon, so you could correct within the machine. Liquid Paper was invented in 1951 by Bette Nesmith Graham, who sold the product from her home for 17 years. Gillette bought her out in 1979, which is about when I got ahold of it for the first time. The IBM Selectric models (third from the left) had a strip of correction tape built in. They used a "golf ball" mechanism (right), so the keys never jammed. I grew up to the sound of the IBM Selectric III, which was so big the only place it could sit was the kitchen table. The line-editing electronic typewriter that I rented in the late 1980s let me see the line of text I was typing on a screen before it printed to the paper, so I could correct it. The Apple IIe was another step forward - I could now edit the whole document before printing, though the output came from a dot matrix printer, which looked awful. The first thing I saved up for was a LaserWriter printer, and I still think it's one of the best products ever made.

These inventions went along with machines for duplicating documents. As I was growing up, everything was duplicated using a mimeograph machine (left). Mimeograph technology went back to the 1880s, and used coated paper on a drum. You typed on the coated paper, mounted it on the drum very carefully, poured ink into the machine, and turned the crank to print individual copies until the ink faded.

In 1959 Xerox introduced a commercially successful photocopy machine using xerographic (dry) technology, but it was too expensive for me to access. Plus they caught fire a lot. Later, in the 80s, I got very closely acquainted with photocopy machines. Having quit college, I worked in a bank, and my official title was "Photocopy Clerk/Messenger Person". Half the day I photocopied documents. The machines used cameras by then, and they jammed all the time. I think I spent more time clearing jams than photocopying, and I was well acquainted with the photocopy repairman.

Photocopying was self-consciously promoted as advanced technology. Check out this (slightly blurry) Xerox commercial from the 1977 Super Bowl:

The Internet

Despite my familiarity with word processing (I began writing my own mini-programs in ProDOS for the Apple IIe), I could not get my head around what the Internet was when I first began using it around 1992.

It derived from Sputnik, in a sense, because the fear of the Soviet achievement led to the creation of DARPA - the Defense Advanced Research Projects Agency. DARPA researchers were concerned about Soviet spies hacking into the phone system. (I find that ironic, given what's happened with hacking recently). The idea was to create networks of computers, all connected to each other. Information could be transferred securely via "packet switching", developed by American engineer Paul Baran; the term itself was coined by Donald Davies of the UK in 1965. Packet switching breaks data up and transmits the pieces down different paths, with only that data occupying a path at any moment. Once the data arrives, the path is clear for more data.

Check out the animation:

packet switching

In 1973, Stanford professor Vinton Cerf and American engineer Bob Kahn added reliability by inventing the Transmission Control Protocol (TCP), which sends data down no set pathway, since each packet includes its own addressing information. This allowed computers that were far away from each other, on different networks, to send and receive data. Throughout the 1980s, government agencies and universities connected their networks this way.
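
If the packet idea seems abstract, here is a toy sketch in Python (my own illustration, not real networking code, and the "address" is just a made-up label): each packet carries its own destination and sequence number, so the receiving machine can put the message back together no matter which paths the packets took or what order they arrived in.

```python
# A toy illustration of packet switching (not real networking code).
# Each packet carries its own destination address and sequence number,
# so packets can travel different paths and still be reassembled.
import random

def make_packets(message, destination, size=10):
    """Break a message into small, individually addressed packets."""
    return [
        {"to": destination, "seq": i, "data": message[start:start + size]}
        for i, start in enumerate(range(0, len(message), size))
    ]

def reassemble(packets):
    """Sort packets by sequence number and rejoin the data."""
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = make_packets("Packets may arrive out of order, yet the message survives.", "cern.ch")
random.shuffle(packets)       # simulate packets arriving by different paths, out of order
print(reassemble(packets))    # the original message, reconstructed
```

Real networks add error checking, retransmission, and much more, but the core idea of independently addressed packets is the same.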

In 1991, English scientist Tim Berners-Lee was working in Switzerland at the Conseil Européen pour la Recherche Nucléaire (the European Organization for Nuclear Research, or CERN) -- he's shown here at CERN in 1994. There he had developed a hypertext system, in which links could be clicked to access another page of information. As TCP developed, he realized he could combine hypertext with connections to the Internet, since CERN was a major node in the networks. The interface was the "browser", a version of which he also invented, along with HTTP (HyperText Transfer Protocol - the protocol that determines how web pages are served and displayed, including this one), HTML (HyperText Markup Language - the markup behind most web pages, including this one), and the URL (Uniform Resource Locator - the address of every web page, including this one). Some say he invented the World Wide Web, where all information is hyperlinked in one large web of information.
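
To make those abbreviations a little more concrete, here is a small sketch using Python's standard library to take apart a made-up web address (example.edu is not a real site), plus the bit of HTML markup that turns text into a clickable hyperlink.

```python
# Pulling apart a hypothetical web address to show the pieces Berners-Lee defined.
from urllib.parse import urlparse

url = "http://example.edu/history/lecture.html"   # a made-up URL for illustration
parts = urlparse(url)
print(parts.scheme)   # 'http' - the protocol used to fetch the page
print(parts.netloc)   # 'example.edu' - the server that holds the page
print(parts.path)     # '/history/lecture.html' - which page on that server

# The hypertext itself is just marked-up text; this HTML makes a clickable link:
link = '<a href="http://example.edu/history/lecture.html">my lecture</a>'
print(link)
```

The &lt;a&gt; tag is the hypertext part: click it, and the browser uses HTTP to ask the server named in the URL for that page.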

Less than two years later, Marc Andreessen created the first popular web browser, Mosaic. It could display images as well as text. In those days, a home could access the internet with a dial-up modem (I will always remember the sound of it connecting). Images had to be very small, because they took so long to load. The first online classes I designed, in 1998, had pictures the size of postage stamps, and no video.

The web became a huge tangle of commercialism, free applications, and open classes. "Web 2.0" was a term coined in 2004 to describe new programming that allowed people to put text and information into websites, instead of just reading and viewing. Web 2.0 enabled photo sharing, social networking, and self-published blogs.

In a sense, sites like Google and Facebook are going back to the idea of the old internet - everything looks like it's hyperlinked but it's all linked back to the company's computers. The "cloud" (a term that became common around 2010) meant that people didn't have to know a lot about what was "under the hood" on the internet. Cloud computing makes the internet more like a utility, like gas lighting or the road system. The difference is that so much of it is controlled by private companies, though governments have thus far been careful to keep the internet itself open to all. In Europe, this philosophy and resulting legislation is referred to as the "Open Internet"; in the U.S. the term "Net Neutrality" is more frequently used. In February 2015, the U.S. Federal Communications Commission affirmed net neutrality by classifying internet access as a telecommunications service, rather than something different that could be priced and controlled by businesses.

Interestingly, the internet has affected telephone systems directly. Voice over Internet Protocol (VoIP) uses the internet to carry data packets composed of voice content. Since the cost is already covered by the subscriber's payment to an Internet Service Provider, calls are cheaper, especially international calls. That led some consumers and businesses (like MiraCosta College) to replace their phone service with VoIP. But the main reason people use fewer landlines is cellular technology.

Cellular/mobile telephones

The term "cellular" comes from an AT&T concept from the 1970s - cells were coverage areas served by a receiver and transmitter. Modern cell phones are just highly sophisticated radios (and tablet computers are just big cell phones, but that's another story). The technology wasn't new in its analog form - we have seen radio telephones on the battlefield in World War I, and there were car telephones in the late 1940s. Car phones weren't cellular - they worked only in cities, off of a single city tower. Only 25 channels were available, and the handsets were large because the antenna was huge. Cellular technology allowed lower power receivers and transmitters, and digital technology allowed instant switching as one moved through the cells. Cells are about 10 miles square (each with a base station/tower), and frequencies are continually reused so that many people can use the system at the same time.

Motorola introduced a personal cellular phone in the 1980s - here's an ad predicting the future:

So what was lost? Well, landlines are much more reliable. One cell tower going down can knock out service over 10 square miles - one could imagine what an earthquake-caused infrastructure failure could do.

There has also been very little time to analyze the impacts of usage patterns. In addition to concerns about phone radiation, the "hyperlink" thinking encouraged by the web has also bled into phone use. This is particularly the case since "smart phones" (phones that have some computer functionality and can access the internet) became more affordable around 2006. "Divided attention" and "multitasking" have become 21st-century habits, primarily because of the increased capacity of computers and smart phones to engage in tasks simultaneously. Current research indicates that true multitasking is impossible - when we try, most of the tasks invariably suffer in quality. (At the same time, I admit I'm writing this lecture while half-watching baseball and keeping an eye out for an energetic kitten.)

The ways in which we use our technologies are now seen as the ways we "relate" to our technologies. We have seen this concern before. There was worry that communications technologies (such as the telegraph) could keep us separated even as they brought distant people together. And that television meant people wouldn't go to the theatre anymore. And that the web keeps even more people inside their homes (in suburbia, I'd also blame the automatic garage door opener, which became common in the 1970s). And certainly cell phone addiction can cause anti-social behavior and lost opportunities, as mourned in this 5-minute 2014 film by poet Gary Turk, a sign of some resistance to always being tethered to our devices:

Robots

As our humanity has become consciously tied to technology, we have tried to make technology more human. Artificial intelligence is one example. Although computers had been engaged in increasingly complex tasks since World War II, the term "artificial intelligence" was coined in 1956. This was also the period when Margaret Masterman, philosopher and linguist, created the field of computational linguistics. Her Cambridge Language Research Unit worked on machine translation long before machines were really capable of it, and laid the foundation for semantic computing, which can process multiple kinds of input and is likely to be the foundation of the future web. In the meantime, AI has given us chess-playing computers, a program named ELIZA that could act as a psychotherapist, and fictional computers that try to take over, such as the HAL 9000 in the 1968 film 2001: A Space Odyssey.
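
ELIZA worked by simple pattern matching. Here is a tiny sketch in that spirit (my own toy version, not Weizenbaum's actual program), just to show how little machinery it takes to sound like a therapist.

```python
# A tiny ELIZA-flavored exchange: match a keyword pattern and turn the
# patient's statement back into a question. The rules are invented examples.
import re

rules = [
    (r"\bI feel (.+)", "Why do you feel {0}?"),
    (r"\bI am (.+)",   "How long have you been {0}?"),
]

def respond(statement):
    """Return a therapist-like reply, or a neutral prompt if nothing matches."""
    for pattern, template in rules:
        match = re.search(pattern, statement, re.IGNORECASE)
        if match:
            return template.format(match.group(1))
    return "Please go on."

print(respond("I feel anxious about robots"))   # Why do you feel anxious about robots?
print(respond("I am distracted"))               # How long have you been distracted?
```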

Robots go beyond artificial intelligence - their primary purpose is to perform human tasks or to relate to people. Long before robots had the capability to do either, pop culture showed a fear of robots out of control. The earliest robots likely date back to before the 18th century (Leonardo da Vinci designed one that looked like a knight), but it was the development of sophisticated clockwork that made possible mechanical devices ahead of their time. It makes me wonder whether many technologies that are ahead of their time first appear as toys.

Pierre Jaquet-Droz made this automaton in 1768.

The Dulcimer Player was made by German watchmaker Peter Kinzing and cabinetmaker David Roentgen and given to Marie Antoinette in 1784. Damaged in the French Revolution, it was restored by magician Jean-Eugène Robert-Houdin in 1864.

Robots were referred to as automatons in Victorian science fiction, and some were actually built. In New York in 1900, Louis Phillipe Perew built a huge walking man run by electricity (right). Professor Archie Campion of Chicago was said to have used the millions he made patenting valvular conduits and polyphase electric systems to build a robot called Boilerplate. At the time, "dime novels" were available with stories about Steam Men and other automatons having adventures with humans.

As with much technology, more advancements followed World War II. The book I, Robot by Isaac Asimov (1950) even laid out rules forbidding robots from doing bad things, like killing human beings. In 1961, Unimate became the first installed industrial robot - it used its one arm to pull parts from the assembly line at General Motors and weld them onto car bodies. It even appeared on the Johnny Carson show on television, which shows its novelty, even as people began to worry that robots would take over people's jobs. By 1986 Honda Corporation was working on the interaction between robots and humans, research that would lead to the creation of ASIMO in 2000 (the name stands for Advanced Step in Innovative MObility, though I'm sure it's also a tribute to Isaac Asimov). Here's video from 2014 - I have had the pleasure of watching ASIMO become more advanced over the years - until recently it was featured in the Innoventions exhibit at Disneyland.

ASIMO

And here is PARO in 2010, a robot designed to help patients by encouraging sympathetic responses. Designed in Japan, it has gone global.

During the 1990s, the Tamagotchi toy became a huge fad - you had to continually tend to the little robot's "needs" to keep it alive. Together with robots like PARO, such toys raise concerns about interaction with robots replacing interpersonal interaction. Sherry Turkle is worried about older people being handed robots instead of being spoken to and, more importantly, listened to by young people. So again, it's possible that something is being lost. It's likely that without the web and television, children would play outside more. Families might interact more in the evenings, instead of everyone texting their friends. And although relationships with robots are less messy than human relationships, we may be seriously undervaluing human contact.

It's not surprising that one of the best-selling robots, the Roomba disc-shaped automatic vacuum cleaner, is made by the iRobot corporation. They now sell a robot that the user can program. Along with robots and artificial intelligence has come the idea that programming (coding) is a crucial skill. In the past few years, even children's toys, such as Kibo (left), have been designed to teach programming.

Technology as art

As we've noted throughout the course, the boundaries between technology and science are blurred. So are the boundaries between technology and art. Here we talk briefly about the conscious display of technology in art, which has become more common in contemporary times. The Victorian era had similarly been conscious of its technology, but Victorians tried to design technological objects to look beautiful and fit in with the decor norms of the day. Phonographs and stereoscopic viewers were designed to be beautiful, something you'd be proud to have on display in your home. Futurism during the 1920s and 30s led to an appreciation of streamlining in everything from railroad engines to desk sets, often in the art deco style. Neon, which had been discovered in 1898 and used to light up signs (and Times Square) after about 1920, was also used in art, sometimes as a commentary on the seediness of society. Neon technology is now used in plasma televisions.

Industrial design is increasingly featured in museums. Iconic designs, from Lurelle Guild's 1937 Electrolux vacuum cleaner (left) to Sir Jonathan Ive's many Apple products (right), are now considered museum-worthy. I find it interesting that objects with a particular purpose can now be considered for their aesthetic qualities as well as their function. While that idea isn't new, there is a new appreciation of the beauty of our technologies, which has been very helpful in preserving old buildings and industrial sites (so as a historian, of course I am in favor!).

Conclusions

This is the last lecture of the class, but I'm sure the trends we've been tracking will continue, because that's how history works. While my lectures are not an exhaustive source of information (there are lots of places for that), I hope you've enjoyed my point of view on technology, and will keep in mind the price we pay for it as we enjoy its benefits.

Any sufficiently advanced technology is indistinguishable from magic. -- Arthur C. Clarke