Armchair historian does London Bridge

Historians sometimes reinvent themselves. Or maybe it’s better to say that historians who are very famous, or not famous at all, sometimes reinvent themselves. If you’re very famous (like Simon Schama) you can do whatever you like. If you’re basically unknown (like me) you can also do whatever you like. But if you’re an acknowledged expert about One Big Thing, I suspect you can’t do anything else.

I’ve been working for a few years on making Victorian England my new specialty, and I’m also writing a novel that takes place in 1862. To find good resources and just enjoy the era, I’ve joined some Victorian-focused Facebook groups. People post old photos:

These are London Bridge in 1890, the top one facing north, the other facing south. And oh, the traffic! I’ve read that the bridges were often jammed in the 1860s, and it looks like by 1890 they weren’t any better. Can’t you just imagine yourself trying to cross the street?

Then someone on Facebook asked whether the stairway was still there:

 

Let’s go look! (And get a load of all the “temporarily closed” on Google. Might want to take some screenshots — this will all be history too, remember).

 

 

So I “drop down” my little G droid* and go look.

 

 

Hmmm… looks like maybe the top of a stairway? I’d better drop down by the river bank for closer inspection.

 

A ramp! Much nicer than stairs for lots of people. Plus you don’t have to go out past the church and turn left to get onto the bridge.

The ease of doing this sort of thing amazes me. An armchair traveler in the 19th century could sit at home and read books to go to wonderful places all over the world. I can drop down my G droid anywhere and walk around (well, click around).

I can’t go to England this year, but I can do this. The Google Map images are relatively recent. I can walk down streets. I can look up old maps and then go see what’s different (I do that all the time for research). I can even go to webcams like this one and see places in real time. I can start up Google Earth and see buildings in three dimensions.

All of which beats relying on H. Rider Haggard for my view of the world. But I would like a wingback chair, please.

 

___
*I know the droid is called “Pegman”, but why should it be a man? I’m a lot of things, but I’m not a man.

Is it better to be a milkmaid?

In the early 18th century, Lady Mary Wortley Montagu encountered the inoculation process for smallpox while she was living in Turkey. She had lost a brother to the disease, and barely survived it herself. Smallpox in the 18th century was particularly virulent; the CDC says it had an average 30% death rate for those who got it. The inoculation was done using material from the scabs of people who had smallpox, inserted under the skin through a cut. Lady Montagu had her son inoculated while in Turkey, and her daughter in England when she returned in 1721. She then campaigned to popularize the method in Europe.

But Edward Jenner gets all the glory, even today, because he developed a vaccine to replace the inoculation. Inoculation is an old idea — you take some likely material from a living victim of the disease, and put it in a person who hasn’t had it (similar to the convalescent plasma being tried today). But there is always a danger of actually giving the healthy person the disease. A vaccine uses a more benign substance to achieve the same immunity.

I was taught in college that the discovery of the smallpox vaccine came out of the realization that milkmaids didn’t get smallpox. This was apparently because in leaning their cheek on the cow while milking, they acquired cowpox, a very mild disease. The cowpox antibodies protected them from smallpox.

There is a wonderful mythology around Edward Jenner and his 1796 vaccine. It’s based on the story of a milkmaid he met when he was a boy, who bragged that her lovely skin would never be scarred by the pox. Rather like the story of George Washington and the cherry tree, it was created by a later biographer. The real story is more ordinary. Nevertheless, milkmaids didn’t get smallpox.

I learned today that an old vaccine for tuberculosis may have some value in helping with the current virus. The New York Times article says:

The B.C.G. vaccine has an unusual history. It was inspired in the 1800s by the observation that milkmaids did not develop tuberculosis.

The active ingredient of the Bacillus Calmette-Guerin vaccine is an attenuated strain of Mycobacterium bovis, isolated from a cow in 1908. It was first used as a vaccine in humans in 1921, and is used today in areas where the danger of tuberculosis is high.

Unlike the Jenner/smallpox tale, the story of the BCG inspiration and milkmaids is harder to track down. I can find nothing in a verifiable source, other than popular/journalistic websites copying each other’s phrasing, to confirm the background of the discovery. The closest I could get was a 2008 Russian article saying that milkmaids and others working closely with cattle could carry a latent infection that gives a false positive on a tuberculin skin test. I couldn’t find anything even saying that milkmaids didn’t get tuberculosis. Is this a case of people just mistaking one apocryphal tale for another?

But we know that Mycobacterium bovis is to Mycobacterium tuberculosis as cowpox is to smallpox — a similar, less deadly disease that protects against a more virulent one. The Lancet warns that it might not be effective against the current virus, and there might not be enough available, but it has helped people with similar respiratory diseases recover faster.

I’d still like to think that milkmaids would be safer.

 

Fake Fitzgerald or Real Telegram from 1918

For the last couple of weeks, this has been going around. Like a lot of people, I was hoping it was real.

I thought it was real not because I know Fitzgerald, but because I know Hemingway. But it wasn’t real. It was from McSweeney’s.

According to Esquire, we all wanted this to be real because we’re all looking for reassurance that we can survive this crisis. I’m a historian, so in addition to being embarrassed because I was tricked, I wanted to then find real documents that could accomplish the same thing.

Here’s what I found:

Western Union Telegram signed [Hon.] Edward Rainey, [San Francisco Mayor’s Office], to Hon. Harvey Neilson, Santa Barbara, California Mayor’s Office.

1918 October 31.

“If you have not already taken such steps strongly urge universal wearing of masks to prevent or check influenza epidemic. Cases here rose steadily from two hundred per day October Six to over two thousand October Twenty-fifth. On Twentieth, some our people wore masks, on Twenty-first on recommendation Health Board, Mayor [James] Rolph issued proclamation calling for everybody wearing masks. Nearly whole population complied. Red Cross backed with advertising and two days later supervisors passed ordinance requiring wearing by all persons. Practically whole population in masks. By Twenty-third, five days after first masks appeared or three days after use became general new cases dropped approximately fifty percent. Deaths at peak 194, yesterday only 103, many of these having been sick for some days. New cases decreasing daily. Health authorities say San Francisco probably get through with far less distress and death than Eastern cities which started with about our figures but keep on going up while ours went down. All agree masks largely responsible. Sending this for your information because I have seen the whole terrible effect of epidemic here, because masks have saved untold suffering and many deaths, and because Santa Barbara my old home city. Portland, Seattle following San Francisco lead.”

Source: Online Archive of California, UCLA Special Collections
Collection of personal narratives, manuscripts and ephemera about the 1918-1919 influenza pandemic, 1917-1923
https://oac.cdlib.org/findaid/ark:/13030/kt2t1nf4s5/entire_text/

 

So. Wear your masks, Californians, and know that it will end.

Going online the 19th-20th century way

Given current discussions about moving on-site classes into an online environment as an emergency response, I’m going to go out on a limb and say: don’t worry about pedagogy.

Current Learning Management Systems have simplified online instruction to an extreme degree. Despite decades of research and experience in various active learning online techniques, our current systems still encourage the basic triumvirate: reading/information, discussion, and assessment.

An on-site class already has assigned readings, so these can be moved online if necessary. Discussion that took place in class can be placed on a discussion board if desired, or one can easily just answer questions through messaging or email. And assessment can be moved online using the built-in quizzes. The fact is that most on-site teachers moving online will focus on these three things. They will need to learn how to upload their readings, set up a discussion board, and retype their assessments into the system if they don’t know how to use a tool like Respondus (or are on a Mac). The fourth big task will be learning to use the gradebook, which is more difficult than handing back a piece of paper with a grade and comments.

These preparations will take far more time than classroom teaching, and will already cause massive workload issues. These concerns will be added to student confusion about assignments and grades, basic worry about managing one’s professional tasks in isolation, and trying to ensure enough income and food to keep one’s family comfortable.

In other words, pedagogy, universal design, student equity, and other considerations will become instantly irrelevant.

Synchronous lectures, especially at a college without teaching assistants, may be well beyond instructors’ capabilities, and if not, the stress of doing it may be overwhelming. Even discussion boards, which require design of some sort, may be beyond the abilities of many instructors while everything else is going on. That would leave us with: read the textbook, take the test.

This might sound terrible, but it isn’t. It’s the way distance learning was done in its earliest incarnations, through the mail and televised courses. It is the way the earliest online classes were conducted, over email. In other words, read/inform and test was considered adequate for education for many years, particularly for subjects not requiring laboratory work. And it’s the pattern for every publisher’s course cartridge I’ve ever seen.

Certainly it’s ironic that after so many decades of active learning, student-centered techniques, just-in-time education, constructivism, and inquiry-based models, a single emergency would mean a return to the old style.

If this emergency goes on for a long time, several things could happen. The following could well drop out of college: students who only have their phone for a computer, students who cannot motivate themselves and stay on task, students who require personal contact with others in order to learn, students who require that teachers take a deep personal interest in their lives, students who can only learn in groups of other students, and students who are unprepared for college work. While all of these problems can be dealt with effectively in the online environment, there will be no time to work on the methods for doing so. This is particularly the case for faculty who’ve never taught online, or posted more than a minimum of resources in an LMS.

The remaining students, those who can do well under these read-and-test conditions, will be more similar to students of the 19th and 20th centuries. During these years, educational reformers were striving to make sure there were spots available for students who were capable but financially disadvantaged. The ability to read and test, and ask good questions, was essential to student success. It might prove interesting if it were that way again.

 

Rigor or workload?

It appears as though next summer, our 8-week classes will all be offered in a 6-week format. At first this seemed like a good idea, and I was in favor of it: students finish faster, and faculty are done sooner (avoiding the problem of starting fall immediately afterward). Until one thinks about rigor.

Rigor is a word frequently argued about in academia but rarely defined. It has something to do with the academic integrity of a course. If, for example, a college class assigned a third-grade textbook, there would be a problem with rigor. Our course approval process requires a list of possible textbooks and possible assignments, ostensibly to ensure appropriate rigor.

Years ago, our historians were asked to offer 4-week intersession classes in winter, and we said no. Our senior historian at the time went in with the dean to argue that rigor could not be maintained. Our classes, as approved for transfer to university, were 16 weeks long. We could not maintain standards, particularly with students rushing through reading and writing at 4 times the speed. It’s a community college. Some students had trouble reading college-level work. Forcing them to do it faster would be disastrous for their success and our teaching. We won the argument, because at the time there seemed to be a general understanding that History requires extensive reading and writing, and by extension considerable thought. This requires more time.

As time has passed, however, the expectations for the level of student achievement have changed. The emphasis on “student success” has led not only to a natural and predictable inflation of grades, but a much broader acceptance of less rigor. The available textbooks for a college course are written at a much lower level, and have many needlessly large illustrative images and lots of white space. Courses are approved for General Education transfer that are more “fun” and have significantly lower expectations of learning ability. The push for what is called “equity” has led to an utter rejection of everything from the Western canon to any novel written by a white male, with the result that many longer works with universal themes are no longer considered appropriate for assignment.

So rigor, in the sense of the expected level of the work completed, has generally declined. But rigor is not necessarily workload. When I was at university, lower-division courses required a full textbook and several ancillary texts. When I was a teaching assistant at the University of California, Santa Barbara, only one ancillary text was required, chosen from a list of extensive secondary works; students chose the shortest one, of course. All the same, the workload (number of pages to read, number of papers to write, length of those papers) was significantly higher than it is now.

If rigor is being decreased, but achievement in the discipline continues to be expected, then workload should increase. If the level of what one is being given, and is expected to perform, is lower, then increased quantity would provide more opportunities for practice. Increasing workload thus implies a dedication to higher rigor, even if the standard is not attained.

But we must also consider the contemporary dedication to the affective well-being of the student. This dovetails with the culture at large. It is accepted that people who are distressed cannot study well. Mental illness, overloaded schedules, and job and family demands are seen as reasonable justifications for being unable to perform what would have been considered university-level work a generation ago. Before, such students would have been encouraged to leave university and find a job for which they were suitable. Now they are held onto like precious gems, as though without university they would have no chance in life. It’s our fault, not theirs, if they don’t succeed.

The university transfer approval process requires that community college rigor matches that of university. This has not been a major issue. University rigor has also declined. No one checks very carefully, anyway. But approval also requires that the same rigor and workload approved at the course level apply to every class section that is offered. So if I offer a 16-week class that normally requires a full textbook, five primary source readings each week, and two assignments per week, the expectation is that this will be compressed but identical in shorter-term formats.

While this may seem to be a way to maintain rigor while increasing workload in the short term, it doesn’t actually work that way. I have already adapted several of my classes to the 8-week (double-speed) format, both for summer classes and to provide a “back-to-back” single-semester option for completing a two-course sequence. Enrollment in these is excellent — students do indeed appreciate completing the course faster, and they drop less often. I have long felt that 16 weeks is too long anyway. But I do not demand exactly the same number of assignments as in my 16-week classes. The primary source boards dropped from 16 to 8 to keep their focus on the weekly unit. Everything else, however, I simply doubled up: textbook readings are two chapters a week, primary sources are ten instead of five.

Six weeks presents a slightly different challenge. I cannot simply eliminate the Age of Discovery, or the American Empire. These are required to be “covered” to be approved for university transfer. Thus the workload must increase. I suspect that the transition from eight weeks to six may be a tipping point for rigor and workload.

What happens when one increases the workload beyond the expectations and desires of the students? First, they just don’t do it. They simply won’t be exposed to the facts, interpretations, or ideas. They’ll skip the Age of Discovery. Second, they will not enroll in the first place, or will drop the class in favor of classes with lighter workloads. Our History department has seen a consistent slide in enrollments over the last few years. While we know that this is partly because the national reputation of disciplines like History is in decline (as it is for intellectualism in general), there is also a greater dedication to rigor in our discipline, a dedication often misinterpreted as “white” and elitist. (In truth, historiography has foregrounded the agency and obstacles of people challenged by mainstream culture since the 1940s.) The college now offers far easier courses in “culture” that count for the same requirement, and thus compete directly for enrollment.

Simply compressing my 8-week classes into 6 weeks, I fear, won’t work. The workload will be well beyond the expectations of students, and they will leave, drop, or fail. While failing used to be acceptable, we are now expected to prevent this at all costs. So some tasks need to be removed. It cannot be topics or “coverage”, so it must be reading, assessment, and writing. I am leaning toward removing the textbook reading, because students may find it “boring”, they have more difficulty with reading, and the facts are not as important as their “doing history” (my lectures may have all the facts they need). Removing the textbook reading also reduces the number of quiz questions, or perhaps eliminates the quizzes themselves. The writing assignments I am unwilling to change, but they should be given a few days without anything else due, so students can focus on them — I want to provide that space and time.

The grading weights would change accordingly, so that each of the remaining tasks would be worth substantially more. That is unfortunate, because my usual method is to have many little assignments, so that no one assignment is worth a lot. That way students can learn, practice, improve. So in addition to impacting rigor and workload, my pedagogy will also be affected. I do not, however, see another way.

Musings on equity and pedagogy

I am considering the implications of mapping student learning challenges and solutions like this:

Universality in education implies open opportunity and access. It also has cultural implications: that certain philosophies, methods, and subjects are worth exploring because they lead to knowledge in a larger sense.

Communality in education implies that the people being educated, and the educators, belong to particular groups or communities. These may include professional or personal groupings, but they may also include other groups favored by social scientists: socio-economic, racial, religious, etc.

Individuality in education implies that learning is basically an individual activity, and that the effectiveness of pedagogy depends on the individual. It also implies that teaching must take into account individual talents, proclivities, and abilities. It underpins ideas of individualization and personalization of learning materials and methods.

At many institutions, those who privilege communality are increasing awareness of the influence of groupings on the lives of students. Some of this has taken the form of movements for student equity. In its most useful form, communality makes teachers more aware of the challenges students may face because of their identification within a particular social group. In its extreme form, communality mandates particular forms of speech, opposes ideas that are seen to represent the dominant culture, and publicly shames individuals who don’t engage in groupthink.

Within this construct, universality is seen as tainted. Access is not enough, because those who are disadvantaged by their group membership cannot benefit equally from that access. In its most useful form, this can cause the culture to acknowledge those deficiencies and seek to remedy them, through awareness and policy designed to offset limitations. In its most toxic form, it denies the universality of ideas, and engages in cultural relativism.

Individuality is similarly tainted, because it does not consider the pressures resulting from group membership. Individuals must acknowledge, and in many cases are expected to represent, the group. A person who seeks to overcome the limitations of their group is seen as a traitor to the group. This may occur even when the group is defined externally, and the individual does not identify as a member.

The image above centers the individual within the group, and wraps both individuals and communal groups into the universality of humanity. There are similarities to the philosophy of stoicism. Using stoic concepts of individual and universal, the diagram might look like this:

Creative Commons licensed BY-NC-ND, John Danaher

In stoicism, the goal is to connect the individual at the center with the universal ideas that supersede the social context. The social world is an intermediate place where it is hard to tell noise from signal. Although humans are automatically attuned to the social world, they must overcome its noise to find a deeper connection with the universal. Communality is highly changeable, redefining itself frequently. Today’s communality is not the same as yesterday’s.

In education, a central goal is to connect the individual to larger schemes of human knowledge. Pedagogy’s purpose is to assist the individual in using information, creating knowledge, and ultimately gaining wisdom. To achieve only typical attainment may attach the individual to the communal in a way that can prevent higher knowledge.

Sustained argument

The Economist’s Johnson column recently wrote about the extent to which artificial intelligence can compose prose, and claimed we need not fear the “Writernator”.

The reason? Because during an experiment:

Each sentence was fine on its own; remarkably, three or four back to back could stay on topic, apparently cohering. But machines are aeons away from being able to recreate rhetorical and argumentative flow across paragraphs and pages.

Well, most of my students can’t do that either.

So while the article was trying to reassure writers that they need not fear losing their jobs to a computer, I saw quite another angle, a train of thought that goes something like this:

Computers cannot create sustained arguments. Neither can most of my students. And only the best journalism seems to bother. Educated people can both follow and create sustained arguments. But to whom are they writing? We have many voters and public figures who are anti-intellectual, and only interested in the realms of fear, emotional expression, and personal identity. They have no interest in sustained argument. The media reflect this, with the emphasis on factoids. Articles have gotten shorter, even in journals like The Atlantic. Computers can’t do it, and computer programming is a reflection of ourselves. Perhaps the very idea of sustained argument needs to be defended. But how can one defend it except with sustained argument, and a reliance on the very intellectualism being increasingly rejected?

If I had read the article 20 years ago, I might have nodded along, not because computers weren’t writing then, but because I felt that sustained argument was a norm. It’s perfectly obvious to me why it needs to be preserved, as obvious as the preservation of free speech, democracy, respect for the opinions of others. These things aren’t obvious anymore.

Today on BBC Radio 4’s PM program, Evan Davis reported on Jacob Rees-Mogg’s insensitive statement that he would have got out of the Grenfell fire by ignoring the orders of the fire authority to stay in the burning building. He interviewed Andrew Bridgen, Conservative MP for North West Leicestershire:

Davis: Do you think he meant to say that he thought he would not have stayed put?

Bridgen: That’s what he meant to say…

Davis: And that in a way is exactly what people object to, which is he’s in effect saying, I wouldn’t have died, because I would be cleverer than the people who took the fire brigade’s advice?

Bridgen: (Sigh.) But we want very clever people running the country, don’t we, Evan?

Well, I’m not sure people do. In fact, I’m pretty sure many people don’t see why having educated people run things is a good idea. They think that educated people run things to the detriment of uneducated people, and sometimes that is true.

People, Bridgen noted, tend to defer to authority, as they did in the case of the fire.

When people trust these authorities and the authorities fail, there is popular anger. At some point people will ask where this authority comes from. And one can say “from your votes”, but they feel that isn’t completely true. The response is to elect uneducated populists.

Intellectualism, and education itself, may have a much tougher advertising campaign to run than we suppose. The old norms are suspect, and assumed ideas need a cogent (or, better, non-intellectual) defense. I don’t think saving writers from computer-generated text is quite going to do that.


The internet’s not for learning?

I confess to being depressed by a summer article in The Economist, “The second half of humanity is joining the internet” (June 6). In the spirit of Thorstein Veblen’s critique, poorer parts of the world are getting on the internet*, mostly through mobile phones. And even fewer people there than in the developed world are using this online time to learn things.

The Economist article did not specifically count online courses, only “education information/services”, but the use is pretty low. And it likely includes looking up something on Wikipedia so you can win a game, or checking the weather.

People everywhere do the same thing: use the internet mostly for “timepass” – passing the time by communicating with friends and family, playing games, and watching videos. I’m not saying these things don’t cause learning. They do. But the purpose is entertainment and emotional satisfaction, not becoming an educated citizen.

It just serves to remind me how truly wide the gulf is between those who value education for its long-term benefits, and those who just want to pass the time. Are the people who get satisfaction from intellectual challenges rare? If so, will smartphones make them even rarer?

Because that’s the crux of the issue. When all this internet-y, web-by stuff began, we educators were all excited. Vast libraries of information! Massive open online classes! Anyone can learn anything from anywhere!

I’m not anti-entertainment. I’m a huge classic movie fan, and I watch a lot of TV programs where one character calls another “Inspector”. I read modern novels just for fun, or to get to sleep. I’m not always working, always teaching, or always learning.

But I am again reminded of the old Zits cartoon:


The internet relies on huge servers, and uses tons of resources. It only seems “clean”. The mobile phones contain rare earths, the servers are so hot they need to be in the Arctic, the power plants chug away so we can have long power strips full of our charging device plugs. It’s odd to make that sacrifice just so that people can play Fortnite from anywhere.

Perhaps our goals were too utopian. The article points out that our vision of the subsistence farmer checking weather on his phone to save his crop doesn’t really happen. But why shouldn’t everyone use the internet for whatever they like? And can’t we learn wonderful things on our own? Some little boy somewhere is watching a Zeffirelli clip on YouTube and is inspired to become a great set designer. Some little girl is watching the US women’s soccer team and will be a great player. Is formal education a more important use of technology?

After two decades online, however, I am saddened that there hasn’t been a little more educational uptake and a little less “Whasup?”.

 


* I used to be very careful to distinguish the web from the internet — the internet is the entire online structure, while the web is the world wide web accessed through a browser. The recent dominance of the “app” and of sites requiring log-in is closing the web, which had become the most-used aspect of the internet other than email.

Wells and the moon shot

On the 50th anniversary of the moon landing, I picked up my copy of H.G. Wells’ The First Men in the Moon (1901), and found these paragraphs:

. . . Then with a click the window flew open. I fell clumsily upon hands and face, and saw for a moment between my black extended fingers our mother earth—a planet in a downward sky.
   We were still very near—Cavor told me the distance was perhaps eight hundred miles and the huge terrestrial disc filled all heaven. But already it was plain to see that the world was a globe. The land below us was in twilight and vague, but westward the vast gray stretches of the Atlantic shone like molten silver under the receding day. I think I recognised the cloud-dimmed coast-lines of France and Spain and the south of England, and then, with a click, the shutter closed again, and I found myself in a state of extraordinary confusion sliding slowly over the smooth glass.
   When at last things settled themselves in my mind again, it seemed quite beyond question that the moon was “down” and under my feet, and that the earth was somewhere away on the level of the horizon—the earth that had been “down” to me and my kindred since the beginning of things.