What the Dickens

Funny the things that happen when you’re a historian just trying to read a book, and the ways in which being a historian can get in the way of a good read.

I recently joined the Victorians! forum on Goodreads. People there read Victorian-era books. I’ve read some of those myself, including a few by Charles Dickens. I’ve read The Old Curiosity Shop (which someone spoiled for me, thinking everyone knows the ending), A Christmas Carol (of course), and Hard Times (OK, so I listened to the audiobook). The rest await me, eight or so volumes on the shelf. I figured Oliver Twist would be next.

Then I saw that the Victorian group would be reading Nicholas Nickleby. I’m not much of a joiner, but I thought it might be fun. I’ve never read with a group, never been in a book group or anything. The closest I get is reading the “book group questions” at the back of novels.

But I didn’t have a copy of Nicholas Nickleby. I could have downloaded it from the Internet Archive, but I don’t like reading things on a screen. Ironic, isn’t it? Backlit screens are for work, but reading for pleasure is different. I want a book in my hand, to turn its pages and absorb its history.

My ultimate site for second-hand books is abebooks.com. But I’m very choosy, because to me books are historical objects. My first preference is for a book published during the author’s lifetime, so “publication date ascending” is my go-to sort, ending at 1870, when he died. Whoa, the prices! How could Dickens be fetching such prices? OK, maybe not 1870… But even with an edition from 1920, we’re talking some money.

I knew Mr. Dickens was not getting a cut from my purchase, but I’m not a huge fan of him as a person, despite Simon Callow’s brilliant portrayal of him in Doctor Who. As a historian, I try very hard to separate the creator from his creation. Where would I be with Rousseau if I cared about how he gave up his own children to foundling hospitals? Or if I ignored the brilliance of Thomas Jefferson because I was busy judging how he lived? People are not their ideas. We are all flawed. Good ideas survive long past the person’s lifetime.

Dickens is somewhat different because I went to his house. Not while he was there, of course, but several years ago. It’s a shrine to him. I found this bizarre, because it was really her house. His wife, Catherine. She raised his ten (!) children. She wrote the best-selling menu book What Shall We Have for Dinner? Since 2016, the museum has made a huge effort to include her in the house’s story. The problem is that Mr. Dickens, who was having an affair with the young actress Ellen Ternan, didn’t want his wife anymore. After she discovered the affair in 1858, he turned the situation on her, separated from her, and dissed her all over London. He even tried to get her committed to an asylum. (I’ve begun reading Lillian Nayder’s 2011 biography rehabilitating her reputation. I feel I must.)

I’ve learned that you cannot be a Victorianist without enjoying, or even reveling in, Charles Dickens. Certainly I admire his detailed portrayal of the era, the wonderful characterizations, the turns of phrase that make you chuckle aloud. He wrote so fast, and so much, that I know I haven’t even scratched the surface of his talent. But the hagiographic approach to him annoys me anyway.

So I clicked past the volume of Nicholas Nickleby that said “Works by Charles Dickens” on the spine, because it was part of a collection. I scrolled beyond the $1,000 matched sets of his work. I searched for the small 8vo versions I prefer, but there aren’t any because the novel is too long. I finally found one I liked and ordered it.

I do not hold it against Dickens that I spent so much time looking for a book I hadn’t wanted to read, by an author I personally dislike, just to join a discussion with a group I do not know. It’s just another case of a historian making things more complicated than they need to be.

Writing novels

I read a great quotation today: “You must stay drunk on writing so reality cannot destroy you.” It’s by Ray Bradbury, so of course I had to find out from which book and order it.

Way back in November, which feels like a lifetime ago now, I was feeling stuck in my research. I had applied for a grant for my collection of H. G. Wells’ writings, so it felt like I had no reason to work on it until I knew whether I got it (I didn’t, third time running). The other book on Wells was in the process of being written through a series of scholarly papers presented at conferences, but since I teach full-time plus, and can only present once a year, this was going slowly. I tried reworking the papers into publishable articles, but they didn’t seem to fit what journals were looking for.

So although I was still fascinated by my topic, output was lagging. Nothing felt completable. So on a gloomy November day, I haphazardly began writing a novel based on a character like me, in the process of doing research on H. G. Wells. Over the next four months, I wrote every night between midnight and 1 a.m., until it was done. The writing flowed. I downloaded Scrivener to have a place to write it, and ultimately paid for that program (and I rarely pay for anything). The book seemed to write itself. I edited as I went along, going back to the previous chapters nightly, rearranging and fixing. It was a strange process, since I have long thought of myself as having no imagination. But what came out was pretty good.

I wanted to get it published, so I began reading up on how to do that. I have a former student who’s now an author and writing coach, so I subscribed to her advice. I thought I should join writing groups on Facebook, so I found a few and followed them. I searched out information on writing and writers’ conferences, novel construction, and how to make a good plot. I discovered that I’m a “pantser” (writing by the seat of my pants, with no plan) rather than a plotter.

This conclusion annoyed me. I have for many years prided myself on my organization and planning skills. I had read that it is a good idea to start work on a second novel, while waiting for the zillion rejections on the first. The first book was in the genre “literary fiction”, I discovered, but I had been wanting for some time to write a Victorian mystery, so I started that. My many blog posts on the year 1862 attest to the fun I’ve been having doing it. The pundits said no, you should write in the same genre for several books. Oh well.

Unlike the first book, this one should have been planned out rather than “pantsed”. Mysteries are complex, and my memory is not good (few historians have good memories). I tried mind-mapping, and ended up with Scapple, from the same people as Scrivener, to map the plot. This didn’t work well. I tried to plan, but ended up putting things that I had already written on the map instead, a reverse process of tracking rather than planning.

And I kept looking for groups to join, because I’m entering a new world, so I felt I should. Writers, they say, should hang out with writers, as a community, for support. I am not a joiner. I don’t like groups. And I’ve become annoyed with the process of looking for an agent, which everyone says takes huge amounts of time and brings lots of rejections. I expected rejections from publishers, but agents? The whole publishing thing has been frustrating and mystifying. The advice, the formulas, the sample letters, the filling out of forms that each have their own format, just to get someone to represent you whose fee will ultimately be paid through book sales. I have decided on one plan, anyway: query some agents, send directly to publishers if I can’t find an agent, and self-publish if I can’t get an agent or a publisher.

I do not, like some authors, seek fame or fortune. But I would like some people to read and enjoy my work. If the writing itself adds joy to my life, the seeking of agents and publishers seems to suck it back out. My book(s) are good, I think, but I have learned rather quickly that quality doesn’t matter that much in the publishing world. I’ve learned why Dan Brown and John Grisham sell, and beautifully written works do not.

The pandemic now has millions of would-be novelists putting fingers to keyboards. I have been joined by mobs. Am I a novelist, without a published novel, just because I’m up at night writing novels? Does this graphomania have anything to do with my job? Why am I doing this?

And yet I continue to do it all wrong. I have read that my protagonist must have a horrible flaw, an Achilles heel that causes conflict. Mine merely has a penchant for buying too many books and taking his time thinking things out. The action is supposed to rise, with a status quo brutally disturbed, truths revealed, and a startling conclusion. Mine has likeable characters that mosey along finding things out. There are supposed to be twists, where I’ve led my reader to think one thing and then — shock! — it’s something else. I have some pinkish herrings, but I don’t think I have a single twist. It’s more like a churro than a pretzel. Is it a cozy? Apparently not, because there’s some plot-based sexuality and the person solving the mystery is a professional. But it seems like a cozy to me.

And now, I’m a bit stuck, with most of the mystery written, and no idea how it’s going to end. But when I allow the characters to just mosey along, talking and discovering and living their lives, the world of today utterly disappears. I am in 1862, caught up in the pushing and shoving of the audience at the Surrey Theatre, sensing the activity of overcrowded London, wondering whether it’s worth the trip to travel to the Exhibition in Kensington when the omnibus doesn’t go all the way there. When I let the characters take over, the plot just goes along fine, so I’ve decided to leave it to them. They know what they’re doing. They’ll figure it out.

Maybe when one’s characters become so real they write the story, one really is a novelist. So I’ll stay drunk on writing.

Underpants, cats, and the classroom

I just don’t know about teaching in a classroom.

I know we have to do it; we have no choice. It’s because of the emergency. In seven months, for some of us less, we’ll have to be prepared to teach in the classroom.

Most teachers know that this will be difficult. At times it seems impossible. How can we possibly teach in such a space? Some of us don’t have the training for it. Sure, we’ve hung out in classrooms to meet friends or socialize, but that’s not the same as learning there.

Realistically, how can we get to know our students when they aren’t Zooming from their living rooms, utility rooms, and bedrooms? We can discern so much from the pictures they have on the wall, what items hang in their broom closet, and what space they can (or can’t) claim for themselves.

It will be hard to wrap my head around the learning challenges they face when I can’t see that they’re taking class from their car in a parking lot, or that their parents don’t understand they’re in class and come wandering in wearing only underpants. That’s just not going to happen in the classroom. I won’t be able to hear the noise they’re subjected to when they try to do homework, or see that they enjoy using a different Transformers coffee cup every day.

I won’t have the privilege of meeting their children. No happy waves to the camera, or tugs at the sleeve for a cup of juice. Kids and siblings really give me an idea of who my students are. And pets! They won’t allow pets in a classroom, and you can tell so much about a student from their pet, seeing how they interact with it. We’ll have to abandon that whole Golden Compass thing, with each person having their own familiar. I’m proud that so many cats have learned history from me.

Let’s face it, a physical classroom is a sterile, artificial environment. It smooths away the individuality of our students, with its identical desks and whiteboards. What cool visuals are there in a college classroom? A few maps featuring a divided Germany, a flyer for an event that was over a month ago, some learning cards ordered from a set in 1992. When students are in their own learning spaces, or wherever they can find, we come to know their individuality, and in many cases their creativity.

I’ll miss watching the rearrangement of the cell phone so it leans properly on the bowl of oatmeal, the face turned away from me to yell at someone who’s come in the door, the earnest expression as a student speaks but has forgotten to turn his microphone on. These are all teaching moments.

But I know it can’t last. There’s an enthusiasm for the physical classroom, I realize. And there are people who, in the last few decades, have become real experts in teaching there. I’m of two minds about learning from these utopians. They’re just so enthusiastic about that environment. It’s intimidating. They really believe that learning can happen there, when everyone knows it’s an open question as to whether a physical classroom can ever approximate the online experience of learning.

The pressure on students to answer a question right away, the forbidding of food and drink, the hours spent away from one’s dog. The cognitive load involved in seeing everyone’s lower half. I’m just not sure anyone’s ready for it. But if we must, we must. Teachers are nothing if not resilient.

Thoughts on art and windows

Some of the best things happening at the moment are related to art.

Not being an Instagram aficionado, I read in 1843 magazine that Instagram is the place for art and artists. So I signed on and followed my favorite museums: The Metropolitan Museum of Art in New York, the Uffizi, the Getty, the Ashmolean, the Fitzwilliam, the Victoria and Albert, the National Gallery in London, the National Portrait Gallery, and the Bodleian Library (yes, I know it’s not an art museum).

Each is posting an artwork a day, and most are responding to questions. It’s a delight.

Most of these works, like most of my recent blog posts, are not directly about the current situation. And yet they touch upon the values, knowledge, and sympathies that inform our response to it. So for example, Antonello da Messina’s “St Jerome in His Study” (c. 1474–75), from the National Gallery:



This has been one of my favorite works since I first saw it. I can see myself there at the desk, reading and writing. (This is despite the fact that Jerome would not have liked me at all, and that if you look at it realistically the place would be awfully drafty.)

On Instagram, people replied to this post asking about the birds in front and the lion in the back hall, and National Gallery staff explained about peacocks and wisdom and the story of Jerome and the lion’s thorn. It had over 23,000 views in 13 hours. It’s learning, without a class, but with guidance and expertise, in an interactive environment, with object-based instruction and student-based inquiry. A perfect lesson.

When I first saw the work of Vanessa Bell, it was in an exhibit where the curator, Laura Smith, pointed out how views out of windows relate to women’s experiences. An example is “Interior with a Table” (1921 © Tate):

Some would say that women’s domestic lives are often more isolated than men’s, that over time many have seen the world from behind a window. Given a choice, I often prefer life through a window, but that’s because nature and I have an enigmatic relationship. I want her protected, unpolluted by my footprints. I prefer the idea of wilderness to the idea of conserving nature for human use. But I’ve also been teased, having been camping only twice and told that my idea of “roughing it” is a hotel without room service. Looking out a window, one has the illusion that one is in control of what is beyond it. The wildness and beauty of nature is out there, seen through glass. It can do its own thing, while inside I do mine. A Room of One’s Own must be a Room with a View.

Nowadays many people are supposed to be inside for a while. There have even been art jokes about this self-isolating, and tongue-in-cheek adaptations of artworks. A copywriter named Peter Breuer in Germany posted this on Twitter:

Art can show, articulate, or contrast reality. A person can be put inside, or the outside can be swept of people. The work of Jose Manuel Ballester, which removes the people from art masterpieces, has a new resonance these days. For example, Hieronymus Bosch’s “Garden of Earthly Delights” becomes “El jardín deshabitado” (2007).

I do miss the guy with the flowers.

Art appreciation can also be personal and timely — the Getty has people recreating art masterpieces all over the world. The idea of people using things they already have, to recreate great works, and create new things, shows the best of humanity. And yes, some people are working extra hours in dangerous conditions, while others are unemployed and too worried to create, but it is often at the busiest and worst times that art provides some comfort.

One of my Honors students just finished her final paper for this term. At the beginning, in January, she wanted to write about the history of social media. I assigned her Tom Standage’s Writing on the Wall. As the term progressed, her topic gradually changed. Her paper is titled “Art and Technology as a Mechanism for the Reduction of Isolation”. And it’s quite wonderful.

It is said that the pandemic has proven the necessity of the arts and humanities. It’s true, and not just for the comfort they provide, but for the reminders. How to take the familiar and make it intriguing. How to hold an object in your hand so people see it and ask questions. How to change your perspective by changing your window. How to teach by showing instead of telling.

With the arts and humanities, we are part of a larger experience. We can be inside looking out, instead of outside looking in.

Armchair historian does London Bridge

Historians sometimes reinvent themselves. Or maybe it’s better to say that historians who are very famous, or not famous at all, sometimes reinvent themselves. If you’re very famous (like Simon Schama) you can do whatever you like. If you’re basically unknown (like me) you can also do whatever you like. But if you’re an acknowledged expert about One Big Thing, I suspect you can’t do anything else.

I’ve been working for a few years on making Victorian England my new specialty, and I’m also writing a novel that takes place in 1862. To find good resources and just enjoy the era, I’ve joined some Victorian-focused Facebook groups. People post old photos:

These are London Bridge in 1890, the top one facing north, the other facing south. And oh, the traffic! I’ve read that the bridges were often jammed in the 1860s, and it looks like by 1890 they weren’t any better. Can’t you just imagine yourself trying to cross the street?

Then someone on Facebook asked whether the stairway was still there:


Let’s go look! (And get a load of all the “temporarily closed” on Google. Might want to take some screenshots — this will all be history too, remember).



So I “drop down” my little G droid* and go look.



Hmmm… looks like maybe the top of a stairway? I’d better drop down by the river bank for closer inspection.


A ramp! Much nicer than stairs for lots of people. Plus you don’t have to go out past the church and turn left to get onto the bridge.

The ease of doing this sort of thing amazes me. An armchair traveler in the 19th century could sit at home and read books to go to wonderful places all over the world. I can drop down my G droid anywhere and walk around (well, click around).

I can’t go to England this year, but I can do this. The Google Map images are relatively recent. I can walk down streets. I can look up old maps and then go see what’s different (I do that all the time for research). I can even go to webcams like this one and see places in real time. I can start up Google Earth and see buildings in three dimensions.

All of which beats relying on H. Rider Haggard for my view of the world. But I would like a wingback chair, please.


*I know the droid is called “Pegman”, but why should it be a man? I’m a lot of things, but I’m not a man.

Is it better to be a milkmaid?

In the early 18th century, Lady Mary Wortley Montagu encountered the inoculation process for smallpox while she was living in Turkey. She had lost a brother to the disease, and barely survived it herself. Smallpox in the 18th century was particularly virulent; the CDC says it had an average 30% death rate among those who got it. The inoculation was done using material from the scabs of people who had smallpox, inserted under the skin through a cut. Lady Montagu had her son inoculated while in Turkey, and her daughter in England when she returned in 1721. She then campaigned to popularize the method in Europe.

But Edward Jenner gets all the glory, even today, because he developed a vaccine to replace the inoculation. Inoculation is an old idea — you take some likely material from a living victim of the disease, and put it in a person who hasn’t had it (similar to the convalescent plasma being tried today). But there is always a danger of actually giving the healthy person the disease. A vaccine uses a more benign substance to achieve the same immunity.

I was taught in college that the discovery of the smallpox vaccine came out of the realization that milkmaids didn’t get smallpox. This was apparently because in leaning their cheek on the cow while milking, they acquired cowpox, a very mild disease. The cowpox antibodies protected them from smallpox.

There is a wonderful mythology around Edward Jenner and his 1796 vaccine. It’s based on the story of a milkmaid he met when he was a boy, bragging about her lovely skin that would never be scarred by the pox. Rather like the story of George Washington and the cherry tree, it was created by a later biographer. The real story is more ordinary. Nevertheless, milkmaids didn’t get smallpox.

I learned today that an old vaccine for tuberculosis may have some value in helping with the current virus. The New York Times article says:

The B.C.G. vaccine has an unusual history. It was inspired in the 1800s by the observation that milkmaids did not develop tuberculosis.

The active ingredient of the Bacillus Calmette-Guerin vaccine is Mycobacterium bovis, which was isolated from a cow in 1908. It was developed into a vaccine over the following years, first given to humans in 1921, and is today used in areas with high tuberculosis danger.

Unlike the Jenner/smallpox tale, the story of the BCG inspiration and milkmaids is harder to track down. I can find nothing in a verifiable source, other than popular/journalistic websites copying each other’s phrasing, to confirm the background of the discovery. The closest I could get was a 2008 Russian article saying that milkmaids and others close to cattle could carry a latent infection that gives a false positive on a tuberculin skin test. I couldn’t find anything even saying that milkmaids didn’t get tuberculosis. Is this a case of people just mistaking one apocryphal tale for another?

But we know that Mycobacterium bovis is to Mycobacterium tuberculosis as cowpox is to smallpox — a similar, less deadly disease that protects against a more virulent one. The Lancet warns that it might not be effective against the current virus, and there might not be enough available, but it has helped people with similar respiratory diseases recover faster.

I’d still like to think that milkmaids would be safer.


Fake Fitzgerald or Real Telegram from 1918

For the last couple of weeks, this has been going around. Like a lot of people, I was hoping it was real.

I thought it was real not because I know Fitzgerald, but because I know Hemingway. But it wasn’t real. It was from McSweeney’s.

According to Esquire, we all wanted this to be real because we’re all looking for reassurance that we can survive this crisis. I’m a historian, so in addition to being embarrassed because I was tricked, I wanted to then find real documents that could accomplish the same thing.

Here’s what I found:

Western Union Telegram signed [Hon.] Edward Rainey, [San Francisco Mayor’s Office], to Hon. Harvey Neilson, Santa Barbara, California Mayor’s Office.

1918 October 31.

“If you have not already taken such steps strongly urge universal wearing of masks to prevent or check influenza epidemic. Cases here rose steadily from two hundred per day October Six to over two thousand October Twenty-fifth. On Twentieth, some our people wore masks, on Twenty-first on recommendation Health Board, Mayor [James] Rolph issued proclamation calling for everybody wearing masks. Nearly whole population complied. Red Cross backed with advertising and two days later supervisors passed ordinance requiring wearing by all persons. Practically whole population in masks. By Twenty-third, five days after first masks appeared or three days after use became general new cases dropped approximately fifty percent. Deaths at peak 194, yesterday only 103, many of these having been sick for some days. New cases decreasing daily. Health authorities say San Francisco probably get through with far less distress and death than Eastern cities which started with about our figures but keep on going up while ours went down. All agree masks largely responsible. Sending this for your information because I have seen the whole terrible effect of epidemic here, because masks have saved untold suffering and many deaths, and because Santa Barbara my old home city. Portland, Seattle following San Francisco lead.”

Source: Online Archive of California, UCLA Special Collections
Collection of personal narratives, manuscripts and ephemera about the 1918-1919 influenza pandemic, 1917-1923


So. Wear your masks, Californians, and know that it will end.

Going online the 19th-20th century way

Given current discussions about moving on-site classes into an online environment as an emergency response, I’m going to go out on a limb and say: don’t worry about pedagogy.

Current Learning Management Systems have simplified online instruction to an extreme degree. Despite decades of research and experience in various active learning online techniques, our current systems still encourage the basic triumvirate: reading/information, discussion, and assessment.

An on-site class already has assigned readings, so these can be moved online if necessary. Discussion that took place in class can be placed on a discussion board if desired, or one can easily just answer questions through messaging or email. And assessment can be moved online using the built-in quizzes. The fact is that most on-site teachers moving online will focus on these three things. They will need to learn how to upload their readings, set up a discussion board, and retype their assessments into the system if they don’t know how to use a tool like Respondus (or are on a Mac). The fourth big task will be learning to use the gradebook, which is more difficult than handing back a piece of paper with a grade and comments.

These preparations will take far more time than classroom teaching, and will already cause massive workload issues. These concerns will be added to student confusion over assignments and grades, basic worry about managing one’s professional tasks in isolation, and trying to ensure enough income and food to keep one’s family comfortable.

In other words, pedagogy, universal design, student equity, and other considerations will become instantly irrelevant.

Synchronous lectures, especially at a college without teaching assistants, may be well beyond instructors’ capabilities, and if not, the stress of doing it may be overwhelming. Even discussion boards, which require design of some sort, may be beyond the abilities of many instructors while everything else is going on. That would leave us with: read the textbook, take the test.

This might sound terrible, but it isn’t. It’s the way distance learning was done in its earliest incarnations, through the mail and televised courses. It is the way the earliest online classes were conducted, over email. In other words, read/inform and test was considered adequate for education for many years, particularly for subjects not requiring laboratory work. And it’s the pattern for every publisher’s course cartridge I’ve ever seen.

Certainly it’s ironic that after so many decades of active learning, student-centered techniques, just-in-time education, constructivism, and inquiry-based models, a single emergency would mean a return to the old style.

If this emergency goes on for a long time, several things could happen. The following could well drop out of college: students who only have their phone for a computer, students who cannot motivate themselves and stay on task, students who require personal contact with others in order to learn, students who require that teachers take a deep personal interest in their lives, students who can only learn in groups of other students, and students who are unprepared for college work. While all of these problems can be dealt with effectively in the online environment, there will be no time to work on the methods for doing so. This is particularly the case for faculty who’ve never taught online, or posted more than a minimum of resources in an LMS.

The remaining students, those who can do well under these read-and-test conditions, will be more similar to students of the 19th and 20th centuries. During these years, educational reformers were striving to make sure there were spots available for students who were capable but financially disadvantaged. The ability to read and test, and ask good questions, was essential to student success. It might prove interesting if it were that way again.


Rigor or workload?

It appears as though next summer, our 8-week classes will all be offered in a 6-week format. I am not in favor of this. At first it seems like a good idea: students finish faster, faculty are done sooner (avoiding the problem of immediately starting fall afterward). Until one thinks about rigor.

Rigor is a word frequently argued about in academia but rarely defined. It has something to do with the academic integrity of a course. If, for example, it is a college class but one assigned a third-grade textbook, there would be a problem with rigor. Our course approval process requires a list of possible textbooks and possible assignments, ostensibly to ensure appropriate rigor.

Years ago, our historians were asked to offer 4-week intersession classes in winter, and we said no. Our senior historian at the time went in with the dean to argue that rigor could not be maintained. Our classes, as approved for transfer to university, were 16 weeks long. We could not maintain standards, particularly with students rushing through reading and writing at 4 times the speed. It’s a community college. Some students had trouble reading college-level work. Forcing them to do it faster would be disastrous for their success and our teaching. We won the argument, because at the time there seemed to be a general understanding that History requires extensive reading and writing, and by extension considerable thought. This requires more time.

As time has passed, however, the expectations for the level of student achievement have changed. The emphasis on “student success” has led not only to a natural and predictable inflation of grades, but a much broader acceptance of less rigor. The available textbooks for a college course are written at a much lower level, and have many needlessly large illustrative images and lots of white space. Courses are approved for General Education transfer that are more “fun” and have significantly lower expectations of learning ability. The push for what is called “equity” has led to an utter rejection of everything from the Western canon to any novel written by a white male, with the result that many longer works with universal themes are no longer considered appropriate for assignment.

So rigor, in the sense of the expected level of the completed work, has declined across the board. But rigor is not necessarily workload. When I was at university, lower-division courses required a full textbook and several ancillary texts. When I was a teaching assistant at the University of California, Santa Barbara, only one ancillary text was required, but it was an extensive secondary work. Students chose the shortest one available, of course. All the same, the workload (number of pages to read, number of papers to write, length of those papers) was significantly higher.

If rigor is being decreased, but achievement in the discipline continues to be expected, then workload should increase. If the level of what one is given, and is expected to perform, is lower, then increased quantity provides more opportunities for practice. Increasing workload thus implies a dedication to higher rigor, even if the standard is not attained.

But we must also consider the contemporary dedication to the affective well-being of the student. This dovetails with the culture at large. It is accepted that people who are distressed cannot study well. Mental illness, overloaded schedules, and job and family demands are seen as reasonable justifications for being unable to perform what would have been considered university-level work a generation ago. Before, such students would have been encouraged to leave university and find a job for which they were suited. Now they are held onto like precious gems who, without university, would have no chance in life. It’s our fault, not theirs, if they don’t succeed.

The university transfer approval process requires that community college rigor matches that of university. This has not been a major issue. University rigor has also declined. No one checks very carefully, anyway. But approval also requires that the same rigor and workload approved at the course level apply to every class section that is offered. So if I offer a 16-week class that normally requires a full textbook, five primary source readings each week, and two assignments per week, the expectation is that this will be compressed but identical in shorter-term formats.

While this may seem to be a way to maintain rigor while increasing workload in the short term, it doesn’t actually work that way. I have already adapted several of my classes to the 8-week (double-speed) format, both for summer classes and to provide a “back-to-back” option for completing a two-course sequence in a single semester. Enrollment in these is excellent: students do indeed appreciate completing the course faster, and they drop less often. I have long felt that 16 weeks is too long anyway. But I do not demand exactly the same number of assignments as I do of my 16-week students. The primary source boards dropped from 16 to 8 to keep their focus on the weekly unit. Everything else, however, I simply doubled up: textbook readings are two chapters a week, primary sources are ten instead of five.

Six weeks presents a slightly different challenge. I cannot simply eliminate the Age of Discovery, or the American Empire. These are required to be “covered” to be approved for university transfer. Thus the workload must increase. I suspect that the transition from eight weeks to six may be a tipping point for rigor and workload.

What happens when one increases the workload beyond the expectations and desires of the students? First, they just don’t do it. They simply won’t be exposed to the facts, interpretations, or ideas. They’ll skip the Age of Discovery. Second, they will not enroll in the first place, or will drop the class in favor of classes with lighter workloads. Our History department has seen a consistent slide in enrollments over the last few years. While we know that this is partly because the national reputation of disciplines like History is in decline (as is intellectualism in general), there is also a greater dedication to rigor in our discipline, a dedication often misinterpreted as “white” and elitist. (In truth, historiography has foregrounded the agency and obstacles of people challenged by mainstream culture since the 1940s.) The college now offers far easier courses in “culture” that count for the same requirement, and thus compete directly for enrollment.

Simply compressing my 8-week classes into 6 weeks, I fear, won’t work. The workload would be well beyond the expectations of students, and they would leave, drop, or fail. While failing used to be acceptable, we are now expected to prevent it at all costs. So some tasks need to be removed. It cannot be topics or “coverage”, so it must be reading, assessment, and writing. I am leaning toward removing the textbook reading: it could be considered “boring”, many students have difficulty reading, and the facts are not as important as their “doing history” (my lectures may have all the facts they need). Removing textbook reading also reduces the number of quiz questions, or perhaps eliminates the quizzes themselves. The writing assignments I am unwilling to change, but I want to give them space and time: a few days without anything else due, so students can focus on them.

The grading weights would change accordingly, so that each of the remaining tasks would be worth substantially more. That is unfortunate, because my usual method is to have many little assignments, so that no one assignment is worth a lot. That way students can learn, practice, improve. So in addition to impacting rigor and workload, my pedagogy will also be affected. I do not, however, see another way.

Musings on equity and pedagogy

I am considering the implications of mapping student learning challenges and solutions like this:

Universality in education implies open opportunity and access. It also has cultural implications, that certain philosophies, methods, and subjects are worth exploring because they lead to knowledge in a larger sense.

Communality in education implies that the people being educated, and the educators, belong to particular groups or communities. These may include professional or personal groupings, but they may also include other groups favored by social scientists: socio-economic, racial, religious, etc.

Individuality in education implies that learning is basically an individual activity, and that the effectiveness of pedagogy depends on the individual. It also implies that teaching must take into account individual talents, proclivities, and abilities. It underpins ideas of individualization and personalization of learning materials and methods.

At many institutions, those who privilege communality are increasing awareness of the influence of groupings on the lives of students. Some of this has taken the form of movements for student equity. In its most useful form, communality makes teachers more aware of the challenges students may face because of their identification within a particular social group. In its extreme form, communality mandates particular forms of speech, opposes ideas that are seen to represent the dominant culture, and publicly shames individuals who don’t engage in groupthink.

Within this construct, universality is seen as tainted. Access is not enough, because those who are disadvantaged by their group membership cannot benefit equally from that access. In its most useful form, this can cause the culture to acknowledge those deficiencies and seek to remedy them, through awareness and policy designed to offset limitations. In its most toxic form, it denies the universality of ideas, and engages in cultural relativism.

Individuality is similarly tainted, because it does not consider the pressures resulting from group membership. Individuals must acknowledge, and in many cases are expected to represent, the group. A person who seeks to overcome the limitations of their group is seen as a traitor to the group. This may occur even when the group is defined externally, and the individual does not identify as a member.

The image above centers the individual within the group, and wraps both individuals and communal groups into the universality of humanity. There are similarities to the philosophy of stoicism. Using stoic concepts of individual and universal, the diagram might look like this:

Creative Commons licensed BY-NC-ND, John Danaher

In stoicism, the goal is to connect the individual at the center with the universal ideas that supersede the social context. The social world is an intermediate place where it is hard to tell noise from signal. Although humans are automatically attuned to the social world, they must overcome its noise to find a deeper connection with the universal. Communality is highly changeable, redefining itself frequently. Today’s communality is not the same as yesterday’s.

In education, a central goal is to connect the individual to larger schemes of human knowledge. Pedagogy’s purpose is to assist the individual in using information, creating knowledge, and ultimately gaining wisdom. Settling for merely typical attainment may attach the individual to the communal in a way that prevents higher knowledge.