I hear this a lot, and not always in relation to History as an academic discipline: “You can’t change the past!” While you can change what you are doing right now (perhaps) and therefore alter the future (maybe), the past is immutable. Right? Wrong.
Whether it’s personal history or academic history, the past is variable. We know from recent studies of memory that even our personal memories may be faulty, whether we believe that we shook hands with Mickey Mouse or re-construct our memories in therapy sessions. The entire field of neuroplasticity is based on evidence that the brain itself (and its corresponding shifts in emotion and behavior) changes over time.
The idea that the past is unchangeable derives from the definition of “the past” as consisting of externally verifiable, objective events. These events occurred – there is no way to undo them.
And yet, which things do we remember from that past? Why these things and not others? In which emotional contexts have we placed them? Those contexts influence our interpretation of the past.
For example, I remember my art teacher in high school drawing a large brain and a small brain on the chalkboard, and explaining that girls can’t do art because their brains are smaller. Throughout my life I have blamed him and his sexism for instilling in me a lack of confidence – to this day I do not draw. And yet, he may have thought he was being funny, or I may have had no confidence in my abilities already. I can certainly remember other times when someone in authority told me I was no good at something, and my response was to prove them wrong. I may well be justifying my lack of visual art skills with this “memory”, putting it in that context for emotional reasons.
Context is also crucial in History as an academic discipline. We cannot change that the Bastille was stormed in July 1789. But how do we look at that event? What do we think it meant? Do we see it differently when our own society is in chaos, and barriers are being torn down, than we do in more placid eras?
Historians know that the purpose of doing history is not to rehash and memorize facts. It is to interpret those facts in the context of our own time. The entire field of historiography is built around the importance of setting historical studies within the timeframe of the historian. That’s why we now have histories of women, or poverty, or empire. What we’re interested in, and therefore what we look for in the past, changes over time. That changes “the past” in a very real sense.
It is no coincidence that we re-evaluate the British Empire when we are struggling with our own, or that we romanticise the counter-culture of the 1960s during a time of heightened materialism, or that we see a revival of World War II books and movies that struggle with the heroism of war during a time when our nation’s effectiveness in war is being questioned.
So of course we can change the past, both the academic past and the personal past. For our own pasts, we can recast our memories, reinterpret them, endow them with different meanings (in many ways, the entire field of psychotherapy is about doing exactly that). For the academic past, we can examine those “objective” events in light of our own interests and understandings. And that, after all, is what doing History is all about.
Having read yet another tweet complaining about the lack of connection between what’s taught in classrooms and what’s needed in the workplace, I posted my own:
It hit a nerve with a number of people.
One of my connections wrote:
This is exactly it. My classes in History are General Education. My goal is to help foster an educated citizenry, not an efficient workforce.
And I am not promoting the other narratives either:
One popular narrative is that we should change education because it is irrelevant to the innovations of the future. In this story, today’s entrepreneurs are lauded, the guys who dropped out of school or didn’t like college because their classes were boring. Their success supposedly proves the irrelevance of our educational system. What it actually does is attest to the role of genius, luck, opportunity and money.
Another narrative links the use of electronic technologies, particularly the web, to making education more relevant. While I am deeply tied to the use of web technologies for teaching, I have not been able to buy into the idea that either the openness of the web or the marketplace of ideas is sufficient for providing a full education.
It’s the same reason I can’t accept the narrative that automated online courses and xMOOCs with peer or graduate student feedback schemes are a substitute for what we’re doing well in our colleges.
The final narrative I reject is the one that says that we live in a post-industrial world, so that many of the skills we used to value (the ability to follow an extended argument, or write coherent prose, or articulate ethics) are no longer needed. We need these skills, not because they are going to be applied somewhere specific, but because they change who we are and make us better people.
Knowledge that transforms students, that turns them into growing, learning, educated people, is by necessity broad and deep. What’s learned in college may have very little application to the specific tasks of a student’s future job.
Education changes people’s broad perspective of the world. It trains habits of mind, not technicians.
Related posts: Relevance in an Age of Forgetting
Occasionally it happens that in one particular class, I feel that I am simply not getting through. This always leads to introspection. So I look for ways to improve my teaching. This time, I’m getting a message that doesn’t make sense to me, although I’ve nodded and promoted it for a number of years.
The issue is student engagement. There is tacit agreement that keeping students interested and engaged is a Good Thing. However, after much thought, I’m starting to think that this emphasis is detrimental to good teaching and learning. As a historian, I also fear it will damage my discipline as, well, a discipline by encouraging a lack of…discipline.
We often face classes of staring, bored-in-advance students waiting to be entertained. There’s an excellent post by Dave Graser on how to flip a zombie. He meant flipping a bored class (assigning static material for out of class, and using class for discussion) and connecting it to contemporary issues to engage students. The ideas are exciting, and are behind the whole movement of “flipping” classes.
We also have students who are completely unprepared for the rigors and habits of college-level study. Alford and Griffin on Faculty Focus: Teaching Underprepared Students, claim that the solution for such an unready, disengaged group is “relevance, relevance, relevance”. We must figure out where students are, and then bring them to the subject through connecting their experience to our material.
We are also told we can engage them through fun activities, gaming, modern colloquialisms, or pop culture. Dynamic lecturing, new technologies, new approaches, should all be designed to encourage their engagement in our course.
The premise of all this is that teachers have the responsibility to make things “relevant” and exciting, so that students will stay engaged and maintain focus. It is natural to want happy, active students. I want them too! But there are several problems. One is that the current prescription puts the burden of engagement on the instructor rather than the student, leading to dependency. Another is that trying to effectively engage students can lead to a “dumbing down” of one’s discipline.
In short, the current emphasis on student engagement is misguided.
The Instructor’s Role in Engagement
The suggestion, way out there in not-much-research-land, is that engagement equates to student success in the class, presumably in the form of high grades and an advanced level of work.
The problem is that engagement doesn’t do that – engagement makes it interesting to do well if you are already capable of doing well. It cannot ensure doing well if you’re not able to succeed, for whatever reason. I know students who are totally engaged in History, and very enthusiastic, but will not accept instruction in either the discipline or how to express it. Their “teacher” is the History Channel, the things they’ve already read or heard about, and the workings of their own mind, independent of facts and habits of cogent analysis. They are engaged, but cannot construct a coherent historical argument nor back it up with sources.
By the same token, those who do not like the class, or are “disengaged”, may do very well. This is particularly true if they are self-directed and cognizant that they don’t like the class. They push harder to do good work because they want a high grade. Engagement is a side effect, one I encourage by allowing students to pursue their own topics.
I do want them to enjoy their work – that’s important to the quality of the class. But engagement is just, as I’ve indicated before in my post about “student success”, an opportunity I provide. If I don’t provide an opportunity for engagement, by creating a class with both clear direction and some room for exploration, I am not doing my job.
But I cannot force engagement – no one can. And we cannot delude ourselves that we can even track it. I cannot tell whether a student who is looking at me while I lecture or doing the work enthusiastically in her group is learning history or thinking about lunch. Similarly, I can’t assume that the student staring at his desk is not listening and learning. Online, we are deceived by data such as the number and length of log-ins, which is faulty the moment a student leaves to get a sandwich with the lecture screen open, or logs in twelve times a week because they have a nervous disposition.
But these days it is not enough to just provide opportunity and access. If students do not engage, it is my fault, or the fault of my class design. They drop because I have not engaged them enough.
I just don’t buy it – teaching and learning don’t work that way. I can give them the dance floor and the lessons, but they need to do the dancing by stepping out there and giving it a whirl. Much of their willingness and ability to do so is beyond my control.
The problem of intellectual integrity
In the above articles, it is advised that we should engage students emotionally first. I know a history instructor who does this, and does it beautifully. He starts his lectures with horrific images or stories of human cruelty. Once students are upset about the injustice being portrayed, they want to know the background, so he gets into the facts in the lecture.
At first, I believed I was just not cut out to lecture that way. But after a while I realized it isn’t my storytelling ability – I actually have pedagogical issues with the whole approach. An emotional approach is inherently anti-intellectual. It also leads to emphasizing primary and secondary sources that have an extremist viewpoint. There are moral lessons to be sure, but also a real danger of encouraging a “History Channel”, sensationalist approach to history. I have always had trouble with role-playing as a technique for teaching history for the same reason. Although I am enchanted by such projects as the Titanic re-enactments on Twitter (and now, Jack the Ripper), I cannot bring myself to use such a technique with my students. The gamification of education causes the student to focus on side issues instead of learning historical skills (despite the enthusiastic teachers who assign Civilization IV or promote “what if” alternative history).
We are told that we must make history relevant by continually connecting historical events and ideas to those in contemporary life, and we strive with increasing difficulty to find current affairs with which students are familiar. But again, the approach is misguided. What makes history “relevant” is not related to immediate things, or things that are part of students’ daily lives. And when we emphasize those current connections (having students construct their family histories, or their own) we give the wrong impression of what the historical field is all about.
A great forgetting
Nicholas Carr (in The Atlantic) reports the extent to which our computer dependence causes us to forget how to do things. He uses flying a plane as an example – accidents these days are the result of human error due to lack of practice, rather than mechanical failure. After reading his article, I used an example in my class when students said that the compass was a significant medieval invention. Yes, it was, but it also led to dependence on the technology – fewer and fewer people would be able to read the sky to know where they were. One of my students came up afterward and told me of a camping trip where they had forgotten their standard compass, and could not figure out how to use the fancy electronic compass on an expensive watch. Only one of them knew where the sun would be, to help find their way.
The final danger is that as we trivialize history to make it relevant, we will forget how to practice skills required for the discipline. Many people are already forgetting how to read a sustained argument, which is essential for understanding many significant historical documents. We are forgetting how to find things in books, how to gloss dense text, and how to take good notes. We are losing the ability to retain information, because we know we can easily look it up.
I used to assure students that they did not need to memorize historical facts, since we could look them up. Now I’m not so sure. To not memorize anything is to allow an important habit of mind to rust into uselessness. Should we really cater to short attention spans with 10-minute videos and breaks in the classroom action every 15 minutes? Perhaps it would be better to teach students how to analyze a document carefully, how to take notes on a document, how to focus on one thing for a while.
As I noted recently on Twitter, our “customer” in public education is society, not the student. Right now, society in this country is on an anti-intellectual bender, defying rationality in its political system and reducing the financial support for higher education. To cater to students’ demand for entertainment and short “chunks” of information is to further the aims of those who would prefer an uneducated public (as I’ve noted about online “providers”). It is usually the goal of historians to encourage an understanding of the past in order to improve the future. And I’m not sure we can do that if we continue bowing to the gods of student engagement.
Yes, I also write about History (capitalized – meaning the discipline, not the hobby).
And as a historian, I teach my students the difference between primary sources and secondary sources.
A primary source is one created at the time one is studying. A secondary source is created later, about the subject one is studying.
This isn’t as clear-cut as it sounds.
Take, for example, the Scopes trial of 1925, in which John Scopes, a high school biology teacher, was tried for teaching evolution, which was against the law. The transcripts of the trial are a primary source. The Wikipedia article about the trial is a secondary source. Ray Ginger’s book “Six Days or Forever? Tennessee v. John Thomas Scopes” is a secondary source.
Or is it? It was written in 1958. It is a secondary source if you’re studying the Scopes trial. But if you’re studying the arguments about evolution during the 1950s, then Ginger’s book is a primary source.
This is the problem with using primary sources to teach history. What era or subject are they primary for?
So I was looking at Net Texts, since I’m always seeking good resources, and went to their section on Revolution and the New Nation, which mentioned primary source documents. I noticed that some of them weren’t primary, at least not for the American Revolution – several were images from the 1820s supposedly portraying events from the Revolution. They seemed to get around the issue by putting “1820s” as the end date of the unit, but I felt that was deceptive. So I went to the source, which they claimed was the National Archives, assuming that site would be more reliable in this regard.
It wasn’t. Throughout the National Archives teaching resources, primary sources were emphasized but many weren’t primary to the issue being studied. In many places they were fine, but in some they were later interpretations or were presented with no date so that students couldn’t tell whether they were primary or not.
(From the Archives, a portrayal of General George Washington and a Committee of Congress at Valley Forge, winter 1777–78. Copy of an engraving after W. H. Powell, published 1866 — Powell was born 24 years after Washington died.)
The year after Washington’s death, Parson Weems published the first hagiographic account of the general and President, not only introducing the cherry tree story but glorifying Washington’s character in general. In the early years of the U.S., such moral tales were important to establishing an American identity and ideology. William Henry Powell’s mid-19th century work is more a part of this tradition than of the actual events of the Revolutionary War.
In other words, Powell’s painting is a primary source for the 19th century, not 1777-78.
This is not an idle point. Consider, for a moment, movies. Movies are frequently used to teach history. I used to show Hollywood film clips to illustrate historical events if I felt they were portrayed accurately. But over time I realized that these were being seen as primary sources for the era being portrayed, rather than the era the film was made, and that I was encouraging bad history by showing them in that context.
In other words, the American movie Reds (1981) says more about our revision of communism in the 1980s than it does the Russian Revolution. Going back further, the film Battleship Potemkin (1925) says more about the 1920s than the 1905 mutiny it portrays, and Birth of a Nation is about attitudes in 1915 rather than during Reconstruction. The Battle of the Bulge (1965) is about our ideas of WWII as seen from the 1960s. Lawrence of Arabia (1962 – set during WWI), Charge of the Light Brigade (1968, set during the Crimean War), Zulu Dawn (1979 – set in 1879 South Africa) are about the British dealing with the history of their Empire in the 60s and 70s, once it was lost. Breaker Morant (1980, set during Boer War) and Gallipoli (1981, set during the Great War) show Australian culture coming of age during the 80s. Same thing with films from every nation.
That’s why there are so many remakes (such as Mutiny on the Bounty in 1935, 1962, 1984). Each generation takes the stories and interprets them in light of current issues and concerns. It’s a significant contextual distinction.
And it’s very tricky now with the web. NPR recently reported that the first web page (Tim Berners-Lee’s demo page from 1990) has been lost. Very few web pages have what we could call provenance (in fact, the term for web purposes seems to refer to accuracy rather than history).
Original web creations will be difficult to date, and thus difficult to use as primary sources. A meme like Selleck Waterfall Sandwich will be difficult to use. One has to rely on secondary sources to date its origin at 2010. But when I search on Google and set dates further back, it pops up in web pages referring to the meme as far back as 2003.
How will we reliably date less popular items? Anyone using Google to check a student paper for plagiarism knows that tons of text has been copied from website to website – it is impossible to tell the origin of many passages of writing. There is often no “date of publication”.
Not that I’m putting Selleck Waterfall Sandwich on a par with a painting of George Washington or a note written by Thomas Jefferson.
But you get the idea.
KCRW’s To The Point joined the trend, focusing on why college is so expensive and asking whether online learning is becoming the way to get an education (starts at about 8 minutes in).
The recent emphasis on cost and student loan debt in discussing the worthiness of a university education is not just the result of the economic depression, though it may seem that way. The discussion is certainly much more centered around cost-benefit analysis now than ever before. We argue, as in this radio program, about why costs are going up, and how students are taking out huge loans that can’t be paid back.
(Cartoon by Dana Summers, © Tribune Media Services, 2011)
I work at a community college, the ultimate guarantor of opportunity in getting a college education: low tuition, small class size, no entry requirements. As universities fill up and begin to cancel guarantees for admission (UCSD will be ending its TAG program in 2014, seriously impacting my students) community colleges are clearly what they have always been: the best deal for completing lower division work.
Whatever else you think of the conclusions of her 2010 book DIY U, Anya Kamenetz did a great job reviewing the history of higher education in this country. She notes that there has always been an exclusionary aspect to college – a small percentage of the population attended college at all in the 18th century, and though larger now the traditional divides of race and class still exist in terms of college attendance.
As expenses rise, the wealthy can continue to go to college, the poor must continue to rely on government support and tuition breaks to go to college, and the middle class borrows more and more money to go to college. As with the loans undertaken to buy houses that were larger than they needed, the middle class now complains that the amount they’re borrowing for education isn’t good value and the cost is too high (I have little sympathy for those borrowing huge amounts to go to Harvard instead of their state college.)
The result is that, as in the 18th and 19th century, the exclusionary element of higher education is reasserting itself. The wealthier class will attend the expensive universities, and those who are not wealthy will not be able to afford to go to the major universities. This may increase the value of the B.A., since fewer will be given by those universities.
The idea of college as an entitlement will fade at these higher levels. As Daniel Luzer notes, people can pay $40,000 at Harvard or at Occidental, but they’ll feel less screwed if they go to Harvard.
The new abundance of for-profit online colleges is testimony to the desire to get a college degree even if the quality of the education is poor (as I believe it is in most of these places, based on my understanding of their canned courses and lack of pedagogical freedom and preparation in their faculties). Online education is the answer to place and time conflicts, and an answer to finding alternatives to old pedagogies, but it is not a cure for high costs or socio-economic class divisions. (I had hoped to see online education become absorbed into teaching pedagogy in general, and although I work very hard at helping online teachers, I am dismayed at the perpetuation of the distance ed mode as a way of creating standardization and tracking instead of pedagogical innovation.)
As Diane Ravitch notes, the fact that recent changes may lead to fewer university degrees is not a bad thing economically, since only 23% of the jobs opening up in the next few years will require a BA or higher. She says under the new economy, college should be for those who want to learn. This is another kind of bifurcation, similar to that noted as part of the 18th and 19th century in Kamenetz’s book, between the vocational/trade learners and the college/intellectual learners. Actually, it’s similar to every timeframe – ancient Greece comes to mind.
I do believe that everyone should have the opportunity to go to college – that’s part of my programming as a democratic American. But if the intellectual and financial standards are high, some will not be able to. Colleges should offer scholarships and governments should offer aid (more in grants than loans, but loans should be extremely low interest) for those who are academically excellent but don’t have enough money. This would maintain our goal of allowing anyone with the intellectual talent to go to college, which is good for the country. Instead we’ve gone way past that into schools where many are on scholarship, including the wealthy, and community colleges are expected to focus on “student success” and get everyone transferred to university who wants to transfer.
We should be able to say “you’re just not college material”, but it’s unfortunate that today’s combination of high prices and overenrolled universities may be the only way to do that.
No, I am not a big fan of the James Cameron film, mostly because of the awful script and the inability of the two leads to rise above it.
I have also not been a big fan of tweeting “history”, in historical re-enactments done via Twitter. I critiqued the approach heartily almost exactly two years ago.
However, I have now been to two museum exhibits of artifacts from the Titanic (including the current one in San Diego), and tonight I sit here watching Twitter as the Carpathia steams in to pick up survivors, and I have to say, it’s ridiculously riveting to watch the disaster unfold on Twitter.
I have followed both Real-Time Titanic and TitanicVoyage, out of the UK publishing house The History Press, for the last couple of days. Both have done a great job in creating a suspenseful account of what professional historians (often disdainfully) like to call “popular” history, with Real-Time Titanic using a third-person, journalistic style and TitanicVoyage marking posts by the type of tweeter (#captain, #crew, #thirdclass) for an even more harrowing first-person tone.
As with most of the popular history I’ve enjoyed, I was drawn to a single aspect of the subject, in this case the problem with the Marconi wireless radio just a day before they hit the iceberg.
Apparently, the transmitter went down.
And apparently, they broke the rules to fix it.
Naturally, this sent me on a hunt for rules about Marconi wireless, and the stories of the young men who worked the radios. I found a respected article by Parks Stephenson, a fascinating page on the wireless telegraphists, a website “specialising in radio aspects of the Titanic disaster since 1999“, and a recent article from Atlantic Monthly on the importance of radio to the survivors. None of the creators of this stuff are, to my knowledge, professional historians. They are enthusiasts, of history and radio.
Even if only parts of the stories are true, it is possible that a couple of young men took apart a radio against some sort of policy to make sure it transmitted, and if they hadn’t then a day later when Titanic hit an iceberg they might not have been able to send the distress call.
Conclusions can thus be made about the value of mechanical tinkering, and not being afraid to break the rules, and professional pressure to do your job (hundreds of passenger-sent messages were sent from the ship by radio).
It’s harder to explain, though, the emotional impact of watching it unfold, in “real time” 100 years later, as if it were happening now and we could hear the screams of the people freezing to death yards from the lifeboats. There was a certain War of the Worlds aspect to it, even though Twitter is not really the radio and one couldn’t unknowingly follow the Twitter stream the same way people unknowingly tuned in to Welles’ show.
And again this odd use of Twitter makes me rethink the role of stories and history and the enthusiasts who put it all together, and I’m filled with nothing but respect for their work.