Occasionally it happens that in one particular class, I feel that I am simply not getting through. This always leads to introspection. So I look for ways to improve my teaching. This time, I’m getting a message that doesn’t make sense to me, although I’ve nodded and promoted it for a number of years.
The issue is student engagement. There is tacit agreement that keeping students interested and engaged is a Good Thing. However, after much thought, I’m starting to think that this emphasis is detrimental to good teaching and learning. As a historian, I also fear it will damage my discipline as, well, a discipline by encouraging a lack of…discipline.
We often face classes of staring, bored-in-advance students waiting to be entertained. There’s an excellent post by Dave Graser on how to flip a zombie. He meant flipping a bored class (assigning static material for outside of class, and using class time for discussion) and connecting it to contemporary issues to engage students. The ideas are exciting, and are behind the whole movement of “flipping” classes.
We also have students who are completely unprepared for the rigors and habits of college-level study. In “Teaching Underprepared Students” on Faculty Focus, Alford and Griffin claim that the solution for such an unready, disengaged group is “relevance, relevance, relevance”. We must figure out where students are, and then bring them to the subject by connecting their experience to our material.
We are also told we can engage them through fun activities, gaming, modern colloquialisms, or pop culture. Dynamic lecturing, new technologies, and new approaches should all be designed to encourage their engagement in our course.
The premise of all this is that teachers have the responsibility to make things “relevant” and exciting, so that students will stay engaged and maintain focus. It is natural to want happy, active students. I want them too! But there are several problems. One is that the current prescription puts the burden of engagement on the instructor rather than the student, leading to dependency. Another is that trying to effectively engage students can lead to a “dumbing down” of one’s discipline.
In short, the current emphasis on student engagement is misguided.
The Instructor’s Role in Engagement
The suggestion, way out there in not-much-research-land, is that engagement equates to student success in the class, presumably in the form of high grades and an advanced level of work.
The problem is that engagement doesn’t do that – engagement makes it interesting to do well if you are already capable of doing well. It cannot ensure doing well if you’re not able to succeed, for whatever reason. I know students who are totally engaged in History, and very enthusiastic, but will not accept instruction in either the discipline or how to express it. Their “teacher” is the History Channel, the things they’ve already read or heard about, and the workings of their own mind, independent of facts and habits of cogent analysis. They are engaged, but cannot construct a coherent historical argument nor back it up with sources.
By the same token, those who do not like the class, or are “disengaged”, may do very well. This is particularly true if they are self-directed and cognizant that they don’t like the class. They push harder to do good work because they want a high grade. Engagement is a side effect, one I encourage by allowing students to pursue their own topics.
I do want them to enjoy their work – that’s important to the quality of the class, providing the opportunity. But it is just, as I’ve indicated before in my post about “student success”, an opportunity. If I don’t provide an opportunity for engagement, by creating a class with both clear direction and some room for exploration, I am not doing my job.
But I cannot force engagement – no one can. And we cannot delude ourselves that we can even track it. I cannot tell whether a student who is looking at me while I lecture or doing the work enthusiastically in her group is learning history or thinking about lunch. Similarly, I can’t assume that the student staring at his desk is not listening and learning. Online, we are deceived by data such as the number and length of log-ins, which is faulty the moment a student leaves to get a sandwich with the lecture screen open, or logs in twelve times a week because they have a nervous disposition.
But these days it is not enough to just provide opportunity and access. If students do not engage, it is my fault, or the fault of the design of my class (my design). They drop because I have not engaged them enough.
I just don’t buy it – teaching and learning doesn’t work that way. I can give them the dance floor and the lessons, but they need to step out there and give the dancing a whirl. Much of their willingness and ability to do so is beyond my control.
The problem of intellectual integrity
In the above articles, it is advised that we should engage students emotionally first. I know a history instructor who does this, and does it beautifully. He starts his lectures with horrific images or stories of human cruelty. Once students are upset about the injustice being portrayed, they want to know the background, so he gets into the facts in the lecture.
At first, I believed I was just not cut out to lecture that way. But after a while I realized it isn’t my storytelling ability – I actually have pedagogical issues with the whole approach. An emotional approach is inherently anti-intellectual. It also leads to emphasizing primary and secondary sources that have an extremist viewpoint. There are moral lessons to be sure, but also a real danger of encouraging a “History Channel”, sensationalist approach to history. I have always had trouble with role-playing as a technique for teaching history for the same reason. Although I am enchanted by such projects as the Titanic re-enactments on Twitter (and now, Jack the Ripper), I cannot bring myself to use such a technique with my students. The gamification of education causes the student to focus on side issues instead of learning historical skills (despite the enthusiastic teachers who assign Civilization IV or promote “what if” alternative history).
We are told that we must make history relevant by continually connecting historical events and ideas to those in contemporary life, and we strive with increasing difficulty to find current affairs with which students are familiar. But again, the approach is misguided. What makes history “relevant” is not related to immediate things, or things that are part of students’ daily lives. And when we emphasize those current connections (having students construct their own or their families’ histories) we give the wrong impression of what the historical field is all about.
A great forgetting
Nicholas Carr (in The Atlantic) reports the extent to which our computer dependence causes us to forget how to do things. He uses flying a plane as an example – accidents these days are the result of human error due to lack of practice, rather than mechanical failure. After reading his article, I used an example in my class when students said that the compass was a significant medieval invention. Yes, it was, but it also led to dependence on the technology – fewer and fewer people would be able to read the sky to know where they were. One of my students came up afterward and told me of a camping trip where they had forgotten their standard compass, and could not figure out how to use the fancy electronic compass on an expensive watch. Only one of them knew where the sun would be, to help find their way.
The final danger is that as we trivialize history to make it relevant, we will forget how to practice skills required for the discipline. Many people are already forgetting how to read a sustained argument, which is essential for understanding many significant historical documents. We are forgetting how to find things in books, how to gloss dense text, and how to take good notes. We are losing the ability to retain information, because we know we can easily look it up.
I used to assure students that they did not need to memorize historical facts, since we could look them up. Now I’m not so sure. To not memorize anything is to allow an important habit of mind to rust into uselessness. Should we really cater to short attention spans with 10-minute videos and breaks in the classroom action every 15 minutes? Perhaps it would be better to teach students how to analyze a document carefully, how to take notes on a document, how to focus on one thing for a while.
As I noted recently on Twitter, our “customer” in public education is society, not the student. Right now, society in this country is on an anti-intellectual bender, defying rationality in its political system and reducing the financial support for higher education. To cater to students’ demand for entertainment and short “chunks” of information is to further the aims of those who would prefer an uneducated public (as I’ve noted about online “providers”). It is usually the goal of historians to encourage an understanding of the past in order to improve the future. And I’m not sure we can do that if we continue bowing to the gods of student engagement.
Recently, an interesting conversation has been going on about educators being tongue-tied and blogging less, feeling they’ve lost their voice. Bonnie Stewart wrote of this after reading a post by Paul Prinsloo, then Jenny Mackness mentioned Bonnie’s post while talking about “conscious incompetence”. I detect a crisis of confidence, but this may be only because I’m experiencing one.
I read much of Jenny’s work – it’s wonderful. I read Bonnie’s posts – they’re wonderful. And yet, I understand the tongue-tied feeling, and the way in which it’s related to the rise of MOOCs. I’ve tried not to blog about MOOCs, but I end up doing it anyway in an effort to make some sense of the role of the original pedagogies, the publicity, and the commercialization. The evolution of the MOOC discussion parallels, not coincidentally, the movement in higher education toward the abandonment of traditional pedagogies, the publicity about college costs and purpose, and the commercialization of educational goals. I’m hopelessly old fashioned – I think the reason for a college education is to become a more educated person.
Whatever silences I may have experienced are the result of disillusionment. Like the others, I used to post more. As my posts became increasingly cranky, I didn’t enjoy writing them as much. I also became frustrated when I began caring whether people read and responded. But I couldn’t not care about it on Twitter, although I tried. Along with the evolution of MOOCs and Higher Ed, Twitter has evolved too among those I follow: it is now primarily a link-sharing network. 140-character commentary on one’s own work has given way to, in some cases, only sharing links to other people’s content. So for me, it’s been less interesting to read, although those info-tweets are still a good resource. When I tweet, and do it in the old style (my own commentary, sans link), there is now little response. More than my blog, Twitter seemed like a conversation. Now it’s more like Diigo than meeting friends at a coffee shop.
The disillusionment is now creeping into how I think about teaching. I am at a stage in my career where I have tried many pedagogical approaches, with varying degrees of success. And this success has been distributed unevenly among students with varying skills and goals. Although I would not consider myself, I hope, to be at the complacent stage of “conscious competence” noted by Jenny, I may have passed it and come back to “conscious incompetence”. Unlike Jenny, however, I do not feel that I do not know enough, but that perhaps I know too much. Unlike Bonnie, I never went for a doctoral degree, though I have experienced the frustrations of academic writing (or, often, feeling I must read articles so I can cite something to back up what I have already experienced in the classroom). There are times when this year feels like my first year of teaching.
While I have not abandoned blogging during all this, and I still feel that I can help others, it’s clear that I need inspiration for a completely different kind of analysis of what I do, or even examining something completely different (poetry? literature? all that history I was going to write about once upon a time?).
For me, I’ll write more when I have begun to think differently.
In terms of social communication and interaction, I am not a stickler. I am not offended by spelling and capitalization errors in emails to me or in social networks.
Student work in my discipline, though, is more formal. I have expectations for clear college-level English writing, with all its rules. That is the communication form of a university education. Proper construction, grammar and spelling (and an advanced vocabulary) make the clear presentation of complex ideas possible. They are required.
I suspect now that in online classes, though, there is a tendency to transfer the informality of other online communications into college work. Because it’s the web, the student default is to communicate informally.
A number of years ago, I changed the way students submit their written work. Having read about and seen the benefits of students being exposed to the work of their peers, I have them submit their writing in a forum rather than privately to me via a test or essay. I assess the work in that forum, but only the student can see his/her own grade. I then point to the best work as examples. At the time I changed over, the literature and anecdotes claimed that students writing “in public” in this way are more careful with their work, because it is being seen by their colleagues rather than just the teacher.
I may be seeing the opposite. Their writing in these assignments is often poor. My colleagues, whom I consulted on this problem, think that I may not be communicating high enough expectations at the beginning of the class. And that may be true – since it’s “in public”, I tend to let them practice, commenting generally on any overall problems of content or construction. I have promised myself to enforce proper writing (through grades – that’s the only “enforcement” we have) earlier in the semester next time.
But I am very interested in defaults when it comes to education, i.e. what do most students think when they use this technology? What do most students automatically do when asked to complete a task? Where do most students get lost? How do most students assume things should be?
And I wonder whether the fact that they are writing in a student forum means that the default is to write informally. Since I provide a fairly rigid structure for the assignments, the informality comes out in the form of sloppiness in vocabulary, spelling and grammar. I have assumed thus far that they don’t have the proper skills to write at the college level. But one colleague assures me that they do, if only my expectations are raised.
I wonder also whether those who demand that written work be submitted in a Word document, rather than inside an LMS assignment box, get a higher level of work. Perhaps a Word document implies greater formality than a submission to the teacher, which implies greater formality than a post in a forum. I do have anecdotal evidence: I asked my on-site class to write a paragraph about an article, typed and submitted on paper. The level of writing they exhibited was higher than in the assignments they submit online.
So I’m not sure to what extent the default of informality is a factor. Do they really not know how to write college-level English, or has no one ever expected it of them? Do they assume that because it’s online it isn’t formal? Or are levels of formality implied by the technology, which they simply follow?
Recently, my college (and many others) has been subjected to demands that we provide solid “authentication” of our online students, in a late and yet hurried attempt to comply with a federal law from the 2008 amendment of the Higher Education Act*.
Ostensibly “student authentication” means somehow proving that the students who take our online classes are the same ones who registered. (This implies that some of them are not, of course – we know that students may have others take classes for them, and that it’s easy to do this online.)
[Image caption: The 14th-c. University of Paris, a hotbed of plagiarism]
We ignore, of course, that this form of cheating also happens in the classroom, where we do not force students to show ID and it’s possible to have a mom take an entire class for her kid. We ignore that our on-site students may have others write their papers for them, or buy papers. Entire degrees have been earned by people who were not the ones enrolled, at least since around the year AD 1150 or so.
We react to these problems nowadays by freaking out and instituting methods right out of George Orwell’s 1984: video cameras that watch students take exams (1), keystroke analysis (1), thumbprint verification (2), double-level passcodes.
The big, easy solution is proposed by those who believe in the true “authentication” provided by Learning Management Systems in conjunction with student enrollment systems (3). When a student applies and is given an ID and password to the enrollment system, we assume they are who they say they are. Then we carry that assumption into an LMS that has data fed to it by the enrollment system.
All other places except the LMS are considered “insecure”, because only the enrollment system-LMS password link is considered proper verification in the absence of the more draconian methods listed above.
I have argued extensively and in multiple venues that the structure of the standard LMS adversely influences the pedagogy of online teaching, especially for novice instructors (4). But the days are clearly coming when we will be forced to use the college-supported LMS and only that system (this is already true for many people at many colleges). We have tried to avoid it at my college by developing various policies through faculty power channels, all of which have been gradually dismissed.
A more reasonable approach than either Big Brother or LMS/enrollment is the argument of pedagogy as verification. Teachers should know a student’s writing style, and be able to recognize when a student varies from it. Frequent assignments, of course, are necessary to do this, and it’s all highly subjective. One way to manage this subjectivity is to implement requirements that faculty offer a certain type and number of assignments, or use particular strategies for assessments (5). One should not give assignments, for example, that can be easily purchased or copied from elsewhere. While I agree that we shouldn’t do this anyway (unless it’s part of analyzing such works), forcing an instructor to change how they do assignments is as bad as forcing them to use the LMS.
The issue here isn’t one of technological appropriation and student verification. It’s an issue of pedagogy and academic freedom. The professor’s right to teach a course with their own methods is clearly undermined by each of the proposed “solutions” to student verification. Gradually American citizens have been deprived of their civil liberties in the name of national security, and college instructors are experiencing the same in the name of student verification. And yet colleges consider these as technical problems, and few faculty are doing anything about it. Many faculty who do not teach online respond to such issues with the same learned helplessness they use to respond to educational technology in general.
The only hope, since this incursion cannot be stopped, is to respond to it like Hollywood responded to the Hays Code (6). The Hays Code, in all of its horrid repression of creative expression, forced movie makers to be even more creative. To get around the rules, they came up with new methods, techniques, and memes. The result was an era of screwball comedies and cool mysteries. Many stuck to the rules but got around the intent of those rules, designed to produce only “wholesome” entertainment.
Of course, they also re-cut great films from before 1930, and the restrictiveness affected film-making until the 1960s.
I am trying to determine an appropriate response to the Hays Code atmosphere that is infecting online teaching. Surely somehow the restrictiveness could lead to more creativity?
* The push actually isn’t the 2008 law, but the recent popularity of MOOCs and the desire of many to have universities accept them for credit. Since they are open courses, often on open systems, the verification issue is more obvious.
(1) Mary Beth Marklein, Colleges try to verify online attendance, USA Today, July 16, 2013.
(2) Adam Vrankulj, Human Recognition Systems to launch platform for student ID and attendance verification, BiometricUpdate.com, June 27, 2013.
(3) Jeffrey L. Bailie and Michael Jortberg, Online Learner Authentication: Verifying the Identity of Online Users, Journal of Online Learning and Teaching, vol. 5, no. 2, June 2009.
(4) Lisa M. Lane, Insidious Pedagogy: How Course Management Systems Impact Teaching, First Monday, vol. 14, no. 10, 27 September 2009.
(5) Justin Ferriman, How to Prevent Cheating in Online Courses, LearnDash, July 11, 2013.
(6) The Hays Code, http://www.artsreformation.com/a001/hays-code.html.
Let’s say that David Wiley is right (and why shouldn’t he be, as king of the open course?). He writes:
Our traditional pedagogies scale poorly beyond 30 or so people because they were developed in the context of teaching 30 or so people. I think it’s safe to assume that, in the same way that our pedagogies-for-30-people degrade as the number of students goes up, pedagogies-for-1000s-of-people degrade as the number of students goes down. Pedagogies for 1000s of people probably function so poorly in the context of 30 people that we’ve never even really tried them before. In other words, we’ve never taught 100,000 people at a time before, and consequently we’ve never developed pedagogies for teaching this many people at once – the last few years just show us trying to shoe-horn pedagogies-for-30 into MOOCs and then publishing articles about the astonishing drop rates.
And I commented there:
Well, some would say that connectivist learning theory is the approach indigenous to the online environment, and it often tends to be attacked in the same breath with MOOCs. But I like the idea that something very new is needed. People keep talking about “scaling up” old pedagogies. Maybe it isn’t about scaling anything up after all, but rather creating something entirely new (maybe not even based on connectivism). Maybe the new model could be something between the one-teacher model and the peer-grading model.
So let’s give it a try. Hmmmm…in between the one-teacher model and the peer-grading model.
I’ve got it!
Start with a team of teachers or professors. They approach the MOOC like writing a textbook – each controls a section that is in their area of expertise. They write the curriculum, assignments, select all materials for that section, record a video if that’s their preferred mode (and only if that’s their preferred mode). And then they moderate the whole class with all the other profs, assessing and providing feedback to students, dividing the workload. We could “scale” based on the number of students – at 30 students per prof, that’s about 33 instructors for a class of 1,000 students.
It’s kind of what we do in our open online class-formerly-known-as-a-SMOOC (or Shhhhmooc, since we like to keep it quiet), the POT Certificate Class, where a different expert moderates discussion each week, based on readings and on their own video introduction to the material. Only this would be bigger.
Think of the employment possibilities, which take care of Jonathan Rees’ concerns (and mine) about doing away with qualified professors when our society needs them the most. More professors employed!
Think of the quality – no work assessed by uneducated peers, but rather by real professors. No “teams” where the professors are relegated to the role of “content experts” while instructional designers and ed techs take the lead – those staff would operate in a clearly supportive role.
Think of the academic freedom – each professor controlling their own content and approach for their section of the class. There would be variety, too, of method, readings, focus.
Think of the connectivism – possible in this environment, but within a more traditionally-organized “course” that can be transferable and assessable, and thus count for credit at real universities. Instructivist, constructivist and connectivist approaches could all be used in the same class.
It’s certainly one possibility.
Here’s what I want my students to do: understand some of the main events and trends in history, get exposed to some of the possible interpretations of those events, learn some historical skills, practice these skills by doing the kinds of things historians do, think and write historically, create their own interpretations, and read some primary sources that contain great ideas that we need even today.
To actually achieve all that, each student would need to do the following tasks each week:
1. Read a textbook or Wikipedia or something that narrates events.
2. Gather, understand, evaluate, cite and use primary sources in their writing in support of their own interpretations.
3. Converse with others to entertain various opinions and interpretations.
4. Read and analyze primary sources assigned by me.
Consider these the four juggling balls of learning history. Each is a slightly different shape, and so easier or harder for a certain individual to handle. But it’s necessary to have them all in the air to achieve some understanding of the discipline. As a semester continues, the sound of dropping balls is common.
And I’ve noticed a pattern.
A- and B-level students will keep them all in the air to some degree, because many of them are intelligent and can strategize the time spent on learning. They will occasionally set one aside, depending on their own talents and interests. But they will juggle all four balls most of the time, completing almost all if not all of the various assignments.
Mid-level (C and high D) students will drop a ball early on, and they’ll drop whichever is most difficult for them, regardless of how many points are involved. Most don’t pick it up again, and if they do, they’ve already forgotten how to juggle that many.
And if we look at the four balls again, there are serious qualitative differences, regardless of which ones are “hard” to work with.
1. Read about the facts: this is what everyone is used to doing through 12 years of schooling, so they expect it and think it’s most important. It’s the easy, round ball. But at college, it’s just the foundation.
2. Understand and use primary sources: this is kind of fun, because they get to discover these on their own and see everyone else’s, so also a fairly round ball. But it takes some work, and some time, to keep it going.
3. Discuss: this is also something they’ve done before, a round ball that’s a little slippery. It’s hard to learn to discuss history critically, but because it seems simple, they’ll set this one aside to deal with the first two when they run short on time – they see it as an extra (many see it in a classroom as an extra also), not real learning.
4. Read and analyze primary sources: a ball with weird stitching, this involves reading English at a level many of them have not achieved, so it’s hard and they put it aside a lot.
So what’s the problem? All students strategize and choose what work they want to do, right?
Take one of the sets of primary sources I assign – selections from Gilgamesh, the Odyssey, Beowulf. All have moral lessons, lessons about becoming an adult that can be very meaningful to a 19-year-old, particularly one who’s male, has a tough relationship with his dad (if he has one around), and isn’t sure how to make his own way in the world.
The high-level 19-year-olds will read and learn, but many of them are from higher socio-economic groups in our society, and have gotten the moral lessons elsewhere anyway (they implicitly understood that Star Wars wasn’t just a sci-fi tale). The mid-level students are the ones who need the lesson, but they’re the most likely to avoid these readings.
But I can’t make those sources the whole class. Without the context (#1 – the facts) teaching sources like these is teaching mythology, or ethics, or literature, not history.
We must have the context. But mid-level students read too poorly to do it all, so they pick up a few facts, find a few sources, write some, and skip the hard reading.
So perhaps my own maturity has encountered their efforts at maturity.
As an inexperienced instructor, I too focused on facts. I was into the textbook and the facts I presented in my lecture, and tested them on that. They wrote essays on those facts, the kind you find in essay banks on the internet. After a few years, I began emphasizing interpretations, then more recently I’ve focused on collecting sources and developing historical writing. Now I’m thinking about morality – what is the importance of history if not to teach lessons?
Yes, this is unpopular. The development of social science since the 1940s says that History should be an objective pursuit, while the continuation of History as a humanities class says we are teaching core values. But the social scientists missed the point of historical study – the very focus of scientific endeavor is guided by the needs of society at that particular moment in time. It’s a natural, Dead-Poets-Society kind of thing to try to help young people by using the universal texts that have helped others shape themselves as individuals, in some cases for thousands of years.
If I emphasize those sources more, it may also solve another problem I have – the current cultural focus on storytelling. I’ve never been able to relate to it, this urge to put everything (and in some cases every minute of one’s life) into a story. But stories that matter have a lesson to them, which is why they’ve been around so long.
Now to get the C and D students to focus on them.