KCRW’s To The Point was on trend as it focused on why college is so expensive and asked whether online learning is becoming the way to get an education (starts at about 8 minutes in).
The recent emphasis on cost and student loan debt in discussions of the worthiness of a university education is not just the result of the economic downturn, though it may seem that way. The discussion is certainly much more centered on cost-benefit analysis now than ever before. We argue, as in this radio program, about why costs are going up, and how students are taking out huge loans that can’t be paid back.
Dana Summers © Tribune Media Services 2011
I work at a community college, the ultimate guarantor of opportunity in getting a college education: low tuition, small class size, no entry requirements. As universities fill up and begin to cancel guarantees for admission (UCSD will be ending its TAG program in 2014, seriously impacting my students), community colleges are clearly what they have always been: the best deal for completing lower division work.
Whatever else you think of the conclusions of her 2010 book DIY U, Anya Kamenetz did a great job reviewing the history of higher education in this country. She notes that there has always been an exclusionary aspect to college – a small percentage of the population attended college at all in the 18th century, and though the percentage is larger now, the traditional divides of race and class still exist in terms of college attendance.
As expenses rise, the wealthy can continue to go to college, the poor must continue to rely on government support and tuition breaks to go to college, and the middle class borrows more and more money to go to college. As with the loans taken out to buy houses larger than they needed, the middle class now complains that the amount they’re borrowing for education isn’t good value and the cost is too high. (I have little sympathy for those borrowing huge amounts to go to Harvard instead of their state college.)
The result is that, as in the 18th and 19th centuries, the exclusionary element of higher education is reasserting itself. The wealthier class will attend the expensive universities, and those who are not wealthy will not be able to afford the major universities. This may increase the value of the B.A., since fewer will be granted by those universities.
The idea of college as an entitlement will fade at these higher levels. As Daniel Luzer notes, people can pay $40,000 at Harvard or at Occidental, but they’ll feel less screwed if they go to Harvard.
The new abundance of for-profit online colleges is testimony to the desire to get a college degree even if the quality of the education is poor (as I believe it is in most of these places, based on my understanding of their canned courses and their faculties’ lack of pedagogical freedom and preparation). Online education is the answer to place and time conflicts, and an answer to finding alternatives to old pedagogies, but it is not a cure for high costs or socio-economic class divisions. (I had hoped to see online education become absorbed into teaching pedagogy in general, and although I work very hard at helping online teachers, I am dismayed at the perpetuation of the distance ed mode as a way of creating standardization and tracking instead of pedagogical innovation.)
As Diane Ravitch notes, the fact that recent changes may lead to fewer university degrees is not a bad thing economically, since only 23% of the jobs opening up in the next few years will require a BA or higher. She says that under the new economy, college should be for those who want to learn. This is another kind of bifurcation, similar to the one noted as part of the 18th and 19th centuries in Kamenetz’s book, between the vocational/trade learners and the college/intellectual learners. Actually, it’s similar to every timeframe – ancient Greece comes to mind.
I do believe that everyone should have the opportunity to go to college – that’s part of my programming as a democratic American. But if the intellectual and financial standards are high, some will not be able to. Colleges should offer scholarships and governments should offer aid (more in grants than loans, but loans should be extremely low interest) for those who are academically excellent but don’t have enough money. This would maintain our goal of allowing anyone with the intellectual talent to go to college, which is good for the country. Instead we’ve gone way past that into schools where many are on scholarship, including the wealthy, and community colleges are expected to focus on “student success” and get everyone transferred to university who wants to transfer.
We should be able to say “you’re just not college material”, but it’s unfortunate that today’s combination of high prices and overenrolled universities may be the only way to do that.
No, I am not a big fan of the James Cameron film, mostly because of the awful script and the inability of the two leads to rise above it.
I have also not been a big fan of tweeting “history”, in historical reenactments done via Twitter. I critiqued the approach heartily almost exactly two years ago.
However, I have now been to two museum exhibits of artifacts from the Titanic (including the current one in San Diego), and tonight I sit here watching Twitter as the Carpathia steams in to pick up survivors, and I have to say, it’s ridiculously riveting to watch the disaster unfold on Twitter.
I have followed both Real-Time Titanic and TitanicVoyage out of the UK publishing house The History Press for the last couple of days. Both have done a great job in creating a suspenseful account of what professional historians like to call (often disdainfully) “popular” history, with Real-Time Titanic using a third-person, journalistic style and TitanicVoyage marking posts by the type of tweeter (#captain, #crew, #thirdclass) for an even more harrowing first-person tone.
As with most of the popular history I’ve enjoyed, I was drawn to a single aspect of the subject, in this case the problem with the Marconi wireless radio just a day before they hit the iceberg.
Apparently, the transmitter went down.
And apparently, they broke the rules to fix it.
Naturally, this sent me on a hunt for rules about Marconi wireless, and the stories of the young men who worked the radios. I found a respected article by Parks Stephenson, a fascinating page on the wireless telegraphists, a website “specialising in radio aspects of the Titanic disaster since 1999”, and a recent article from Atlantic Monthly on the importance of radio to the survivors. None of the creators of this stuff are, to my knowledge, professional historians. They are enthusiasts, of history and radio.
Even if only parts of the stories are true, it is possible that a couple of young men took apart a radio against some sort of policy to make sure it transmitted, and if they hadn’t, then a day later, when Titanic hit an iceberg, they might not have been able to send the distress call.
Conclusions can thus be drawn about the value of mechanical tinkering, about not being afraid to break the rules, and about professional pressure to do your job (hundreds of passenger messages were sent from the ship by radio).
It’s harder to explain, though, the emotional impact of watching it unfold, in “real time” 100 years later, as if it were happening now and we could hear the screams of the people freezing to death yards from the lifeboats. There was a certain War of the Worlds aspect to it, even though Twitter is not really the radio and one couldn’t unknowingly follow the Twitter stream the same way people unknowingly tuned in to Welles’ show.
And again this odd use of Twitter makes me rethink the role of stories and history and the enthusiasts who put it all together, and I’m filled with nothing but respect for their work.
You know, I teach History. (I capitalize the word because I mean the formal academic discipline, not the subject of the History Channel or that category in the New York Times Book Review.)
Back in the 1960s, there was an argument at universities about “relevance”, with student activists claiming that what they were learning was not relevant to their lives. Their work, and that of the faculty who supported them, has given us fields like women’s studies, Native American studies, Asian philosophy, sociology of the family, etc.
One of the courses that was criticized most at the time was Western Civilization. It was seen as promoting Dead White Guys history, leaving out women, poor people, and other elements of society. It was seen as elitist, Euro-centric, and, worst of all, irrelevant. Although the course itself did not die (I’m teaching it), the materials and approaches changed. Women scientists and scholars of the past, social groups that had little say in texts, and even international influences, gradually have worked their way into lectures, textbooks, and scholarly focus. This is taking several generations, despite the fact that social history became a major field in the 1940s. It is only recently that you can find textbooks that put women and ordinary people within the narrative, instead of in boxes and “features” that continue to marginalize (quite literally) their contribution to the flow of history. (Children, by the way, have yet to appear as historical actors – I hope I see that before my career ends, but I doubt it.)
At the same time as this necessary shift has been taking place, the desire for relevance has gradually entered the pedagogy of college classrooms as well, with professors of traditional subjects being encouraged to not only make learning more “active”, but to provide lots of connections to today’s issues and the daily lives of students. We are asked to examine ways in which our subject is applicable to students’ lives, and the implication is that we should adjust our teaching and our subject matter accordingly.
I find it contradictory that just as we are finally creating a history of adult society as a whole, a history that includes people who may not be like ourselves, we are seeing pressure to teach history as more directly connected to the lives of our students. And I say…no.
It’s not that current events aren’t a good way to talk about history. I use them whenever I have a chance to talk with students, in class or out of class. It’s that the purpose of history is not just to understand today’s events; it’s to understand who we are. And by that I don’t mean only who we are today, but who we are in the context of a history that extends back, way back, beyond ourselves.
To teach a person, especially a young person who’s lived 20 years or less, a history that is relevant to their daily lives is, quite literally, short-sighted. Their daily lives are simply too small. Older people have small lives, too, which may be confined by lack of knowledge about things that don’t seem to immediately concern them.
Many of my students go to college, or go back to college, to get real-world skills and a job that pays well. They see those G.E. areas that don’t interest them as hoops to jump through. Their horizons are narrow and many are quite content that way. Only a few are, as we say in our superior professorial tone, “here to learn”.
The “relevance” approach also plays into the idea of strategic learning — students believe they only need to learn things they see as immediately relevant to their lives. That’s very superficial.
To me, the purpose of higher education is to broaden people’s horizons, not work within them.
I don’t want to make history relevant to their lives today, to who they are today – I want to make it relevant to the educated person they wish to become. And if they don’t wish to become educated, then it’s my job to try anyway, to introduce them to ideas beyond themselves, from times in which they haven’t lived and thus find difficult to understand. I want to show them universal ideas, and controversial ideas, and interpretations of the past in an effort to have them understand some of the context in which they live as historical human beings, not just themselves and their small world.
The relevance of that perspective, if they see it at all, will not play out until long after they leave my class, as they enter the flow of history to make their own contribution. And when they look back, they need to see more than themselves.
So I decided to get rid of my textbook for the online and hybrid modern US History class I’m teaching in Spring. My old edition, on which all my multiple-choice quiz questions were based, was from 2004, and just too much has happened since then. Plus I’d have to redo all the quizzes anyway with a new edition. Plus I’m sick of textbooks, even though my students say they want one. (I am putting that issue aside for the moment, though I could write a whole book on how it’s a bad idea to cater to what students think they need rather than what they really need.)
The snowballing began immediately. Without a textbook, two problems appeared.
First, a lot of factual content was now missing from the class. I had liked the labor perspective of my textbook, so that was missing too.
Second, my lectures, which are both written out with images and recorded in audio, must now be gone over thoroughly (not only do I record sections of lecture, but I have been combining these into a downloadable mp3 zip file for each lecture – not very sustainable when one wants to make changes).
So I took a close look at my first lecture, on Reconstruction. And I started adding a bit here and there. And it was boring. I realized I was adding boring stuff, the kind of stuff you’d find in … a textbook. I found that crap for Reconstruction done very well in Wikipedia, so I linked it and moved on.
I got to a lecture on the late 19th century, a section on domestic technology making more work for women. And I got totally sidetracked as I looked for a bit more information. I found out that Project Gutenberg has a bunch of late 19th-century issues of Scientific American online. I got sucked into the Victorian scientific mind, and it was so much fun I forgot I was supposed to be fleshing out lectures with textbook stuff.
Then it dawned on me. Don’t do that. The point is to share with students my interpretation and perspective. I can link to anything factual they don’t know. The thing is, every textbook is an interpretation anyway. Writers put in or leave out what they think is important. My textbook didn’t even have Cesar Chavez in it until the most recent edition.
So I’m focusing instead on a bit of cultural literacy, colored by my judgment of what I think they should know. And as I add a bit here and there (oh yeah, flappers, F. Scott Fitzgerald needs to be here…) I am struck by how arbitrary the content really is.
So edupunks, you know the conclusion here, so let’s all say it together: IT’S NOT ABOUT THE CONTENT.
Even the Student Learning Outcomes for the class aren’t about the content (I know ’cause I wrote them). They’re about thinking historically and developing arguments based on primary source evidence. Sure, the COR (Course Outline of Record) is about content, because the University Wants It So. I can link out to that, sure. Or have them go find it. What is the point of an adventure if you not only give them a map but walk them through every step of the wilderness?
I’ll give them the map and the means to the content, but the rest they must simply do. So I’ll finish up deleting the “your textbook says” passages from the lectures and the audio, record something cool on 19th century indoor sanitation, and focus on students collecting their own resources and creating arguments, which is what the class is really about anyway.
As we see colleges like Rio Salado and for-profits like National, Argosy, and Walden “Universities” create huge online programs, we see more and more courses designed by “teams” and taught by associate faculty/staff. When online learning began, of course, faculty created their own courses and taught them, but there were efficiencies to be had by creating one course and having it be reused by everyone. Publishing companies were quick to start creating their own courses to go with their textbooks, complete with Blackboard cartridges and/or their own learning management systems (I was asked by at least one of them to write a course they could sell). And now Google and Pearson are teaming up with their own “free” LMS (you’ll pay with your personal and marketing information) so that people can “share” courses (in their LMS’s format) under a Creative Commons license (Attribution only, of course, so they can be sold later — it wouldn’t do to have them be Non-Commercial and Share Alike).
Sense my disgust? To me, these are all canned courses, made to last a long time and be consumable by anyone, but more importantly, taught by anyone. We continue to journey, often voluntarily and with enthusiasm, into the Land with No Professor, as detailed elegantly by Alex Wright in his 2005 From ivory tower to academic sweatshop.
So now I hear things like this more than ever:
“So what’s wrong with using the publisher’s PowerP*ints if it’s good stuff?”
“So why shouldn’t I use the course cartridge? I create and run my own discussion boards.”
“They do all this video and stuff better than I do — that keeps students engaged.”
I sputter around, after I get my chin off the floor. What about the de-professionalization of teaching? What about improving those technology skills? Can’t you see it’s all the commercialization of education, and that you are a willing participant?
But today, after thinking about this issue for, oh, fifteen years, it occurred to me what’s really, really wrong with using course cartridges and canned material.
It’s modeling the wrong thing.
Modeling is very important — some say it’s the most important aspect of college teaching. It’s our main job, Stephen Downes says, modeling and demonstrating. A faculty member shared with me only today an exam where he accidentally had two questions that were the same, but one phrased concretely and one conceptually. The students aced the concrete question and failed the conceptual question, though the answer was the same. I suggested that instead of asking them what happened, he instead should model how he developed the question, what he was thinking he’d get in response, and what happened when he saw the completed exams. I suggested this would show the students he’s human and works on these things, share his method with them so they feel included, get him good answers to why it occurred, and review the material, all at the same time. That’s what modeling does.
So what does it mean when we build our courses on material created by someone else?
If we are using it wholesale, out of the can, we are modeling a lack of creativity (in addition to implying that our own view as a discipline expert is kind of beside the point). It’s very difficult to model how historians do history (or chemists do chemistry, or writers write) when we are using someone else’s interpretation or method.
We are also displaying an absence of critical thinking, the kind that we say we want our students to engage in, unless we are using canned content as the start of a discussion about perspectives on that content (I wish that happened a lot, but it doesn’t).
And we’re showing a lack of respect for our own professions as practitioners of both a discipline and of teaching.
If we want to promote a thoughtful citizenry that can make important decisions, work creatively to change what’s wrong, and innovate to make our society better, it’s a pretty poor example to rely on canned material.
Tin Can as Cheese Press cc Chiot's Run
Is there a good way to use all this excellent content? You bet. We can disassemble, disaggregate, reinvent, repurpose, re-create. We can take just what we need (quiz questions, maps, slides) and use it to support our pedagogy. If the publisher doesn’t allow that (I can’t take apart the PowerPoints provided by the publisher of my textbook, for example) we don’t use it. We can learn just a few skills — maybe editing video or doing a screencast or slideshow. Make our own stuff. It won’t look professional, and that’s OK. It will look human, and students will be seeing an example of an instructor who makes his own stuff to get a point across. As with modeling the design of a test question, whatever we make will be saying to students that we cared enough to make it to help them understand.
It will also model that we are professionals with viewpoints created from a deep understanding of our fields, individual viewpoints based on common methods, vocabulary and standards. That’s what we want them to do — use the skills of our discipline to better understand the world, and help improve it. As Richard Kahn notes about Howard Zinn’s argument that professors should share not only their viewpoints with the class but how they developed them:
[F]rom a perspective such as Zinn’s, our job as educators is to invite our classes into the rigorous pursuit and production of the living history of ideas—the truth of our unfolding human process in all of its registers. In this way, we thus also model for students how to begin naming and navigating the various socio-cultural forces coalescing around them, to articulate and argue for their own perspectives on society and its institutions, and so in good faith become democratic citizens capable of exerting their own civic leadership.
We certainly can’t do that with a course cartridge.
I have begun to think it is dangerous to consider the digital, the online, the technological, as separate from the whole.
Partly this thought is a result of attending Martin Weller’s presentation this morning for the Change MOOC, where he presented a wonderful discussion of Digital Scholarship. But my question was whether the attention given to digital scholarship as its own issue doesn’t undermine the effort to have it become mainstream.
This goes beyond the “no significant difference” argument that comes up periodically for online teaching, although for me it started there. At our college, online teaching came about as a “modality” or “mode of delivery”, because it was 1998 and we were trying to offer it as an option for students. We taught ourselves how to teach online, all before learning management systems, best practices, or student learning outcomes. And most of us involved said it was just teaching, doing what we do but adapting it for a different “classroom”. I’m not sure I ever saw the difference between “online education” and “education”, or my “online U.S. history class” and my “U.S. history class”.
It’s not that I don’t acknowledge differences between the relationships, work tasks, and communication we engage in online and those we engage in face-to-face. But I also acknowledge differences between relationships, work tasks, and communication in various face-to-face settings, and it has always been that way. If we say “online community” instead of just “community”, we imply a separate reality that may or may not be the case. Rick Schwier’s presentation in Alec Couros’ EC&I831 last night noted that there are many ways that communities form in online environments, and of course there are many ways that communities form in-person also. Schwier noted that some of us use multiple online personalities, reflecting the in-person reality that you don’t talk the same way to your priest as you do to your coach as you do to your mom as you do to your college president.
A class is a class to me, whether it’s taught under a tree, or in a circle, or over the internet, or by hand-written snail mail.
I’m going to argue for completely ignoring the fact that things are “digital” or “online”. In terms of scholarship (itself a heavily laden word), continuing to fight for the acceptance of “digital scholarship” perpetuates the idea that it is somehow different from “regular scholarship”, that it is not as real. We shouldn’t focus only on the vetting of articles, the false scarcity of information, and the tyranny of for-profit journals, but on behaving as if it’s just scholarship. The same standards (peer review, for example) should apply if you’re going to say it’s real, or scientific, or important, but whether it’s online for free or in a bound pay-walled journal is irrelevant in terms of its value. It’s either good research and useful to me, or it’s crappy research, regardless of format.
This is why I am against the idea of having a “dean of online”, a “coordinator of online education”, or anything else that segregates the digital aspects of education into their own sphere. If we do that, we continue to emphasize its differences. While this may be an advantage up to a point (getting funding for online projects, justifying masters programs in educational technology, paying government employees to create standards and rules for accessibility), it also provides ammunition for those who are resistant to technology and resistant to change. It packages the “technology-enhanced” and the “online” and the “distance ed” into something that is easier to dismiss and de-fund. Such packaging can also discourage innovation by making “online education” a specialization beyond the understanding of ordinary faculty, something that requires strict management by administrators. And that packaging can be literally packaged, by selling “online courses” created by “teams” at for-profit institutions, or “course cartridges” in Blackboard, available for those too controlled or too timid to create their own classes.
The distinction thus gets in the way of professional development, when good faculty feel they are entering a new and scary world instead of just extending something they already do skillfully — teach.
So I’m declaring myself against “digital scholarship”, “online community”, “distance education”, and anything else that applies a special adjective to something wonderful we do as humans but happen to do using a computer.
And no, that doesn’t mean my Facebook “friends” are my “real friends”.