Victorian Studies

To begin my work on Victorian England, I need to examine the field of Victorian Studies. Unlike History, area studies of all kinds are newer disciplines, and I often have difficulty figuring out what they’re trying to do. Every discipline has its own methodology and its own literature – that’s what makes it a discipline. Now that I’m moving away from working with online pedagogy and educational technology, it’s necessary to make sure I am aware of the milieu in which I’m operating.

Although by no means intended as an introduction to the subject, Martin Hewitt’s “Victorian Studies: problems and prospects?” from 2001 has nevertheless provided me a good entrée. Noting the expansion of books and graduate programs in Victorian Studies, the article critiques the lack of interdisciplinarity on which the field is supposedly based. Hewitt notes several other concerns, including historians uncomfortable with the word “Victorian” and the dominance of presentist topics (gender, women, imperialism) that use the Victorian era just for examples. But a bigger issue is the fact that historians and literary scholars have not really combined in an interdisciplinary way, even while conference panels may be multi-disciplinary. Apparently the most comfortable and useful pathway for Victorian Studies has been the “cultural history” of the 1980s and 90s, although it took a while to shake off the perception that it was elitist. This was interdisciplinary because it used methods like Foucault’s analysis of culture (p. 141).

But cultural history does not create a disciplinary field that is consistent and has an “agreed focus” (p. 142). The result is that there is no common scholarship, and Hewitt notes a lack of “key texts” (p. 144). This helps me because I couldn’t figure out what those key texts were when I was looking for a way into the historiography of Victorian Studies. Hewitt sees the historiography as fragmented, limiting the impact of important works. Previous historical works also tend to limit biography to a few “semi-canonical” men, such as Carlyle, Mill, and Ruskin (p. 145).

In literary studies, Victorian Studies has become a “sub-field”, and the many journals of Victorian Studies tend to be dominated by literary analysis. When I subscribed to the journal Victorian Studies and to Nineteenth-Century Studies, I noticed immediately that the editors were almost all from university English departments. As I read the articles, I kept rolling my eyes as the authors seemed to plumb the text of Victorian novels for meanings that were obscure, presentist, imaginative, or all three. I found most striking Hewitt’s point that such studies focus on the reading as it takes place in the current reader’s timeframe (ours). The articles use the present tense, as if the characters in the novels are here with us now, while a historical article would use the past tense (p. 148).

History, Hewitt notes, is constructive and materialistic, while literary and cultural studies are idealistic and interpretive (p. 149) – I would say “imaginative”. Focusing on the text ignores the history. This is why I dissuade students from constructing theses that seem to show the text as possessing causation (“propaganda led people to hate the enemy”) – we cannot prove such a thesis historically, although it is possible to prove that the text might have been meant to do something, or that something might have caused (or influenced, more likely) a work to be written.  

Hewitt’s agenda includes developing a solid historiography, and creating new research based on larger ideas. His prescription for historians (he’s one too) is to broaden the field to include more ideas and their production, combining more approaches. Since the context and environment of the era is embedded in the text, the process is one of sense-making. At least, I think that’s what he’s saying – he loses me when he talks about “syncretic hermeneutics” (p. 153). His focus seems to be on creating intertwinings of text and practices to create something truly interdisciplinary, where the “text becomes means rather than object” and the focus is on the impact (and reproduction) of the text (p. 154).

In determining which texts have been underutilized, Hewitt notes many that I am engaged with, including essays, lectures, and newspapers – forms of communication not intended to be high culture. His ideal Victorianist study combines elements from history, anthropology, ethnography, literary criticism, sociology, and art history (p. 155).

I believe that the goal here is to provide a more well-rounded, thorough, and (by implication) realistic understanding of the Victorian era. I am at a loss, however, to explain why it is necessary to do this through the methodologies of disciplines other than history. I don’t think I realized that I am a history snob until I began reading Victorian Studies journal, and finding myself enjoying it while at the same time becoming exasperated with the lack of evidence beyond popular texts. The field strikes me as similar to steampunk: an enjoyable romp through Victoriana to fulfill present (and presentist) needs by drawing imaginative connections. (I feel the same way about the new genre of “creative non-fiction”, about which I will write more later.) I in no way believe that the historical method can provide as accurate a portrayal as going back in a time machine, but history is adaptable enough to take on the perspectives, if not the methods, of other disciplines and use them effectively. I think I would have understood a plan for a new Victorian History better than I understand a plan for a more cogent Victorian Studies. 

  

Hewitt, M. (2001). “Victorian Studies: problems and prospects?” Journal of Victorian Culture, 6(1), 137–161.

Why journalists write such good history books

In an only slightly different life, I would have been a journalist. As a significantly younger person, I followed Watergate closely, reading All The President’s Men (as well as Haldeman’s The Ends of Power), and attending a lecture by John Dean given at my college. I saved all the Newsweek articles on Patty Hearst, and all my newspaper clippings of the 1975 World Series, in a laundry basket. I became copy editor and then editor of my high school newspaper, writing articles and proofing galleys and protesting the truancy laws. I majored in English at UCLA.

I switched to History due to an odd series of events involving a high school counselor who didn’t tell me when the AP English test was offered, a fascination with the musical 1776, and a brilliant course I took with historian Joyce Appleby. I never took a journalism class after high school, but instead trained as a historian. My degrees are in History, and my certificates are in Education.

For the past decade or so, I’ve studied the evolution of the web as a teaching tool, and in particular online pedagogy. I’ve experienced the typewriter, the internet, the web, as customer and creator. I’ve used rotary dial phones, dial-up modems, and cell phones. Even as I experienced digital history unfolding (or perhaps because I experienced it), I have “reported” my findings rather than studying the phenomena as a historian. After years of being the person in the room saying “but this has all happened before”, I have recently returned to the study of history as my primary task. And yet, the history books I most enjoy reading now are not written by historians. They’re written by journalists.

Most of these works are about the history of technology, which was my specialty in grad school (although I studied medieval, not modern, technology). Tom Standage (The Guardian, The Economist) published his brilliant The Victorian Internet in 1998, the same year I began teaching online. The book became a reference for me, a way to connect the present (in which I was frantically operating) with the past I understood. In 2003, a student gave me a copy of Empires of Light, by Jill Jonnes (New York Times), about Tesla, Edison, and Westinghouse. It was another reminder that so many things (commercial competition, technological advancement, bloody-minded geniuses) are not new. Atlantic and NY Times writer Nicholas Carr’s The Big Switch: Rewiring the World, from Edison to Google (2008) was a delight, part of a body of his work that supported my gut instinct that the web was making us stupid and that our dependency on computers had a serious dark side (that was the same year that saw the rise of MOOCs).

Steven Johnson (Wired, NY Times) wrote The Ghost Map, a 2006 book so clear and brilliant in its discussion of the cholera epidemic in London that I assigned both the book and his TED talk to students.

Few of these people have history degrees. Johnson’s are in semiotics and English lit. Carr, also literature. Standage has a degree in engineering and computer science from Oxford. Interestingly for those looking at women writers, Jonnes is the only one with a PhD in history, obtained after she was a published writer for the New York Times.

They don’t pretend to be historians. Standage notes his specialty is “the use of historical analogy in science, technology and business writing”. Johnson just calls himself a writer, and Wikipedia says the same about Carr.  Jonnes uses no noun to describe herself despite her degree.

With such a trend in evidence, it didn’t surprise me to read in Bloomberg Businessweek that New York Times reporter Cade Metz is writing a history of artificial intelligence.

Normally I’m quite the snob about non-historians doing history. For example, we have a number of departments at the college that offer classes with the word “history” in the title, but are taught by language or music instructors. The individuals teaching them are quite wonderful, but they aren’t doing history. They’re teaching cultural heritage, typically without reference to historical methodology. Their technique is usually narrative, rather than the development of a thesis to be proven with evidence. Similarly, the profusion of “history” days and months for groups of subcultures (women, African Americans, etc.) are all heritage-based, although they claim to be doing history in order to show they are on the right side of history, which is another thing entirely.

Such storytelling, however uncomfortable I may be with it as a historian, has always been important to human beings. It has become increasingly significant in recent years, as competing narratives are created to defend particular points of view. To the aggrieved, for example, all of human history may be a story of grievances. Historians study historiography, the “schools” of history formed by different viewpoints (such as Marxist history, or the Annales school, or the New Left). Historians tend to recognize these varying perspectives, though not always. Competing perspectives are inherent in the discipline. They’re a feature, not a bug. Historians know there is no “one” history, but rather histories told for varied reasons. That’s why historical evidence is so important — it is needed to support one’s perspective, to ground it in fact.

Neither historian nor journalist, English prof Marshall McLuhan provided the foundation for many of the works mentioned here in The Gutenberg Galaxy (1962).

So what do journalists and historians have in common? Both observe the world carefully, and note patterns. Both access the past for context. Both rely on sources, tell stories, create narrative, highlight key people and events. But they divide on method. A journalist may consult only a few sources, or a very broad selection of sources, and need not engage in exhaustive research among scholarly articles or primary documents. They may rely on scholars’ interpretations, since they themselves are not engaging in scholarship. Journalists may use more literary techniques to draw the reader in, to make clever connections. (These techniques have actually changed the way history is written by historians, as publishers now seek a broader audience for history books in an age where fewer people purchase books at all.)

Most importantly, journalists need not provide a new perspective beyond the telling of an interesting story. The originality lies in the creative telling of a tale, rather than in the development of an argument that must be proven with facts. Perhaps this is why the articles on Patty Hearst did not lead me to research the Hearst family, or terrorism, or cults. I never got into the history of baseball. I watched Watergate happening but did not feel an urge to research previous presidential scandals, or violations of the constitution, or the composition of the White House staff. The stories were complete in themselves.

So when a journalist turns a hand to history, it has the potential to be more lively, and more immediate. Liberties are taken (almost into “creative non-fiction”) with personalities, like those of Tesla or John Snow. “Bringing history alive” (a phrase that makes me cringe, with its implication of imposed drama) need not involve engaging in historical scholarship, but it does create the all-important analogies that Tom Standage mentions. These books bring facts to light, and connections between past and present. Without the work of writers like Standage and Johnson, it is unlikely I would have found the connections between what I was doing with my teaching, and what others have done in the past. Even if I discovered these connections while defending history in the various MOOCs in which I was enrolled, I might not have realized my own potential to write about them.

Skilled journalists make the reader feel engaged in the story, even if their thesis is nothing more than, “look at this cool series of events that happened”. Because they live in our time, their reasons for looking into the past are the same as those of historians: to find insights about ourselves in the present. With such similar goals, it isn’t surprising that so many good books featuring history are written by journalists.

Paradigm shift? best practice? perhaps not

In searching for information about distance learning theory that might inform my research into 19th century distance education, I came upon this article (thanks to Jenny Mackness):

Lee, K. (2016). A paradigm shift rhetoric and theory-practice gap in online higher education: a case study of an open university. In S. Cranmer, N. B. Dohn, M. de Laat, T. Ryberg, & J. A. Sime (Eds.), Proceedings of the 10th International Conference on Networked Learning 2016 (pp. 251–259).

The author is Kyungmee Lee, Lecturer in Technology Enhanced Learning at Lancaster University. Her paper focuses on the discrepancy between social constructivist learning theories and the actual instructional designs for online classes used at places like the Open University.

I have long suspected that the maniacal adoption of collaborative pedagogy was based on very little evidence of efficacy. Instead, in my experiences studying connectivism and constructionist theory, I was aware that such methods were lauded by techno-utopians, many of whom weren’t actually teaching first-year college students. Studies demonstrated student satisfaction with the methods, but not better grades.

Lee notes that despite the insistence on a “paradigm shift” from “old” methods to collaboration and constructivism, resistance implies that the paradigm never shifted at all. The shift has been purely theoretical, and not adopted in practice, where most online classes do not use these techniques. While many studies have chided instructors and designers, implicitly or explicitly, for resisting the new and superior methods, this one subversively questions whether the methods really are better.

Hercules and Bacchus presenting libations while Atë, goddess of mischief and deception, flies above (1778)

It’s an interesting approach, questioning the assumption of a change in the field. While the study does not get into the “new” pedagogies per se, it implies that they may not be better, or may be better under certain conditions, or that people who really want a paradigm shift think they can just declare one. This last is most interesting to me, because it raises the question: cui bono?

Many of us assume that if there’s a new technique or tool, it might be better than what we’re doing, or at least be better than older options. We ask questions like: will this work for my students? is this an improvement on what I’m doing? We give it a try.

With a tool, it may occur to us that adopting it benefits the company providing it, especially if we pay for it. If we don’t pay, it’s become increasingly obvious that the “freemium” model either benefits the startup, or that our data becomes the product being traded, a la Facebook.

But perhaps with a method, we fail to ask these questions. To whose benefit is it that I adopt this method? The well-meaning researchers and their careers, certainly. But our students? If so, which students? Does it benefit me as an instructor? How?

I have attempted many different pedagogical models in my 28 years of teaching, both in the classroom and online. None have been inherently better than the others. Each method has advantages and disadvantages. Whatever is trendy, though, is considered a paradigm shift, or a “best practice”. Right now, for example, the Online Education Initiative, which is moving to control all online classes at California community colleges, insists that collaboration among students is required as part of its online course approval rubric. There is little research to support this requirement.

If we consider that a paradigm shift has occurred, we are much more comfortable requiring such methods, as if they were based on research instead of theory and some successful practice. By questioning whether the basic principles are sound, whether there is any support for “best practices”, we give ourselves much more choice. We also give ourselves the opportunity to examine past practices, not as outmoded or disproven (which in most cases they are not), but as possibilities for current and future practice.

What is required?

Although I have stepped back quite a bit from my reading and research in online education, I still have a Google Alert set, and still receive and examine recent articles, when I can stomach it.

The dictatorial tone of both articles in my inbox today is the subject here.

The first, The Necessary Knowledge for Online Education: Teaching and Learning to Produce Knowledge (Ferreira et al.), reports a study of 27 educators, all in the field of Education, to determine what knowledge (this sort of article usually says “skills”) is needed to teach online. What struck me was the premise, stated in the abstract:

Online education requires pedagogical mediation and the skills and competencies to work with technological resources which promote interaction, collaboration, and co-learning.

Well, that’s just not true. Online education does not require an emphasis on collaboration – rather it is one possible approach. It is also entirely possible to create online education that personalizes the class through different kinds of approaches to content, or emphasizes at every step the learner’s relationship with the material rather than through colleagues and “co-learning”. I understand that the current phase in online education pushes the collaborative approach, but it certainly is not “required”.

The second article, Online Continuing and Professional Education: Current Varieties and Best Practices (Schroeder et al.), features this idea:

Teaching online requires a team, not just an individual. While face-to-face teaching may be a singular effort, online teaching includes a multitude of technical, pedagogical, environmental, and associated considerations that requires a team of experts.

That’s not true either. I have never had a “team”, but rather developed not only my own pedagogical and technological skills, but helped design a “Pedagogy First” paradigm wherein the individual instructor’s strengths were basic to course design. I realize that these days there are more resources (among them instructional designers with advanced degrees and research articles produced by candidates for PhDs in Education), but those do not, by some reverse design, indicate that these things are required.

As the literature has developed over the last decade, much of it written by people who are not teachers and have not taught online, the “options” have become “requirements”, and the possibilities have narrowed into “best practices” (best for whom?) and necessary elements. This creates downward pressure on the creativity of teaching online, stultifying the field and cookie-cuttering our courses. Faculty who want students to focus on content are forced to develop “interactions” which oppose their own pedagogy, common sense, and experience. Helpless in a context they did not create, and for which they are pedagogically unsuited, they are told that not only is the social learning method “required”, but that a team is “required” to help them.

Did I mention I’d stepped back from reading the newest in online ed? There’s a reason for that.

The LMS and the End of Information Literacy

Having worked with the Canvas system deeply for several months, and then worked closely with an online student who needed help at various levels, I have concluded that the underlying philosophy of Canvas (and OEI in California) is to remove the information literacy requirement for online learning.

Canvas’ defaults encourage a simplistic, linear course with step-by-step navigation for all tasks. Features that would let instructors customize extensively, have students collaborate, and make grading meaningful are conspicuously missing. When such features are requested in the community, the requests succeed mainly when they adhere to the basic philosophy of simplicity.

The implication is that any depth must exist within the instructional materials accessed through the system. At the top level, the environment in which the student must work, the danger of cognitive overload is mitigated by providing as few options as possible. It is a clear return to 4th grade “computerized learning”, the kind that takes place in a lab. Pupils sit at stations, and the software guides them step by step, requiring as few button presses as possible. With visual and touch-screen interfaces, this is now even easier. Complete a small task, get instant feedback, press ‘Next’.

The fact that such interfaces prevent branching, distributed, or complex learning is considered to be a feature, not a bug. All information is “chunked” for easy understanding and assessment.

Back in the early 1990s, we were all excited about the open web and its possibilities for the exploration of human information. We were able to look up things that had previously been inaccessible, and we developed pedagogies designed to use that easy-to-access information. To do so meant designing our own pathways through the material, to help students turn their study into knowledge.

With the coming of the read-write web, it became possible for users to interact with the software in online spaces. IRC and other forms of synchronous chat had been available, but required some technical knowledge. Web-based interactions, which required little technical understanding, became simpler and easier to use. With the development of private web spaces like Facebook and Google, companies came to control the interfaces, simplifying even further what we needed to know to use the tools, and pruning the content we could access easily.

Although at first there had been plans to teach information literacy as a school requirement, this trend has tapered off because of such ease of use. In many places, information literacy is still articulated as a goal, but is not implemented in any meaningful way. The result has been students who have no idea what to type into Google when asked to find, for example, information about American imperialism in the late 19th century. We already are aware of the challenges of distinguishing between good and bad sources of information, and want students to distinguish between a scholarly source and a pop culture source. But instead of increasing skills, the fear of bad websites has led to banning certain things, through filters in grade schools and syllabus dictates in college. (When I encouraged my student to use Wikipedia to find primary sources, she was aghast, telling me it had been drilled into her head for years never to use Wikipedia for school.)

Increasing numbers of students have no conception of what constitutes a website, or a link, or a browser. With no understanding of how to navigate a complex web page or database, students have become unable to comfortably navigate a complex online course, regardless of the LMS. It is possible that only students with more sophisticated web skills are able to benefit from the learning pathways we design. As instructional designers remove more and more of our responsibility to construct these pathways ourselves, the “best practices” encourage computerized learning goals such as chunking, instant feedback, and tightly controlled pathways at the expense of discovery, integration and community.

While I would prefer, for the sake of our democratic society, a metacognitive awareness of the control exerted on us by our tools, I have to admit the temptation to follow the larger trend. We have successfully trained an entire generation not to think while using an electronic tool. We may no longer be able to expect them to do so for the sake of their education.


Results from 131 students

It’s only taken me 17 years of teaching online to develop a student survey that is both broad enough to cover all my classes and narrow enough to give me good feedback.

Just sharing a few things here. Total students responding was 131. Most students responding were passing the class.

[Chart: class elements results]

They still like my lectures the most, and textbook readings the least. They still like posting their own primary sources.

[Chart: additional elements results]

Hours and hours of work on that Help Page and – no surprise given what they email me about – they don’t use it. They do like seeing the whole course on one page (so I won’t switch to showing only the current week, an option in Moodle) and they like my comments and the audio of my lectures (I’ll read it for you!). The None category is a little depressing….

[Chart: engagement results]

The engagement results are clear, too. They like the lectures and posting their own source. They don’t like reading much. But they really liked what I added this year – the completion checkboxes on the Moodle page. I will be sad to lose that. But note: they like seeing each other’s work, but don’t require contact with other students. I’ve been saying that for a while – collaboration and teamwork in online classes are not always needed. For my class, engagement with the work and posting what you find may be taking the place of “interaction” among students. They can learn from each other without necessarily engaging in forums in response to each other’s posts.

Retention and the affective domain

I hate emotions. Yes, I know that’s an emotional thing to say, but they get in the way of learning more than any other thing.

I struggle to understand why students drop online classes. I’m not getting much help from the research. Compared to the traditional classroom, we know that online students get lower marks (Fonolahi 2014). But we’re also thinking that they need greater social interaction (Boston et al 2009), want more direct instruction and feedback (Gaytan 2015), and apparently do not need to experience a locus of control (Cui 2015).

Couple this with the article in the Atlantic on Starbucks helping baristas go to college. What’s working to keep students enrolled, the article points out, isn’t just the money for tuition. The contracting university, ASU, has in turn contracted with a company to provide personalized monitoring. Students are called and encouraged to stay on track. Most need assistance with their confidence as much as with working their way through the bureaucracy.

The undercurrent here is emotions, the affective domain. I suspect a great deal depends on how students feel. If you feel comfortable in a class, you stay in the class.

On-campus classes often have a built-in comfort/affective boost, because students have been in that environment for 12 years of school. We remove that when we go online – I understand that. And we assume that because students communicate with each other and with parts of society on cell phones and computers, that the environment is familiar, but we know that for learning it really isn’t.

So we worry about the social online environment. Will students feel isolated? Will they feel they aren’t really in a class?

But now I have to add: will they feel it’s too much work? will they feel they don’t have time to do this class? will they feel that other classes are easier than mine so they’ll drop mine?

My classes are friendly, I’m friendly, I reach out, I email when people are struggling. I use their names. I track all these students. I contact them. I do not phone them or go to their house, though (I’ve had an admin suggest that, but there are many reasons students take online classes, and one is privacy).

Since this post in 2009, my drop-out rate has increased. I have done surveys on why they drop, and asked them. In response I’ve reduced the workload, especially the number of writing assignments. I’ve considered publisher cartridges and programs. I’ve even considered switching from Moodle to Blackboard or Canvas, but if I switch, then my very best pedagogy (the History Lab) won’t work because the LMS won’t let me batch grade posts.

And then I start to wonder, why is all the pressure about retention put on faculty? Some newer studies suggest that retention in online breadth-requirement classes (like mine) is 64%, about 10% lower than on-site (Wladis et al 2015). If I had more history majors, it would be closer to 81%. All the studies acknowledge “external factors” (reading level, GPA, online class experience, jobs, family support, etc.), and yet all the advice is that faculty should do things to make the classes more inviting, more engaging, more relevant to students’ current lives regardless of the subject (Park and Choi 2009).

Could the institution help? Yes, I think so, but how they could help would be controversial:

1. Create a barrier. Students attempting to enroll in an online class would have to do something to force understanding of the self-direction and commitment required. Perhaps this would be an interactive tutorial, but it should be something that keeps popping up throughout the semester as a reminder. This might help students feel like this will be hard, this will be a challenge, this will require effort.

2. Have the college contact them. Not the teacher, the college. According to this article, at the University of West Georgia, retention increased when faculty reported students they couldn’t reach to academic advisers, who tracked them down and offered cheerleading services.

I have more research to do, of course (I’m stashing all my Diigo bookmarks here). Many of the studies are based on student surveys, and I know from faculty evaluations that these seemingly “objective” surveys are usually based on how students feel when they respond to them. Some of the research (Croxton 2014) is tying together student satisfaction and retention in terms of theory. In a world where some students want trigger warnings and controls on free speech in order to protect their feelings, any focus on how students feel, and how their feelings affect their decision to drop the class, would be helpful.

Challenges of the Experiment

In my last post I detailed my experiment for Fall, wherein I will teach one section of modern U.S. History online using a publisher’s course package, adding only my own discussion topics (four) and writing assignments (five). All other presentation materials and assessments will derive from the package. The class will take place in Blackboard, our fully supported college system.

There are challenges already. The package is organized by chapters, yet chapters cannot be assigned individually inside Blackboard. I have "linked" my Pearson package to the Blackboard class, but all this means is that there is a button inside Blackboard that takes you out to the Pearson site. (Supposedly the Blackboard gradebook will reflect the Pearson grades – I've "linked" that too.)

But that's not the real challenge – it's the material. For each chapter there is a long list of resources: document activities, image activities, map activities, "closer look" features. Since each of these has at least one question attached (I assume that's the "activity" – there's nothing else active here), I assumed these were multiple-choice questions, graded automatically. It turns out most of them are "essay" questions, all of low quality (e.g., "What is X talking about in this document?"), that I would have to grade myself. I've assigned over a dozen for each chapter. Besides, the whole idea of the experiment was to use their pedagogy as much as possible instead of mine.

So now I've spent many, many hours creating multiple-choice questions, one for each document or image. Because I'm an experienced teacher, my questions are good and require critical thinking even though they're multiple-choice. That in itself may undermine the experiment.

The other (huge) challenge is the quality of the materials. Not only are the questions stupid, but the items themselves do not contain full citations. Some are credited only as copyright "Pearson". Many do not name a photographer, or say only "Library of Congress". Some don't even have a date! The system exposes just enough of the code that I can partly correct some of these by adding words to the titles. But there are audio files with no lyrics or transcripts. And, worst of all, the primary-source video clips (Edison's footage of Annie Oakley, footage of the Rough Riders) are low-resolution and look terrible; I could find better copies of the same footage on the Internet Archive. There are also typographical errors in the transcripts and in the titles and descriptions of the sources.

The instructor interface requires a lot of deep drilling to get anything done, and the system persists in showing items I have supposedly made invisible because I won't be using them. It does, however, propagate any changes I make across the system.

Clearly MyHistoryLab is just a book supplement, rather than a full course cartridge, and yes, I expected much more. REVEL, their new, more interactive program, only became available yesterday, so I can’t use that yet because I don’t have time to play with it and make assignments. Stuck with MyHistoryLab for this semester, I can only hope this will be a semblance of the experiment I planned.

Rhizo15: But I like content!

As Rhizo15 leaves the week about content (obviously I was not paying attention), I feel obligated to be the voice at the back of the (now empty) room saying, “But I like content!”

I love it. I'm the kid who sat on the floor reading the encyclopedia. I'm the student who got thrown out of the library when it closed. I'm the one looking up studies on the internet. I love content. All content. The expression of human knowledge, going back for centuries. Give it to me. In books, online, in text, on video. I want it all.

Why do we diss content in favor of connections? I like connections, I learn from them, but only when I bring something to the table. What do I bring? What do my students bring? Understanding of, or questions about, content. Content is what we’ve read, seen, heard.

Let’s not remove content – please don’t take it away. If we do that, we’ll all be connecting and communicating, but about what? About connecting and communicating? I like information – it gives me something to argue about.

Say, all these Rhizo15 tweets and posts I’m reading – they’re content! The product of other people’s minds, set out for me to absorb/enjoy/dispute/misunderstand. We create content, we share it, we respond and the response is more content.

I’ve MOOCed and rhizomed and connected and I still love content. The content we’ve inherited, the content we’re given, the content we discover, and the content we make.

Rhizo15: Symbolic measurement

I can’t measure learning, only the symbolic artifacts of learning.

That’s not so strange. We measure civic responsibility by how many people vote, but we can’t measure how “good” those votes are, the extent to which they are backed by intelligent thought or research into the issues. We can only measure outcomes.

As a college instructor with over 200 students and no assistants, I'm in an impossible position when it comes to assessing learning. I can only assess outcome achievement. I pretend that I can create assignments that will produce symbolic artifacts of learning; then I grade the artifacts.

But it’s all a ruse. A student comes in with certain skills. Perhaps they already know how to learn, or have already learned the subject. They get As and Bs because they are engaged and eager to learn. When I give them an A for producing excellent outcomes, I have no idea whether I am grading their learning. What if they already knew or had examined the material before my class? What if they did all the work, but it didn’t change their mind or approach in any way? The “A” is a measurement of outcome achievement, regardless of background.

Similarly, the student who turns in no work at all may have learned something, something amazing, something that may or may not have related to what I taught, but was connected to my class. I’ve had military wives who learned, not history, but how important it was for them to have somewhere to be each day. I had a surfer guy who learned that if he synthesized information and then created his own interpretation, his conclusions were valid and could be important to others. I have students who learn that if they are polite to me and treat me with respect, they will in turn be treated with respect, and students who learn that faceless institutions don’t have to be impersonal.

If my measurement for that were individual, it wouldn’t relate to their grade in History. If my measurement were societal, I’d need to look to society. When I look to society, I see an awful lot of people behaving as if they’ve learned nothing from history. So instead I hope that they learned what they needed, whether or not I was able to assess it.

(this post related to the Rhizo15 class)