I still remember, from all those years ago, my very first computer experience. Booting up the system—waiting for the system to boot—waiting more—hearing screeching noises—waiting more—growing angry—getting hungry—hearing the lunch bell—going to lunch—coming back—waiting for the system to boot—smelling ozone—rebooting the system: hours later, we had the classroom Apple up and running! Maybe the term “classroom” isn’t grandiose enough. “School” computer is probably more descriptive. Maybe even “town” would work. Despite growing up in a very small town and attending a small school, competition for use of the newfangled machine was intense. The novelty took a while to wear off, though it did. Somewhere along the line dying on the Oregon Trail lost its glitz.
Growing up in a relatively low-tech part of America, I saw computers as unattainable objects of supreme power and status. Nobody owned one—a computer could cost more than a thousand (!) dollars. Not only that, but they seemed utterly impractical. Sure, one could type, but that’s what typewriters were for. Being a creature of either too little imagination or too much at the worst possible moment, I could think of fewer than five tasks that one could possibly use a computer for: games, typing, number crunching—okay, fewer than four tasks. To make matters worse, it seemed to take hours to start the damn things. In short, they were toys for schools and scientists.
It wasn’t until the 7th grade that I’d even heard of a PC, but already by then the uses of computers were being made manifest to me. If necessity is the mother of invention, it’s also the mother of having to learn to type when the teacher assigns you four pages of text to copy. Matters only got worse as I grew older.
More out of fear than a sense of responsibility, I decided to take a typing class my freshman year of high school. Our instructor was disarmingly kind most of the time, but a fierce, battle-hardened matron of oppression when in class. Ms. Fine was what many would term “old school,” and her singular task was to make typists out of the sorry lot of us. Armed with smelly old HPs, black on green text, and WordPerfect, we ventured forth on our first typing lessons: [Program a metronome at 70–80 BPM] “Jay jay jay space, atch atch atch…Jay jay jay, atch atch atch…Jay Jay JAY, atch atch atch, jaaaay jay jay, atch atch atch RETURRRN!” she would trill. On and on we went like this for months and months, and being on a computer lost its novelty again. Still, of all the “skills” I learned in high school—since I never took wood shop—Ms. Fine’s typing course was by far the most useful.
My parents eventually bought a computer so that all of us could theoretically function in the increasingly e-oriented nineties (which were waning fast by now). I began doing more and more work on the computer, typing instead of handwriting. This process enacted a slow but major change. My impeccable spelling atrophied, I grew more loquacious on paper with my thoughts, and I started seeing writing products as wholes, not just as paragraphs strung together. It seemed as if a bigger picture had been revealed to me.
Eventually, computers became a ubiquitous technology in my work and play time. They blurred the boundaries of workspace, living space, and play space. Not only that, but they seemed absolutely crucial to success—so much so that having one break meant having to buy another to fill the gap (despite the heavy financial burden). Not even televisions, or even vehicles, shared such priority. The internet—at first a novelty in its own right—became the juggernaut that dominated communication and commerce, giving us the ability to be both everywhere at once and imprisoned in only one place, tied to a box.
Despite its rather sedentary requirements (one must sit or stand still in front of a computer to use it), moving faster became the medium’s obsession. Typing allowed me to write much more voluminously in much less time and with much less pain. Tasks that were once relegated to other media were now almost exclusively conducted through the computer, mail being the most notable example. Computers for me became the “do everything” technology, and it was almost all in an effort, a gasping one at that, to keep up.
I heard somebody say (maybe in 597 actually…or was it at the 302 meeting?) that email conveys a (false) sense of urgency. As the managing editor of a relatively small regional academic journal, I can attest to having a vast reserve of this feeling. With dozens of messages arriving each day from members with a variety of demands, I feel heavily weighed down by the need to get to them all at once. Of course, this lack of system breaks down, and so does “productivity,” at least in the electronic-age sense. And even where I think I’m trendy, I rapidly find that I’m actually quite dated by my e-malapropisms. A Cougar Quest student of mine recommended that I use IDK for the Tolkien creative writing class I teach via MUSH (Multi-User Shared Hallucination—think old-school text-based RPG on the computer, with green on black text and vague cardinal directions, such as “obvious exits are NORTH and WEST”). I had never even heard the term IDK until yesterday in 597. I should feel lucky that I didn’t open my evals until right after class; otherwise I might’ve ignored its significance.
It’s sort of scary to think of it this way, but every class I teach is affected and effected by computer technology. It’s a strange system with a great deal of feedback—by effecting trends that in turn affect it, computer technology seems to reinvent itself, in some manner or another, at a very rapid pace. I try to use different technologies every semester, which on one hand keeps me on my toes and makes no two classes the same to teach, but on the other makes it difficult to find the right pedagogical tools for the right jobs.
The increased ceiling for speed and multi-tasking proportionally increases its capacity to endure change by making itself invaluable. Wow—that last passage reads like gibberish. Let me try again: Computer technology has helped free our attention by reallocating it elsewhere, so that should we find ourselves bereft of it for any reason, readjusting our attention could become a serious problem, both in terms of habits and available resources. Who has a typewriter lying around, or the time to write a 20-page seminar paper in pencil? Maybe everyone has these things, or things like these things. But standards of living are hard to change, especially when our perception indicates that change is in the “wrong” or “backwards” direction. Unlike many other technologies, computers adapt to shifting needs, expanding their repertoire. Pens, for example, still do what pens have done (for the most part) for thousands of years. When I first saw a computer, I never imagined anything like the internet was possible, much less Skype (it’s like a Star Trek viewing screen!).
It’s difficult to quantify just what exactly I use computer technology for in a pedagogical context (or any other, for that matter). Aside from my own grading, presentation, and research materials, I’ve made using computers a requirement for my courses. This requirement seems natural, especially given how widespread the technology is. On the other hand, computer literacy isn’t necessarily as widespread. It’s obvious to state that not everyone has the same access to and experience with particular technologies, but the implications of this fact are heavy. As an instructor, I often don’t feel very technology literate in some contexts, and I’m sure many of the students in my courses feel the same. I feel as if I’m asking students to use a tool that I don’t quite know how to properly wield. So when using computer technology, I often wonder if we are in effect using wrenches to nail in wall tacks.
“Right tool for the right job” adages notwithstanding, computer technology literacies are ubiquitous, and it’s my responsibility as a teacher to make them applicable. However, that applicability need not derail the work of a semester. If an integrated approach just doesn’t work, sometimes abandoning it in favor of something else is better than slogging through an entire semester. Of course, that kind of shift produces problems in its own right, such as spending more time on figuring out technology rather than teaching or writing. However, as technological changes continue to rapidly mount (in some respects more so than others) and the demands for a varied computer literacy increase, adaptability may be as valuable a composition tool as any.
Jacob, I love the kind of pragmatism that informs your personal history with computers and computer technology and your thoughts on how those things might be integrated into a classroom.
A thought on your thought:
You bring up the very real pedagogical problem of trying to make sure that technological instruction does not outweigh (in terms of time) writing instruction. This, for me, speaks to the necessary programmatic changes that would need to occur in order for definitions of composition (I assume you are speaking about comp. classrooms) to be expanded (as Yancey, Selfe and Wysocki suggest). Because you are right: if the school or department has outlined certain learning goals for a class, is it your responsibility as a member of that department and as an instructor to try your hardest to make sure that those learning goals are met? If you are spending time on technological instruction, can you meet those goals? However, if the comp. classroom is not primarily focused on traditionally written assignments, but on "composing," then the time issue is not really as relevant. I think.
What I am trying to say is: I agree with you. Without proper training, programmatic course redesign or redefinition, or instructor support, the integration of new media in the comp. classroom might conveniently play second fiddle to more traditional means of composition. But, as you conclude, and as Yancey says, eventually the presence, power, and/or necessity of new media may be too large for the periphery.
I'm always interested in the ways in which income changes the way people relate to/deal with technology. I think that in the 90s, it seemed obvious. The schools and the more middle/upper middle class had access, but nobody else did. I felt the same way growing up. It was a momentous occasion when we finally acquired a personal computer, and I knew exactly how much it was costing my grandparents (I remember they paid that Gateway bill for at least a year, maybe two).
I wonder, what do you think about how this translates now? We think of technology as being highly accessible. Everybody has a computer, right? I mean, we have at times required our students to use these digital communication tools, but how many of them are fluent in this language? How many of them have access, even now? You talk about making computers a requirement in your courses, and that it seems natural because the use is so widespread. But I'm not convinced it always is. I am reminded of what Kristen said, I think on our first day of class, about Flash and how it never made it to community colleges.
I wonder what we say to our students when we tell them that they have to be fluent or at least knowledgeable in these kinds of discourses to even be able to participate in an English 101...?
All of that to say: I also require my students to use technology, but I am afraid that I discount those who did not necessarily have access to it. I mean, we try to be respectful in other ways, not assuming all of our students are this or that, and so we change the way we speak to them to be respectful of differences. But we push this. It seems strange to me, and I do it as well. Blah blah blah. And you thought you were a rambler.
Yeah, I'm always worried about my tech requirements. Plenty of people don’t have computer access even at college. I think part of the reason my parents decided to commit to a computer was the stress--so many more assignments were required to be typed, and to boot, so much information was needed from the internet. Come to think of it, I find it odd that our teachers were making us type/research online as much as we were, especially considering that we lived in a pretty poor community and the school didn't have a public lab.
Though I’ll often spew that the AML counts as having computer access, I don’t really believe this. It’s not the same as living with a computer, as working with a computer in your own space and time. I can’t work in the AML—not on anything requiring deep thought anyway.
Normally, I do leave the option open for handwritten assignments, but it’s nothing I publicize. So in this respect, I feel that I marginalize handwriting, prioritizing the almighty keyboard. This is a point of much ethical wrangling for me.
Perhaps weirdly, and probably because of these apprehensions, I’ll sometimes abandon using certain technologies at the drop of a hat. In one course, for example, my ANGEL discussion boards proved too unwieldy for students, so I just dumped them and redid my syllabus. The strange part to me is that I felt relieved, even though I spent hours essentially reconstructing the entire course. Maybe that wasn’t the best approach (for a variety of reasons), but I felt that I was somehow compromising. I’ve done the same kind of overhaul since, and I’ll probably do it again. Maybe that process will grow easier with experience, though I’ve found that I can’t teach a class the same way twice. Just doesn’t work.
Am I going insane?
Jacob,
Early in your post, you describe your involvement with computers as a kind of growth that benefited you practically and in a more conceptually abstract capacity, noting that "I grew more loquacious on paper with my thoughts, and I started seeing writing products as wholes, not just as paragraphs strung together. It seemed as if a bigger picture had been revealed to me." You seem to be talking about a change in cognition here - an initiation, possibly, into a set of habits and thought processes that dissolve some of the restrictions between, say, our ideas and our expressions (texts). Elsewhere, you suggest that our recent advances might be bittersweet: "Computer technology has helped free our attention by reallocating it elsewhere, so that should we find ourselves bereft of it for any reason, readjusting our attention could become a serious problem, both in terms of habits and available resources." What do we stand to lose by trading one kind of adaptability for another in our embrace of new technology? Though I agree that print and digital literacies should receive equal emphasis in the classroom, I wonder if we might be diverting our attention in a way that is, on some level, irreversible. What are your thoughts?
I'll expand on my comment by saying that while I find the idea of functioning without computer (or other) technology very enticing - theoretically - I really am quite sold on computers, the Internet, new media, and whatever else we might try to include in the scope of our discussion in Teaching with Technology. As romantic as a return to a "simpler life" can sound, I have never been completely sold on the idea that we should default to technophobia whenever a technology with potentially detrimental applications becomes available, and I certainly prefer a multimedia classroom (and life experience) to something "simpler." On the other hand, I do believe that the misapplication of a technology - including exclusive concentrations on the mastery of a particular tool - can severely limit instruction, and our development as thinkers.
Ack, ran out of space. I've continued my responses in a separate post. Now, time for sleep. See you lot on the morn.