The points Gardner Campbell brought up in his essay “A Personal Cyberinfrastructure” and the lecture “No Digital Facelifts” were some of the most obvious things I’d never heard anyone say.
I found myself agreeing with everything he said regarding the current educational outlook on teaching in the evolving digital age, yet wondering why this was the first time I was hearing anyone say these things. It all seems like such common sense. Of course students should be required to learn how to host their own websites; of course we should be spending our money on an education that will teach us how to cultivate a digital presence—something actually necessary for many of us as we get ready to face the rejection of attempting to join the workforce.
There was an article in USA Today recently about a study (with somewhat dubious research that I’m going to cite nonetheless) which concluded that, among its relatively small sample pool (3,000 university students from 29 different schools), 45% of students “showed no significant gains in learning” after two years of college. After four years, that number dropped to 36%. The findings are based on surveys, academic transcripts and the results of a standardized test, the Collegiate Learning Assessment.
The article also notes that students are studying for less time than students of previous decades, yet the average GPA of those surveyed was 3.2.
None of this information surprised me. When I reflect on all of the classes I’ve taken at UMW in the past three and a half years, with the exception of creative writing, journalism and a few traditional English lit classes that were particularly interesting, I’ve exerted little effort beyond occasionally being in the same room as the professor and my peers, all of us engaged in what was supposed to resemble the learning process. My non-efforts have consistently been rewarded with A’s and B’s.
Rather than doing work for those unnecessary classes I was required to “elect” to take, most of my time in the past four years was spent dicking around on the Internet. In fact, already this semester, I’ve had considerably less time to dick around on the Internet because I’ve been learning how to actually use the Internet. And I don’t even mind!
The closest I’ve come to learning about digital identity and cyberinfrastructure versus “digital facelifts” was in the course “Principles of Newspaper Writing” I took last semester. We were required to build websites that reported on different aspects of life at UMW, making them interactive and taking them beyond just posting print stories to our WordPress blog newspaper thing.
Mostly, though, we just embedded a lot of hyperlinks.
Our group made a valiant attempt at creating the most amazing website the Internet has ever seen, but somehow I think we came up a little short. We found ourselves lamenting the fact that we didn’t have time to learn coding so we could use a more attractive theme. We were discouraged by how much effort the most basic technical aspects of the site required, eventually ending up with something my wildest nightmares couldn’t have predicted (that’s an exaggeration, but, with the exception of the site’s beautiful header, the whole thing was considerably more underwhelming than I’d have preferred).
We all did our bests, but Gardner Campbell confirmed what I’d been suspecting all along: our bests should have been better.
The fault doesn’t lie with us, nor with our professor, nor even with the university. The problem is how low the bar is in general for people to do exceptional work in this medium. Growing up, we weren’t exposed to web servers and cyberinfrastructures because they didn’t exist; instead we were taught that everyone and everything online is just a 49-year-old man with a mustache sitting in his mother’s basement lurking around chatrooms, trying to expose himself to us.
Instead of waiting for the Internet to take advantage of us, I’m excited to finally start learning how to take advantage of the possibilities it has to offer.