Editing Out Obscenity: Wikipedia and Writing Pedagogy
Alan Liu's "Student Wikipedia Use Policy," a draft of which he posted to Humanist on June 29, 2006, represents a thoughtful example of a professor's guide to college students who use Wikipedia articles as resources when writing essays and research papers.
I have inserted links in the text of this document to current versions of Wikipedia articles. My "Works Cited" page contains links to the versions of the same articles that I used for my research. Exact citations, therefore, may only be accessible in versions archived in article histories. I decided to link in this way for reasons that I discuss in "Assessment in Progress." In other words, even though a Wikipedia citation appearing in this piece may not appear word for word in the current Wikipedia article, the cited text underlies the current presentation of article content, and consequently retains value as readable text. Although "hidden," it marks what shows.
I have described the fragment of text that I edited out of the entry on thermodynamics as obscene; however, it also contained hate language. The fragment denigrates gays and people of African descent, suggesting that members of both groups practice bestiality (“Thermodynamics” 24 May 2007 16:33 UTC). In his article “The Charms of Wikipedia,” Nicholson Baker describes some of the vandalism he replaced: at times he reverted rude language, at other times nonsensical combinations of words, both often laced with profanities. He does not discuss the presence of hate language in any entry he edited or saved from deletion. Unshaken by vandalism in Wikipedia entries, though, and convinced that the efforts of Wikipedia administrators, of regular contributors, and even of users like me maintain the integrity of individual entries, Baker insists that “the ‘unhelpful’ or ‘inappropriate’ – sometimes stoned, racist, violent, metalheaded – changes are quickly fixed by human stompers and algorithmicized helper bots” (Baker). To demonstrate the efficiency of these quick fixes, Baker recounts witnessing the anti-vandalism bot VoABotII flag and revert vandalized text in the entry on bedbugs “less than a minute” after it was posted (Baker). He considers vandalism “a game,” and although “Wikipedians see vandalism as a problem,” he argues that “Wikipedia would never have been the prodigious success it has been without [it]” (Baker).
Generally, I agree with Baker. First, vandalism is an expression of free speech. Second, “bad” writing does have a place in the process of producing “good” writing. Third, the desire to edit vandalized text may be one of the primary motives for users to join the ranks of Wikipedia volunteers, as it was for Baker. And like Baker, I find the mechanisms to override vandalism built into the contributory structure and technology of Wikipedia largely sufficient to sustain the encyclopedia. But I also recognize the distinctive nature of hate language (violence in language) and its consequences (extra-textual discrimination; in extreme instances, physical or sexual assault) when compared to the nature and consequences of other forms of regularly occurring vandalism on Wikipedia. Baker’s examples – for instance, his discussion of the “flutter” on the Pop Tarts page – elicit laughs; the fragment that I edited out of the entry on thermodynamics does not. Furthermore, Wikipedia’s definition of vandalism fails to distinguish hate from other textual interference: “vandalism is any addition, removal, or change of content made in a deliberate attempt to compromise the integrity of Wikipedia….types of vandalism include the addition of obscenities or crude humor, page blanking, or the insertion of nonsense into articles” (“Wikipedia: Vandalism”). Obscenity and crude humor fall into the category “Silly Vandalism” (“Wikipedia: Vandalism”). None of the categories of vandalism mentions the addition of hate language to Wikipedia entries.
Hate language can be edited out of an entry as efficiently as any other form of vandalism, and I am confident that it is. However, hate language remained in the entry on thermodynamics for an hour and eight minutes before I removed it, surely long enough for others to have read it. Although the fragment no longer appears on the main page of the thermodynamics entry, the hate language still survives in the entry’s revision history and, therefore, can be searched. The procedure for ridding an entry of vandalized text means that hate language gets hidden away like other forms of deliberate, and silly, tampering, an effect of correction that is both good (the offensive text is not readily apparent) and bad (this type of hiding typifies the production of hate language). As is indicated in the Wikipedia entry “Hate Speech,” the influence of hate language follows directly from its containment (“Hate Speech”). Putting it away in an entry’s revision history, then, actually accomplishes the opposite of what Wikipedia intends; instead of eliminating the effects of hate, the editing process “creates an environment where discrimination is acceptable” (“Antilocution”).
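The persistence described above is structural: a wiki revert does not delete the vandalized version but appends a new, clean one, leaving every prior version retrievable. A minimal sketch, in deliberately simplified Python (the class and method names here are invented for illustration and do not reflect MediaWiki's actual implementation), makes the point:

```python
# Simplified model (not MediaWiki's implementation) of why text
# "edited out" of a wiki entry survives in its revision history.

class WikiPage:
    def __init__(self, initial_text):
        # Every saved version is appended; nothing is ever deleted.
        self.revisions = [initial_text]

    @property
    def current(self):
        return self.revisions[-1]

    def edit(self, new_text):
        self.revisions.append(new_text)

    def search_history(self, phrase):
        """Return the revision numbers whose text contains `phrase`."""
        return [i for i, text in enumerate(self.revisions) if phrase in text]

page = WikiPage("Thermodynamics is the study of heat and work.")
page.edit(page.current + " [VANDALIZED TEXT]")  # vandalism added
page.edit(page.revisions[0])                    # reverted to the clean version

print("[VANDALIZED TEXT]" in page.current)      # False: the main page is clean
print(page.search_history("[VANDALIZED TEXT]")) # [1]: history still holds it
```

The revert restores the clean main page, yet a search over the history still finds the vandalized text in revision 1: hidden from the casual reader, but preserved and searchable.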
I make this point for two reasons. First, Wikipedia administrators should be aware that hate language constitutes a category of vandalism distinct from those included on “Wikipedia: Vandalism” and, consequently, requires separate mention and a separate editing protocol; failing to rework this aspect of the collaborative writing process enhances the potential power of the hidden language. Second, hate language, which may be coded or disguised, can circulate both within and across Wikipedia entries, rather than, as is usual with other forms of vandalism, focusing disruption – whether for ideological, corporatist, self-interested, gaming, or nonsensical reasons – only on the writing of a single entry. As a result, editing hate language out of an entry, thus cleaning up the main text, enacts a form of circulation (to the revision history) that potentially contributes to the overall proliferation of hate in the encyclopedia, especially if archived text from one entry reappears embedded in the main text of another. Protection against vandalism should include a mechanism to flag and to revert this type of circulating hate language. I make both of these suggestions not to stifle any aspect of the Wikipedia project or to limit contributors’ expressions of free speech but to point out the insidiousness of a form of vandalism that has been ignored by Wikipedia administrators.
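The flag-and-revert mechanism proposed above could, in outline, work like this hypothetical sketch: flag a phrase once, then check whether it resurfaces in the revisions of other entries. Nothing here is an existing Wikipedia feature; the function name, the entry texts, and the flag list are all invented for illustration.

```python
# Hypothetical sketch of a cross-entry check for circulating hate language.
# This is not an existing Wikipedia mechanism; all data is invented.

def find_circulation(entries, flagged_phrases):
    """Map each flagged phrase to the entries whose revisions contain it.

    `entries` maps an entry title to its list of revision texts (oldest first),
    so the check covers both current text and archived history.
    """
    hits = {}
    for phrase in flagged_phrases:
        hits[phrase] = [title for title, revisions in entries.items()
                        if any(phrase in rev for rev in revisions)]
    return hits

entries = {
    "Thermodynamics": ["clean text", "clean text FLAGGED PHRASE", "clean text"],
    "Entropy": ["other text", "other text FLAGGED PHRASE"],  # phrase resurfaced
    "Heat": ["unrelated text"],
}

print(find_circulation(entries, ["FLAGGED PHRASE"]))
# {'FLAGGED PHRASE': ['Thermodynamics', 'Entropy']}
```

Because the check scans revision histories rather than only main pages, it would catch exactly the case the paragraph describes: text edited out of one entry reappearing in another.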
I chose to locate discussion of hate language at the end of this article, rather than insert it after the section subtitled “24 May through 6 June 2007,” because it warrants separate consideration. In the main text of this article, I focus on the ways in which the Wikipedia editing experience I chronicled in “24 May through 6 June 2007” prompted me to rethink my research writing pedagogy. The same experience also raises the concerns that I have discussed in this note. In an effort to distinguish the topic of identifying and editing out hate language in Wikipedia entries from that of research writing pedagogy, I elected to present the two discussions sequentially.
However, the two sets of issues do overlap: as more and more students in my writing courses contribute to and edit Wikipedia entries on the topics of their research, the likelihood that they will encounter or circulate hate language – by reverting offensive text – increases. This possibility has significant pedagogical ramifications, and deserves a fuller analysis than I am currently able to provide.
© Carra Leah Hood