Open access
Research article
First published online July 2, 2020

It Takes a Village to Combat a Fake News Army: Wikipedia’s Community and Policies for Information Literacy

Abstract

The fake news crisis points to a complex set of circumstances in which new media ecologies struggle to address challenges related to authenticity, rhetorical manipulation and disinformation, and the inability of traditional educational models to adequately teach toward critical information literacy. While social media sites such as Facebook acknowledge the culpability of their platforms in spreading fake news, and create new strategies for addressing this problem, such measures are woefully inadequate. Wikipedia, nearing its 20th year, however, has developed numerous practices and policies to ensure information validity and verifiability. This article explores the connection between participation in the Wikipedia community, the development of critical information literacies, and the ability to navigate the current new media landscape. Analysis and review of Wikipedia’s community policies and the procedures resulting from these policies demonstrate the encyclopedia’s unique capacity to protect against problematic information. We ultimately argue that Wikipedia has become and remains one of the few places on the internet dedicated to combating fake news, and make recommendations on how to leverage Wikipedia practices and policies for information validation outside of the encyclopedia.

Introduction

A recent study by the Stanford History Education Group (2016) came to the frightening conclusion that “young people’s ability to reason about the information on the Internet can be summed up in one word: bleak” (p. 4). Contemporary global issues have highlighted that “disinformation” and “fake news” remain major concerns that face modern democratic society, and our current tools and efforts for teaching and communicating effective information literacy require updating (Jack, 2017). Furthermore, in “It’s Complicated: The Social Lives of Networked Teens,” danah boyd (2014) points out that students are being told to “avoid Wikipedia” and do their own research. Noting that the students “heard that Google was trustworthy, but Wikipedia was not,” boyd (2017) wonders if media literacy might have “backfired,” and questions whether the critical lens that we tried to instill in students may have helped confuse information value.
As online information becomes increasingly complex and laden with misinformation (Jack, 2017), information literacy practices that actively combat misinformation, disinformation, and propaganda remain imperative to study and implement. Where to find these, in part at least, may lie in the community-driven space that we have been told to avoid, Wikipedia. Wikipedia is a decentralized commons-based peer production community that both advocates for the “don’t trust, do research” mantra of the potentially problematic “media literacy” that we have participated in and follows a set of rules, or policies, that relies on, understands, and engages with traditional epistemological foundations. Numerous studies have illustrated that Wikipedia’s community produces and maintains an encyclopedia that is as accurate as (or more accurate than) “traditional” encyclopedias (“Reliability of Wikipedia,” 2019), notwithstanding doubts regarding its reliability (Taraborelli, 2012). Despite dealing with a daily onslaught of misinformation, advertisements, and other false editing and authorship, Wikipedia’s community has maintained this reliability for nearly two decades.
In this article, we bridge contemporary education research that addresses the experiential epistemology of learning to use Wikipedia with an understanding of how the inception and design of the platform fights disinformation and fake news via its framework of community-mediated policies. To accomplish this, we review and analyze relevant community policies of Wikipedia that govern decisions about information representation and inclusion, as well as how such decisions are shaped through community procedures. When discussing “procedures,” we refer to examples of the enactment of socially mediated policies. We ultimately argue that Wikipedia has become one of the few places on the internet dedicated to combating problematic information. Furthermore, we make recommendations on how to leverage Wikipedia practices and policies for information literacy policy and education beyond higher education classroom applications.
Of course, Wikipedia has not been without its issues. The encyclopedia community acknowledges challenges related to systemic social biases regarding gender and race (“AfroCrowd,” 2019; Glott et al., 2010; Wadewitz, 2013) and its reliance on print-centric epistemologies (Graham, 2011; Prabhala, 2011; Raval, 2014). Researchers have also examined how harassment of women and trans-identified editors is normalized in the community (Menking & Erickson, 2015; Menking et al., 2019). Marginalized (gender) identities often take on extra emotional (Menking & Erickson, 2015) and identity-related labor to navigate a “spectrum of safe and unsafe spaces [in Wikipedia]” (Menking et al., 2019) and productively contribute to the community. In terms of community governance, while some researchers have suggested that peer production communities “follow [Robert] Michels’s iron law of oligarchy” rather than more “democratic organizational forms” (Shaw & Hill, 2014), others have suggested its resilience against such an evolution (Konieczny, 2009). Indeed, it is Wikipedia’s participatory affordances that guard against oligarchy: the “high level of empowerment of individual Wikipedia editors with regard to policy making, the ease of communication, and the high dedication to ideals of contributors succeed in making Wikipedia an atypical organization, quite resilient to the Iron Law” (Konieczny, 2009, p. 189). However, Wikipedia is not completely immune from a slide into oligarchy and requires continued efforts from multiple communities to sustain an active and vibrant volunteer base. Through analysis of relevant policy, this article strives to support such work, while acknowledging the problematic issues described above.

Misinformation and Disinformation

The current crisis of misinformation and disinformation points to a set of circumstances in which media ecologies, especially digital media ecologies, fail to address challenges pertaining to authenticity, rhetorical manipulation, and the inability of educational institutions to adequately teach critical media literacy. Misinformation refers to “information whose inaccuracy is unintentional”; disinformation, on the other hand, is “deliberately false or misleading” (Jack, 2017). Misinformation and disinformation are in no way a new problem. Fake news or “yellow journalism” has a long history within and outside of the United States (Soll, 2016). The more recent politicization of “fake news” (Lakoff & Duran, 2018), however, and the amplification of both misinformation and disinformation by algorithms and social media’s echo chambers (Nguyen, 2018) constitute a crisis with little historical precedent. Disinformation, as those following U.S. politics have seen, has the capacity to alter outcomes of national elections.
Internet giants such as Facebook and Google do not provide neutral access to information, but instead curate, market, and present information in ways to maximize the revenues made available by user data and advertising (Noble, 2018; O’Neil, 2016). In light of these persuasive and marketing roles of algorithms, boyd’s (2014) finding regarding the trust students place in “research” conducted by Google Search is all the more troubling. Furthermore, more recent research on students’ digital and information literacy practices demonstrates how this trend has continued. Ibrar Bhatt and Alison MacKenzie, for instance, have applied the concept “epistemologies of ignorance” (Alcoff, 2007) to digital information practices of undergraduate students to explain their “ritualized” retrieval and lack of evaluation of content on the web. Such ritualized practice relies on the curation of content by other authorities—often a teacher or a bibliometric algorithm that validates information. In other words, students depend on others to vet information rather than engaging in exploration and evaluation processes themselves (MacKenzie & Bhatt, 2020). The limited agency exercised by students tasked with information validation represents a problem that far surpasses educational spheres and activities. A society’s inability or unwillingness to exercise critical media literacy has broader implications for self-governance, political action, community engagement, and other democratic responsibilities.

From the Classroom to the World

While the majority of studies on Wikipedia, information literacy, and knowledge production have been conducted on student learning outcomes and experience, this body of literature can also be read as controlled experiments on the experiential effects of engaging with Wikipedia as a knowledge community. What has been gleaned through research on student learning, for instance, can help shed light on the experience of Wikipedia in general, particularly for those who decide to learn to edit and participate in the Wikipedia knowledge community.
Recent research (Dawe & Robinson, 2017; Kamenetz, 2017; Oliver, 2015; Vetter et al., 2019) illustrates that students who engage with Wikipedia and its community are experiencing information literacy in much more effective ways, learning the necessary skills to combat misinformation and recognize valid information sources. While many students expressed having perceived the space as unreliable prior to editing Wikipedia (as they had been told “don’t use it”), their perception shifted through interaction with Wikipedia and its community, and they showed more trust in the reliability of Wikipedia as an information source. This trust was largely due to their understanding of the Wikipedia community’s responsive and effective practices for combating misinformation and disinformation, in addition to a better understanding of the structures of Wikipedia (and of information in general).
Wikipedia allows for direct and transparent observation of the practices and concepts integral to combating misinformation, especially practices and concepts related to writing process, research, social collaboration, and digital rhetoric (Gruwell, 2015; Hood, 2007; Kill, 2012; Kuhne & Creel, 2012; Patch, 2010; Purdy, 2009; Tardy, 2010). Furthermore, the encyclopedia also provides opportunities for learning these skills and for community integration through a public writing experience with an authentic, tangible audience (Cummings, 2009; Sweeney, 2012; Vetter, 2013), which often results in increased motivation for completing assignments (Cummings, 2009; Vetter, 2014). Not only does participating in the Wikipedia community assist in learning digital/information literacy, critical research, teamwork, and technology skills, but students also reported pride in their work, spending more time on it, and greater satisfaction with their class assignment than with traditional writing assignments. More than just learning topics and skills, they were motivated by their participation in a self-policing community dedicated to representing valid and verifiable information (Vetter et al., 2019). While such research has primarily focused on applications of Wikipedia-based assignments in formal educational contexts, such opportunities are also available to the broader public as they visit, read, interact with, and edit the encyclopedia. Whether editing is done individually or in a group setting such as an editathon, being a “student” does not require a formalized academic setting, and it is our position that anyone who learns how to participate in Wikipedia could benefit in a similar manner.
Moving from the classroom to the larger global Wikipedia user base (over 21 billion pageviews per month), we see that Wikipedia’s community is far different from those of many other digital media sites. Participants are motivated by a sense of belonging to a larger volunteer network seeking to, as Wikipedia founder Jimmy Wales describes it, “[i]magine a world in which every single person on the planet is given free access to the sum of all human knowledge” (Miller, 2004). This vision is manifest in participation within the Wikipedia community: participants are motivated by a grand goal, come to understand (and continue learning about) their global and diverse audience, and develop new ways of achieving that goal, particularly in how to represent knowledge equitably, accurately, and verifiably. Essentially, the experience of those who learn to edit Wikipedia and participate in the community is one of combating disinformation and systemic inequalities within the representation of that information. Furthermore, this experience is rooted in the community-generated policies, Wikipedia’s system of decision making, and the procedures that occur as a result of the enactment of policies.

Once a Child, Now Grown Up: How Wikipedia Helps Combat the Disinformation Crisis

As Wikipedia’s 20th anniversary approaches, many of the site’s earlier peers have fallen by the wayside while the free encyclopedia founded on radical collaboration and reliable sourcing continues to persist. However, the online encyclopedia was often dismissed and disparaged during its first years—although some saw it as a vanguard, many thought the project was destined to fail (Black, 2010; Kamm, 2007). Some considered Wikipedia to be inaccurate or even dangerously open. Wikipedia did not fail, but it also did not become the utopian template for the web. In fact, as Hill (2013) points out, Wikipedia’s policies that we discuss here may have helped its rise in popularity and assisted in both attracting contributors and staving off the “demise” that took many of its progenitors and competitors (Waldman, 2004). What was once seen by the public as a utopian experiment has become the world’s largest and most popular reference work, supported by a growing and stable foundation. The “Wikimedia Foundation” (2019), the organization that hosts and runs Wikipedia, has also grown exponentially, from only US$80,000 in revenue in 2003 to over US$100 million in revenue in 2017.
Despite what naysayers claim about Wikipedia’s inaccuracies or dangerous openness, numerous studies have favorably compared Wikipedia’s accuracy to “traditional” encyclopedias (A. Brown, 2011; Giles, 2005; Hwang et al., 2014; Kräenbring et al., 2014; Taraborelli, 2012). That does not mean the encyclopedia does not continue to battle misinformation and inaccuracies, but that it has remained as reliable as, or more reliable than, other “more trustworthy” publishers on major topics. As we will discuss in this article, this is due to both relentless and ongoing efforts by volunteer editors, as well as to the design of the platform, community policies, and the enactment of those policies.
Pete Forsyth, the architect of the Wikimedia Foundation’s Public Policy Initiative, which grew into the Wiki Education Foundation, stated, “Wikipedia exists to battle fake news. That’s the whole point” (Forsyth, 2018)—a fairly bold statement that deserves some unpacking in a climate where disinformation and fake news are rampant. Wikipedia’s battle against fake news, misinformation, and disinformation is waged within and through community-mediated practices and policies put into place in the encyclopedia to verify and validate information, ensure accuracy and neutrality, and guard against bias and misinformation.
Beyond the community practices, Wikipedia functions differently from most other websites today. Among the top websites in the world for traffic, Wikipedia is the only one run by a nonprofit (“Wikipedia.org Is More Popular Than . . .,” 2018). Furthermore, Wikipedia does not try to predict what users will encounter online and does not capture or analyze user data for advertising or content prediction. Unlike Google, Facebook, or other internet giants, Wikipedia does not curate, market, or algorithmically determine information in any way that restructures the results for users (Hill, 2013). By sticking with these older, pre-tracking internet technologies, Wikipedia effectively combats fake news by forgoing the mechanisms (advertising, pay-per-click revenue, and other techniques) through which fake news has been incentivized on every other major platform.
Relying on an explication and analysis of Wikipedia policy, this article explores links between (1) the policy structure of Wikipedia, (2) the Wikipedia community, and (3) how participation in the encyclopedia aids in users’ development of critical information literacies. These links help engender what boyd (2014) calls “antibodies to help people not be deceived.” Through illustrating how Wikipedia policies combat misinformation and disinformation, this article connects engagement with the Wikipedia community to a pedagogical practice that suggests that not only can we learn from Wikipedia but that Wikipedia’s successes can provide perspective in our current climate of problematic information.

Wikipedia Policy as Pedagogy

As the largest open educational resource in the world, Wikipedia is inherently pedagogical. Research on Wikipedia-based education has already demonstrated that academics can leverage the encyclopedia to teach toward outcomes related to information literacy, research, writing, and digital literacy, among others (Cummings & DiLauro, 2017; Garrison, 2015; Konieczny, 2012; Reilly, 2011; Roth et al., 2013; Vetter et al., 2019). Yet Wikipedia also serves the public beyond the classroom as readers and editors interact with its policies for information analysis and knowledge curation. In one way or another, everyone who comes into contact with Wikipedia is a student of its content, policies, and procedures.
In an examination of credibility in Wikipedia, Ryan McGrady (2013) recognizes and explores the encyclopedia’s complex processes of ethos creation by attending to the rhetorical processes of “content-creation practices . . . that train new editors” (p. 120). McGrady argues that “the ethos of Wikipedia can be found in its community, and their system of rules that lead to the creation of content, rather than the content itself” (McGrady, 2013, p. 121). His framework for examining this system of rules, which analyzes both the MediaWiki platform (the open-source software that runs Wikipedia) and Wikipedia’s “Five Pillars,” adapts and broadens Ian Bogost’s (2008, 2010) theory of procedural rhetoric to justify rule-based processes in the encyclopedia as intrinsically persuasive. By focusing on policy and practice rather than the accuracy of content itself, McGrady acknowledges how the encyclopedia’s ethos has developed to the point where we no longer need to question or prove its reliability. Indeed, as discussed previously, numerous studies have already shown the encyclopedia’s accuracy (A. Brown, 2011; Giles, 2005; Hwang et al., 2014; Kräenbring et al., 2014; Taraborelli, 2012). While we do not see procedural rhetoric as a necessary framework for understanding Wikipedia’s capability to combat problematic information, we agree with McGrady on the fundamental argument that the Wikipedia community itself has created a series of policies that work toward credibility. Accordingly, this article strives to build on McGrady’s work by engaging in explication and analysis of Wikipedia policy as a pedagogy of information literacy. The Wikipedia community creates policies through socially mediated structures and processes, and these policies, in turn, shape encyclopedic content.
Identifying the policies of Wikipedia as a community encompasses not only the coded rules of the MediaWiki software and Wikipedia’s “Five Pillars” (as McGrady has examined) but also a number of other substantial rules more relevant to Wikipedia’s social practices. Our analysis, accordingly, extends the work of McGrady as misinformation, fake news, and other crises of authenticity become increasingly pervasive. In the following discussion and analysis, we trace the construction of credibility in the encyclopedia by examining (1) initial barriers to vandalism (spam edits, such as vulgar language) and misinformation (adding incorrect or misleading information, often unsourced or from disreputable sources) or disinformation (intentionally misleading information), including Wikipedia bots and auto-confirmation “rules”; (2) Wikipedia’s policy of verifiability, which governs editors’ selection and engagement with secondary sources to create mainspace article content; (3) the policy of neutral point of view (NPOV), which influences the use of sources and encourages balanced coverage of topics; and (4) the policy of notability, which dictates how and when article content should be covered in the encyclopedia. As a whole, these systems reconstruct traditional models of authority (even ones that academia relies on) to retain Wikipedia’s credibility, even in the face of the current “fake news” crisis, through creating an open community of gatekeepers that enforce and police content that relies on traditional authoritarian knowledge hierarchies and values. They also provide a framework for understanding how Wikipedia can act pedagogically in regard to information literacy.

Governance: How Wikipedia’s Rules Are Ruled

Wikipedia’s governance has been the subject of an increasing amount of research in information science, sociology, and computer science. As discussed previously, scholars have questioned whether Wikipedia (and wikis in general) is susceptible to oligarchic tendencies, challenged Wikipedia’s rhetoric of “participatory and open” (Shaw & Hill, 2014), and described the constant and dynamic evolution of the encyclopedia’s rules as both stabilizing and limiting in terms of governance (Keegan & Fiesler, 2017). Our own analysis is optimistic in acknowledging Wikipedia as “an atypical organization, quite resilient to the Iron Law” (Konieczny, 2009, p. 189). Following Konieczny, we acknowledge Wikipedia’s continued success as something of an anomaly when compared to other systems of mass peer production. And while the encyclopedia will continue to require constant support from a diverse set of volunteer communities, its policies, especially those that are more stable, provide an important framework for both shared governance and information processing.
Despite an increasing reliance on automation, Wikipedia is a fundamentally social project; its policies and procedures emerge from social, democratic operations of governance and administration, carried out by elected volunteers from the Wikipedia community. In one of the most extensive examinations to date, Rijshouwer (2019) contends that Wikipedia demonstrates a process of “self-organizing bureaucratization” through three distinct features of the self-organizing community. First, the Wikipedia community is “transient,” in that its organizations are dynamic and respond to new needs and challenges as the encyclopedia increases in size and complexity. Second, increasing bureaucratization in the community is deployed to mediate conflicts between (a) political differences regarding “community members’ autonomy and self-organizing character” and (b) other members’ introduction of “formal structures to pragmatically meet the project’s challenges and objectives” (Rijshouwer, 2019, p. 237). Essentially, bureaucratization mediates conflicts between conservative/pragmatic impulses, on the one hand, and anti-authoritarian impulses, on the other, doing double duty to assist in the self-organization of a volunteer system. Finally, the self-organizing principle of the Wikipedia community is rooted in the “inclination to meet the ideal to organize themselves and their work as democratically as possible,” as the community self-polices its democratic organization (Rijshouwer, 2019, p. 237). All told, the social project of Wikipedia relies on the prevailing ideology of democratic and transparent peer production through consensus and conflict mediation.
Rijshouwer’s identification of these features—transience, conflict mediation, and democratic peer production—helps to explain Wikipedia’s self-organizing bureaucratization as a method for general oversight and governance. Such bureaucratization also informs the ongoing development and revision of policies and guidelines in the encyclopedia. These are not seen as stable rules but “principles” developed through a social process requiring “discussion and a high level of community-wide consensus” (“Wikipedia:Administration,” 2019). In fact, and as will be discussed later, one of the “pillars” of Wikipedia is that “there are no firm rules.” Most of the content policies we will discuss “have been accepted as fundamental since Wikipedia’s inception” (“Wikipedia:Policies and Guidelines,” 2019). These policies are established through a variety of methods, including reorganizing existing policies as well as proposing new policies, but protocols always emerge through “strong community support” and are “seldom established without precedent” (“Wikipedia:Policies and Guidelines,” 2019).
Some of these policies are fairly simple and straightforward and can be policed by automated systems helping to create more rigid barriers against misinformation. However, many of these policies remain complex enough that they and their implementation require consistent renegotiation and interpretation by the community. Each of these content policies helps to frame how the encyclopedia “works” in a distributed, open, volunteer-driven space and demonstrates how Wikipedia protocols act to regulate and process information, providing a pedagogy that promotes information literacy.

Barriers to Vandalism and Promotion of Credibility in Wikipedia

Before moving into a discussion of some of the most influential policies on Wikipedia, it is important to acknowledge the automated labor performed by Wikipedia bots regarding user access levels. These systems emerge from community-created processes and policies, helping to promote credibility in Wikipedia by both responding to and pre-empting issues such as vandalism and user inexperience.

Automated Systems

A historic concern over Wikipedia’s crowdsourced model is that if “anyone can edit,” then “anything goes.” This concern is answered from a procedural point of view through user access levels. New users in Wikipedia, those whose accounts are less than four days old and who have fewer than 10 edits, are restricted in terms of the editorial actions they can take. Once they meet these requirements, new users become “autoconfirmed” and gain new privileges:
Autoconfirmed or confirmed users can create articles, move pages, edit semi-protected pages, and upload files (including new versions of existing files). Autoconfirmed users are no longer required to enter a CAPTCHA for most events and may save books to the Books namespace. In addition, the Edit filter has a number of warning settings that will no longer affect editors who are autoconfirmed. (“Wikipedia:User access levels,” 2020)
Auto-confirmation is one of the most basic and introductory processes in Wikipedia, and as such it provides an initial barrier to vandalism and inexperienced and/or ineffective editing, especially in terms of new article creation. Most often, these types of edits consist of vulgar language (vandalism) or simple mistakes (inexperienced and/or ineffective editing) that can be caught easily with bots. New users are unable to create new pages, are limited in which pages they can edit, and have their edits more closely policed.
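To make the procedural logic concrete, the following is a minimal sketch, in Python, of the auto-confirmation check described above. The account fields, constants, and function names are our own illustrative stand-ins, not Wikipedia’s actual implementation, which is handled internally by the MediaWiki software.

from dataclasses import dataclass
from datetime import datetime, timedelta

# Thresholds taken from the policy text quoted above: accounts at least
# four days old with at least ten edits become "autoconfirmed."
MIN_ACCOUNT_AGE = timedelta(days=4)
MIN_EDIT_COUNT = 10

@dataclass
class Account:
    """Hypothetical stand-in for a Wikipedia user account record."""
    registered: datetime
    edit_count: int

def is_autoconfirmed(account: Account, now: datetime) -> bool:
    """Return True if the account meets both auto-confirmation thresholds."""
    old_enough = (now - account.registered) >= MIN_ACCOUNT_AGE
    active_enough = account.edit_count >= MIN_EDIT_COUNT
    return old_enough and active_enough

# A two-day-old account with three edits cannot yet create articles or
# edit semi-protected pages.
new_user = Account(registered=datetime(2020, 7, 1), edit_count=3)
print(is_autoconfirmed(new_user, now=datetime(2020, 7, 3)))  # False

Gates this simple and mechanical are exactly the kind of rule that can be enforced automatically, which is why they form the first layer of defense described in this section.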
Another early line of defense in Wikipedia, one that is often unacknowledged or misunderstood, is the use of software robots (hereafter, “bots”). Wikipedia defines a bot as an “automated tool that carries out repetitive and mundane tasks to maintain the 47,329,838 pages of the English Wikipedia,” and there are many such tasks to carry out: currently 2,345 bot tasks are approved and over 900 bots are listed, the most prolific having made over 4 million edits (“Wikipedia:Bots,” 2019; “Wikipedia:List of Bots by Number of Edits,” 2019). Because bots have the capability to make rapid changes to the encyclopedia, their creation (programming) and activity are closely monitored and governed by a community-devised “bot policy,” which lays out expectations that bots “meet high standards before they are approved for use on designated tasks” (“Wikipedia:Bot policy,” 2019). Some researchers have even noted that “Wikipedia would be in shambles without bots” (Nasaw, 2012, as cited in J. Brown, 2015, p. 497). Bots patrol editors’ contributions and alert administrators to potential trolls and vandals (Geiger, 2011; Martin, 2018). They also make significant contributions to the reduction of misinformation in the encyclopedia. In general, Wikipedia, as a volunteer-run community site, is heavily policed by bots so as to streamline tasks for editors and administrators.
A prominent example of a bot that assists in policing Wikipedia is ClueBot NG, credited with “practical vandalism prevention” (Geiger & Halfaker, 2013; “Wikipedia:Bots/Requests for Approval/Cluebot NG,” 2019). Launched by Wikipedia users Christopher Breneman and Cobi Carter, ClueBot NG is one of the more prolific bots used to combat vandalism in Wikipedia. As of September 2019, the bot had contributed a total of 5,368,611 edits and was ranked as the fifth most productive Wikipedia bot (“Wikipedia:List of Bots by Number of Edits,” 2019). Geiger and Halfaker’s (2013) study on bot performance further revealed that ClueBot NG is particularly effective in removing fake information on Wikipedia by reverting (reversing the edit and removing the added content) possible vandalism in Wikipedia articles.
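ClueBot NG itself relies on a machine-learning classifier trained on labeled edits; the sketch below is only a rough, rule-based illustration of the detect-and-revert workflow such bots automate. The word-pattern list, edit structure, and function names are invented for illustration and do not reflect ClueBot NG’s actual code.

import re

# Deliberately tiny, invented list of vandalism markers; the real bot uses
# a trained classifier rather than hand-written patterns.
VANDALISM_PATTERNS = [
    re.compile(r"!{3,}"),                            # runs of exclamation marks
    re.compile(r"\b(?:lol|haha)\b", re.IGNORECASE),  # chat-speak
    re.compile(r"\b[A-Z]{10,}\b"),                   # long all-caps shouting
]

def looks_like_vandalism(added_text: str) -> bool:
    """Crude heuristic: does the newly added text match an obvious vandalism pattern?"""
    return any(pattern.search(added_text) for pattern in VANDALISM_PATTERNS)

def review_edit(previous_revision: str, new_revision: str) -> str:
    """Keep the previous revision (a 'revert') if the added text trips the
    heuristic; otherwise accept the new revision. Assumes an append-only
    edit for simplicity."""
    added_text = new_revision.replace(previous_revision, "")
    return previous_revision if looks_like_vandalism(added_text) else new_revision

old = "Wikipedia is a free online encyclopedia."
vandalized = old + " this article is fake lol!!!!"
print(review_edit(old, vandalized) == old)  # True: the edit is reverted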
The use of bots and user access levels in Wikipedia marks the extent to which the community has devised policies and socially deliberated rules (e.g., when an editor should have the capacity to create new articles) to prevent vandalism and ineffective editing, and in doing so increases the site’s overall credibility. However, these two features are merely the first level of defense against problematic information in the encyclopedia, as misinformation and disinformation are more difficult to police than vulgar words and editing mistakes. Bots and other automated implementations of policy simply fend off lower-level issues and obvious vandalism, and they help keep new and anonymous editors at bay when those editors attempt to add misinformation, which reduces the volume of problematic content. Wikipedia’s community-driven policies are far more influential in the complex system of information evaluation that contributes to Wikipedia’s lasting impact on information representation.

Community Policies

Wikipedia has dozens of policies that govern its community, in fifteen different categories, ranging from content and editing to behavioral guidelines. Three of the main policies that are responsible for engendering trust in Wikipedia through battling fake news and misinformation are verifiability, NPOV, and notability. Wikipedia’s policy of verifiability guides editors’ uses of secondary sources to create new content, while NPOV encourages the balanced coverage of topics and protects against bias, and finally, notability influences decisions regarding what subjects should be included in the encyclopedia. All three of these policies emerge from the five pillars of Wikipedia, which Ryan McGrady has previously explicated through the lens of procedural rhetoric (McGrady, 2013).
1.
Wikipedia is an encyclopedia.
2.
Wikipedia is written from a NPOV.
3.
Wikipedia is free content that anyone can use, edit, and distribute.
4.
Wikipedia’s editors should treat each other with respect and civility.
5.
Wikipedia has no firm rules. (“Wikipedia:Five Pillars,” 2019)
These “Five Pillars” inform and form the basis for the policies that most closely help us understand how Wikipedia functions as a space to battle misinformation and as an experience in information literacy pedagogy.

Verifiability (WP:V)

Information validation in Wikipedia is largely a process of its verifiability policy and related procedures. In Wikipedia, verifiability refers to the encyclopedia’s strict adherence to a “no original research” policy in which all content added to mainspace must be verifiable, by any individual encountering that content, through a secondary and reliable source. When an editor attempts to make an addition to the encyclopedia, even if that addition involves information that the editor is confident of through firsthand experience, the content must be verified and verifiable through a secondary source. Verifiability is ensured through the careful practice of citation and reference to published, secondary sources. Wikipedia policy further explains that the “burden to demonstrate verifiability lies with the editor who adds or restores material, and is satisfied by providing an inline citation to a reliable source that directly supports the contribution” (“Wikipedia:Verifiability,” 2019). Such an assignment of responsibility for the burden of verifiability demonstrates the community’s authorship of “rules of behavior”—specific procedural arguments meant to carefully and thoroughly vet the addition of new content.
Wikipedia’s policy on verifiability not only lays out the need for verifiable information taken from other published sources but also clarifies what counts as a reliable source. This policy specifically states that “articles must be based on reliable, independent, published sources with a reputation for fact-checking and accuracy” (“Wikipedia:Verifiability,” 2019), particularly naming academic sources as the most ideal sources for verifiability. Numerous pages link from this policy page, offering dozens of pages of text that list examples of reliable and unreliable sources, explain how to think about information in context, and provide guidelines for making better decisions about sources. Despite appearing to be a simple statement about the need to verify information with an external source, this policy includes remarkably robust guidelines on the trustworthiness of information and how to make decisions about it.
The policy of verifiability is enacted in Wikipedia in numerous ways, as editors may challenge and revert unsourced content, annotate such content with a “[citation needed]” tag, or take editorial action to provide a verifiable reference for unsourced or poorly sourced content. All three constitute an immersive pedagogical experience in information literacy. Furthermore, readers of Wikipedia encountering the “[citation needed]” tag are also exposed to the pedagogy of Wikipedia as they question the accuracy of the information provided and are introduced to Wikipedia policy. If the reader chooses to click on the tag, for example, they are directed to the information page on “[citation needed],” which also references and links to WP:V. The pedagogical aspects of verifiability are further extended through Wikipedia subcultures and tools. The page for WikiProject Reliability, for instance, identifies the project’s primary goal as “ensur[ing] that content in articles is verifiable” (“Wikipedia:WikiProject Reliability,” 2019). The WikiProject asks its members to “[i]dentify and tag claims that require verification with appropriate templates,” “[p]erform fact and reference checks for articles with verification templates,” and “[p]rovide assistance with factual verification to editors” (“Wikipedia:WikiProject Reliability,” 2019). A final example of the policy’s enactment and educational features is the Citation Hunt Tool, also linked to from WikiProject Reliability. This tool aggregates samples of mainspace article content that have been tagged as needing a citation and provides readers and would-be editors with “snippets” so that they might add a verifiable reference. A leaderboard collects data on the editors who “fix” the most passages (Citation Hunt Leaderboard, n.d.), providing a gamified experience in information literacy intervention. These tasks engage members of the community in information literacy practices as they work to ensure reliability and combat disinformation in the encyclopedia.
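As a rough illustration of the kind of work that the “[citation needed]” tag and tools like Citation Hunt make visible, the sketch below scans raw wikitext for citation-needed templates and reports the text immediately preceding each tag. The sample wikitext, the template-matching pattern, and the function name are simplifications we have invented for illustration; real tools parse article markup far more carefully.

import re

# The {{Citation needed}} template (and its common shorthand {{cn}}) marks
# claims that lack the inline citation required by the verifiability policy.
CITATION_NEEDED = re.compile(r"\{\{\s*(?:citation needed|cn)[^}]*\}\}", re.IGNORECASE)

def unsourced_claims(wikitext: str, context: int = 60) -> list[str]:
    """Return a short window of text preceding each citation-needed tag,
    approximating the 'snippets' that Citation Hunt shows to editors."""
    claims = []
    for match in CITATION_NEEDED.finditer(wikitext):
        start = max(0, match.start() - context)
        claims.append(wikitext[start:match.start()].strip())
    return claims

sample = (
    "The policy dates to 2003.<ref>Example source.</ref> "
    "It is among the most cited rules on the site.{{Citation needed|date=July 2020}}"
)
for claim in unsourced_claims(sample):
    print(claim)  # prints the unsourced claim so an editor can supply a reference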

Neutral Point of View (WP:NPOV)

Both the “no original research” and “verifiability” policies have their origins in NPOV. When secondary sources conflict, editors are encouraged to balance coverage by following NPOV, yet another policy that aids editors in validating and verifying information accuracy and controlling bias. One of the oldest policies in Wikipedia (appearing in 2001), NPOV attempts to provide balanced coverage of actual sources and, in doing so, potentially combats amplification and opinion biases. According to “Wikipedia:Neutral Point of View” (2020),
[a]ll encyclopedic content on Wikipedia must be written from a neutral point of view (NPOV), which means representing fairly, proportionately, and, as far as possible, without editorial bias, all of the significant views that have been published by reliable sources on a topic.
Furthermore, NPOV asserts that articles should explain opposing viewpoints rather than favoring one or the other and that such favoring can happen in both the structure and the content of an article. NPOV forwards an epistemology in which editors are requested to “describe disputes” rather than “engage” them. Finally, editors are expected to provide complete information from multiple reliable sources to best represent controversial subjects. The policy article on NPOV offers the following “principles” to help “achieve the level of neutrality that is appropriate for the encyclopedia”:
Avoid stating opinions as facts.
Avoid stating seriously contested assertions as facts.
Avoid stating facts as opinions.
Prefer nonjudgmental language.
Indicate the relative prominence of opposing views. (“Wikipedia:Neutral Point of View,” 2020)
The policy goes beyond content to also suggest how an article’s structure might be carefully safeguarded against biases:
[p]ay attention to headers, footnotes, or other formatting elements that might unduly favor one point of view, and watch out for structural or stylistic aspects that make it difficult for a reader to fairly and equally assess the credibility of all relevant and related viewpoints. (“Wikipedia:Neutral Point of View,” 2020)
NPOV also governs the weight given to cited ideas: an article should represent the different aspects of a topic, but only insofar as it weights those aspects in proportion to their presence in reliable sources. The part of the NPOV policy that deals with “due” or “undue” weighting offers careful guidelines on how to weight articles appropriately, warning that “Wikipedia policy does not state or imply that every minority view or extraordinary claim needs to be presented along with commonly accepted mainstream scholarship as if they were of equal validity” (“Wikipedia:Neutral Point of View,” 2020). This policy helps to combat the “all sides are valid” claim that plagues many fringe political arguments with spurious claims and beliefs. It also helps to properly weight articles such as “Climate Change” to accurately represent mainstream scholarship’s overwhelming consensus on the matter, while giving extremely little space to competing claims, as the scholarship supporting those claims is few and far between.
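The proportionality behind “due weight” can be illustrated numerically. The sketch below uses invented source counts purely for illustration; on Wikipedia itself, weight is a matter of editorial judgment and consensus about the reliable literature, not a formula.

# Invented tallies of reliable sources supporting each viewpoint on a topic.
source_counts = {
    "mainstream scientific consensus": 97,
    "competing fringe claims": 3,
}

total_sources = sum(source_counts.values())

# Due weight: coverage roughly proportional to prominence in reliable
# sources, rather than a false 50/50 balance between "both sides."
for viewpoint, count in source_counts.items():
    share = count / total_sources
    print(f"{viewpoint}: roughly {share:.0%} of the article's coverage")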
The policy of NPOV is enacted in Wikipedia in numerous ways, such as pointing out problems on the talk page or the editor’s user page, annotating the page with a “[POV]” tag, and filing a request for comment or a report on the NPOV noticeboard. However, neutrality is an ongoing conversation that relies on consensus; hence, it is not as cut and dried as verifiability. Editors are encouraged to use the noticeboard to bring in other editors to discuss the neutrality of an article, in the hope of finding a balance in both language and representation. Editors are also encouraged to discuss their disputes over the neutrality of an article rather than simply reverting content, and to document disputes over controversial subjects (“Wikipedia:NPOV Dispute,” 2020). Instead of taking sides in the argument, editors are encouraged to document the different sides (balanced with sources, of course). Ultimately, NPOV fosters discussion around facts and representation, which helps ensure that information remains accurate and representative of what is available to summarize. Furthermore, in the enactment of NPOV policy toward a public information literacy pedagogy, Wikipedia also encourages and facilitates critical discussion of information neutrality.

Notability (WP:N)

Notability directly influences the creation (and deletion) of new articles. Notability is described on the policy page as a “test used by editors to determine whether a given topic warrants its own article” (“Wikipedia:Notability,” 2019). Wikipedia’s “general notability guideline” is as follows: “[i]f a topic has received significant coverage in reliable sources that are independent of the subject, it is presumed to be suitable for a stand-alone article or list” (“Wikipedia:Notability,” 2019). Within this policy page, each of the bolded terms is further explained and defined. Significant coverage, for instance, “addresses the topic directly and in detail [and] . . . is more than a trivial mention.” The construct of “reliability,” furthermore, calls for “editorial integrity” in line with Wikipedia’s separate “reliable source guideline.” Multiple (reliable) and secondary sources are expected and, in some cases, required to prove notability. The requirement that such sources should be “independent of the subject” does not allow for the usage or inclusion of sources created “by the article’s subject or someone affiliated with it” (“Wikipedia:Notability,” 2019).
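Read procedurally, the general notability guideline amounts to a set of tests applied to the sources available for a proposed topic. The sketch below encodes that reading using a hypothetical source record of our own devising; on Wikipedia itself, each criterion is debated by editors rather than computed, and the threshold for “multiple” sources is a matter of judgment.

from dataclasses import dataclass

@dataclass
class Source:
    """Hypothetical record of one source covering a proposed article topic."""
    reliable: bool               # editorial integrity per the reliable source guideline
    independent: bool            # not produced by the subject or its affiliates
    significant_coverage: bool   # direct, detailed coverage; more than a trivial mention

def presumed_notable(sources: list[Source], minimum: int = 2) -> bool:
    """General notability guideline read as a checklist: enough sources that
    are each reliable, independent, and significant."""
    qualifying = [
        s for s in sources
        if s.reliable and s.independent and s.significant_coverage
    ]
    return len(qualifying) >= minimum

topic_sources = [
    Source(reliable=True, independent=True, significant_coverage=True),
    Source(reliable=True, independent=False, significant_coverage=True),  # e.g., a press release
]
print(presumed_notable(topic_sources))  # False: only one source is independent

Encoding the guideline this way also makes its dependence on existing coverage obvious, which is precisely the limitation discussed next.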
Notability has come under fire lately, and rightfully so, as it is the policy that has been used to justify excluding numerous women’s biographies despite their male counterparts’ presence on Wikipedia. Most notably, Donna Strickland, the first female Nobel Laureate in Physics in 55 years, did not have a Wikipedia page until after winning the Nobel Prize. Further investigation into this matter showed that her page had been drafted and submitted before, only to be taken down by the claim that she was not notable (Cecco, 2018). Katherine Maher, the executive director of the Wikimedia Foundation, responded to the (understandable) outcry of concern, stating that “Wikipedia is a mirror of the world’s gender biases,” as the notability policy is based on the amount of press coverage a person has received. Since many women receive less press coverage than their male counterparts, Wikipedia’s representation of women is plagued by a larger systemic gender bias (Maher, 2018). In this manner, the notability policy is a double-edged sword, as it both acts as a gatekeeper against “everyone needing a page” and keeps out potentially important biographies due to its reliance on independent journalistic coverage. Despite these problems, the policy of notability furthers public information literacy specifically because it relies on editors’ decisions concerning significant coverage through reliable sources.

Conclusion

Researchers have already argued that Wikipedia’s community-driven practices can be leveraged for educational purposes—especially in terms of critical media literacy (Cummings, 2009; Jiang & Vetter, 2019; Nelson, 2018; Vetter, 2018; Vetter et al., 2019). What we suggest here is that community-driven policies and the procedures following the enactment of those policies need to be better understood and explored beyond traditional models of education. Recognizing these policies as the construction of a particular credibility, an ethos, should prompt academic researchers and teachers, as well as stakeholders in public policy and information literacy, to seek out methods and outlets for expanding and promoting Wikipedia literacy and participation.
In particular, by analyzing Wikipedia’s policies and how they value particular types of information and language, we have illustrated that one of the core strengths of the Wikipedia community is a strict adherence to traditional information value hierarchies as found in academia. Independent, verifiable sources are key to Wikipedia: peer-reviewed journals take precedence, followed by academic book publishers and then high-quality national and international journalism. Information comes from multiple sources, and in Wikipedia it is seen as a representation of the “conversation” of what is out there, much like an academic literature review.
As information has become more easily accessible, so has the ability to appear credible. This is not the first time we have fought disinformation, nor will it be the last (Marche, 2018). For better or worse, Wikipedia has taken up the mantle of arbiter of truth and knowledge, and, for the most part, it has done an excellent job of it—so much so that other platforms, especially personal assistants such as Alexa and Siri, use Wikipedia’s information to police and contribute data to their own systems (Bariso, 2018; Dickey, 2019; Farokhmanesh, 2018; McCracken, 2018; Simonite, 2019).
In light of Wikipedia’s communicative and community-driven resilience, we recommend three ways in which we can learn from Wikipedia when addressing other information platforms. First, recognizing these community-mediated policies offers an opportunity to contrast Wikipedia’s participatory social platform with the more commercialized digital media sites from which users seek information. Problems related to disinformation and the fake news crisis are exacerbated in mainstream digital media sites (e.g., Facebook) by their inherent commercialization. The incentivization of disinformation is, in many ways, tied to systems of advertising in which revenue from click-throughs and pageviews serves as a dominant motive for content creation. Unlike other digital media sites, Wikipedia does not use cookies to track, collect data on, or predict the behavior of its users. This is a radical departure from nearly every other space on the internet. Perhaps this is because Wikipedia has remained a veritable dinosaur while most spaces on the internet have moved away from the logics of Web 2.0, away from a participatory web and toward a predictive web based on advertising revenue. What this means for other digital media platforms’ war on disinformation is, however, more complicated, as their business models rely on user data to sell advertising and services. Platforms wishing to combat this type of disinformation have constructed a system in which they are perpetually in an adversarial position against those seeking to profit from advertising revenue, locked into an ongoing (and ultimately losing) game. In short, those who wish to exploit the system will always be one step ahead, as there will always be new ways to generate revenue in this manner. However, policies and procedures enacted by large-scale platforms are a first step toward disincentivizing particular types of fake news and disinformation, as we have seen recently with Facebook’s policies around anti-vaccination groups and white nationalism (Ingber, 2019; “Standing Against Hate,” 2019).
In short, it would behoove platforms to tread carefully when it comes to utilizing and selling data, as well as to build more robust policies that disincentivize fake news and disinformation. Platforms such as Facebook have paid much lip service to the security and ownership of user data but continue to allow sponsored spam advertisements and problematic paid and targeted content. We understand that the current economic model of the internet makes this tricky, but polluting users’ experiences with large quantities of disinformation does more than spread it around; it actively discourages users from thinking of the platform as a community.
Second, Wikipedia assists in learning and experiencing information literacy in much more effective and non-exploitative ways, and this stems from the design of the platform, its policies, and the community’s dedication. The question is not whether students, young people, and everyone else are using Wikipedia, but whether and how people trust that information and how they make decisions about the information they encounter using these frameworks. It is clear that Wikipedia’s community is on to something with its commons-based peer production method of information production, as it reconfigures authoritarian knowledge structures while doubling down on many “traditional” sources of knowledge. Wikipedia illustrates that (1) returning to valuing traditional knowledge hierarchies can be incredibly helpful for “making sense” of information, (2) policies and practices are part of the “fake news” solution, but issues are always going to be baked into a system that monetizes and incentivizes clicks and advertising, and (3) these issues will always be problematic in controlled digital media platforms that are not community driven.
Wikipedia’s socially mediated policies and procedures help to reconstruct more traditional models of authority to uphold the credibility of the encyclopedia and protect against problematic information (Jack, 2017). These policies and procedures also provide pedagogical opportunities for those interacting with the encyclopedia beyond higher education institutions. While user-editors may engage more with Wikipedia policy as pedagogy, and thus learn more about the complex process of information validation in networked environments, casual visitors and readers can also benefit from being exposed to information vetting in the encyclopedia.
Platforms wishing to learn from Wikipedia should take note of Wikipedia’s commitment to secondary, independent, and reliable information as a basis for inclusion in the encyclopedia. Acknowledging that not all information holds the same reliability illustrates the need for a knowledge hierarchy, one that should be apparent in other platforms’ designs. Whether this comes in the form of simply tagging information as “news” or “opinion” or going so far as greylisting and blacklisting certain sites, making clear that a knowledge hierarchy exists, especially when it comes to disinformation and misinformation, can help counteract platform designs that encourage the belief that all information is equal.
Third, Wikipedia’s functioning as a community of practice, in contrast to other digital media platforms, highlights how Web 2.0 has all but disappeared in its most traditional sense. Digital spaces that were traditionally crowdsourced have been largely overtaken by commercial enterprises and platforms. For example, the YouTube “bargain” of contributing to a community in exchange for some advertising revenue has been replaced by a “new bargain” (Soha & McDowell, 2016) that expects different types of contributions and data usage. As internet and technology giants such as Facebook, Google, and Apple continue to accumulate and consolidate power over more of the web, this trend will only worsen. In contrast to Web 2.0, which emphasized “user-generated content, ease of use, participatory culture and interoperability for end user” (“Web 2.0,” 2019, emphasis ours), the internet after Web 2.0 (which has been variously termed the Semantic Web, Web 3.0, and Web 4.0) commodifies and capitalizes on users as data rather than as contributors or participants.
Wikipedia remains an outlier among these popular digital media platforms, of course, as it is specifically an encyclopedia project and not a networking site, search engine, or other type of commercialized platform. In short, Wikipedia has a stated community purpose, while the others are shells to host a variety of information. That being said, Wikipedia is a community, and like all communities, it has rules, expectations, and norms. So, while comparing Wikipedia with Facebook or Google is not apples to apples (or even apples to oranges), they are all extremely popular spaces where people get information, and that information is, to some extent, curated and annotated by the community.
Platforms wishing to learn from Wikipedia here must take note of the investment in Wikipedia’s community, as well as that of others in the past (e.g., YouTube before the “new bargain”). The participatory web was successful in many ways and engendered feelings of community among participants. Wikipedians rally around their space as a massive group project, but others have been successful in building spaces for a variety of projects and voices. Communities find ways to self-police because they care about the space they occupy and build, reducing the overall policing load on the platform, but only once the community feels it has a space worth defending.
All other aspects of these lessons dovetail into the final recommendation: platforms cannot simply police themselves and will always fail if their communities do not care about the space. From this, we can see how selling user data incentivizes the spread of disinformation through targeting, which creates a space that cannot be trusted, and how designing an information space to respect knowledge hierarchies will help ensure that communities experience information in a way that prioritizes investigative journalism and peer-reviewed science and de-prioritizes (or at least clearly distinguishes) partisan opinion pieces and other potentially problematic information. These and other related policies and practices can help to foster community spaces that prioritize, share, and celebrate good information, as well as recognize when misinformation and disinformation sneak in.
Through examining Wikipedia policies related to bots and auto-confirmation “rules,” verifiability, NPOV, and notability, we can see not only how Wikipedia helps to combat mis/disinformation but also how these policies support Wikipedia’s role as an open educational resource. Furthermore, as students both inside and outside the classroom interact with Wikipedia, we can understand how such interaction teaches strategies for identifying and combating problematic information. That being said, the English Wikipedia is accessed by an average of 858 million unique devices per month, while only 66,000 editors are considered active (5 or more edits in a given month), about 0.008% (“Wikipedia:Statistics,” 2019). Considering that only a very small percentage of users are active editors on Wikipedia, both academics and Wikipedia community members need to continue to encourage more participation in the encyclopedia.
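For readers who wish to verify the proportion cited above, the figures work out as follows; this is only a quick arithmetic check of the cited statistics, using rounded values.

monthly_readers = 858_000_000  # unique devices accessing English Wikipedia per month
active_editors = 66_000        # editors making 5 or more edits in a given month

print(f"{active_editors / monthly_readers:.3%}")  # prints 0.008%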
Ultimately, this acknowledgment of the changing web prompts us to encourage and explore more participatory digital practices that wiki platforms and other forms of media afford while attending to Wikipedia’s community-driven practices. Furthermore, the recommendations offered here open new avenues for learning from and with Wikipedia about how we might better investigate, research, and maintain a critical lens on the larger issues in representations of “truth”—especially in the current crisis of fake news and disinformation.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship and/or publication of this article: Funding for the Open Access Article Processing Charges provided by the UIC Research Open Access Publishing Fund.

References

AfroCrowd. (2019, December 22). Wikipedia, the Free Encyclopedia. Retrieved December 22, 2019, from https://en.wikipedia.org/w/index.php?title=AfroCrowd&oldid=931918305
Alcoff L. M. (2007). Epistemologies of ignorance: Three types. In Sullivan S., Tuana N. (Eds.), Race and epistemologies of ignorance (pp. 39–58). State University of New York.
Bariso J. (2018, September 27). Amazon just donated $1 million to Wikipedia. Here’s why it matters. Inc. https://www.inc.com/justin-bariso/amazon-wikimedia-wikipedia-donation-1-million-emotional-intelligence.html
Black E. (2010). Wikipedia—The dumbing down of world knowledge. History News Network. http://historynewsnetwork.org/article/125437
Bogost I. (2008). The rhetoric of video games. In Salen K. (Ed.), The ecology of games: Connecting youth, games, and learning (pp. 117–140). The MIT Press.
Bogost I. (2010). Persuasive games: The expressive power of videogames. The MIT Press.
boyd d. (2017, January 7). Did media literacy backfire? Data & Society [Blog post]. https://points.datasociety.net/did-media-literacy-backfire-7418c084d88d
Brown A. (2011). Wikipedia as a data source for political scientists: Accuracy and completeness of coverage. PS: Political Science & Politics, 44(2), 339–343. https://doi.org/10.1017/S1049096511000199
Brown J. (2015). Ethical programs: Hospitality and the rhetorics of software. University of Michigan Press.
Cecco L. (2018, October 3). Female Nobel prize winner deemed not important enough for Wikipedia entry. The Guardian. https://www.theguardian.com/science/2018/oct/03/donna-strickland-nobel-physics-prize-wikipedia-denied
Cummings R. E. (2009). Lazy virtues: Teaching writing in the age of Wikipedia. Vanderbilt University Press.
Cummings R. E., DiLauro F. (2017, June). Student perceptions of writing with Wikipedia in Australian higher education. First Monday, 22(6). https://firstmonday.org/ojs/index.php/fm/article/view/7488
Dawe L., Robinson A. (2017). Wikipedia editing and information literacy: A case study. Information and Learning Science, 118(1–2), 5–16.
Dickey M. R. (2019, January 22). Google.org donates $2 million to Wikipedia’s parent org. TechCrunch. https://techcrunch.com/2019/01/22/google-org-donates-2-million-to-wikipedias-parent-org/
Farokhmanesh M. (2018, March 14). YouTube didn’t tell Wikipedia about its plans for Wikipedia. The Verge. https://www.theverge.com/2018/3/14/17120918/youtube-wikipedia-conspiracy-theory-partnerships-sxsw
Forsyth P. (2018, August 23). How Wikipedia dodged public outcry plaguing social media platforms. Wiki Strategies. https://wikistrategies.net/how-wikipedia-dodged-public-outcry-plaguing-social-media-platforms/
Garrison J. C. (2015, October). Getting a “quick fix”: First-year college students’ use of Wikipedia. First Monday, 20(10). https://firstmonday.org/ojs/index.php/fm/article/view/5401/5003
Geiger R. S. (2011). The lives of bots. In Lovink G., Tkacz N. (Eds.), Critical point of view: A Wikipedia reader (pp. 78–93). Institute of Network Cultures.
Geiger R. S., Halfaker A. (2013, August 5–7). When the levee breaks: Without bots, what happens to Wikipedia’s quality control processes? [Conference session]. Proceedings of the 9th International Symposium on Open Collaboration, Hong Kong, China.
Glott R., Schmidt P., Ghosh R. (2010). Wikipedia survey–overview of results. United Nations University: Collaborative Creativity Group, 8, 1158–1178.
Graham M. (2011). Wiki space: Palimpsests and the politics of exclusion. In Lovink G., Tkacz N. (Eds.), Critical point of view: A Wikipedia reader (pp. 269–282). Institute of Network Cultures.
Gruwell L. (2015). Wikipedia’s politics of exclusion: Gender, epistemology, and feminist rhetorical (in)action. Computers and Composition, 37, 117–131.
Hill B. M. (2013). Almost Wikipedia: What eight early online collaborative encyclopedia projects reveal about the mechanisms of collective action. In Essays on volunteer mobilization in peer production [Doctoral dissertation]. Massachusetts Institute of Technology.
Hood C. L. (2007). Editing out obscenity: Wikipedia and writing pedagogy. Computers and Composition Online. http://www2.bgsu.edu/departments/english/cconline/wiki_hood/index.html
Hwang T. J., Bourgeois F. T., Seeger J. D. (2014). Drug safety in the digital age. New England Journal of Medicine, 370(26), 2460–2462. https://doi.org/10.1056/NEJMp1401767
Ingber S. (2019, March 27). Facebook bans White nationalism and separatism content from its platforms. Npr.org. https://www.npr.org/2019/03/27/707258353/facebook-bans-white-nationalism-and-separatism-content-from-its-platforms
Jiang J., Vetter M. A. (2019). The good, the bot, and the ugly: Problematic information and critical media literacy in the postdigital era. Postdigital Science and Education, 2, 78–94.
Kamenetz A. (2017, February 22). What students can learn by writing for Wikipedia. Npr.org. https://www.npr.org/sections/ed/2017/02/22/515244025/what-students-can-learn-by-writing-for-wikipedia
Keegan B., Fiesler C. (2017, May 15–18). The evolution and consequences of peer producing Wikipedia’s rules [Conference session]. Eleventh International AAAI Conference on Web and Social Media, Montréal, QC, Canada.
Kill M. (2012). Teaching digital rhetoric: Wikipedia, collaboration, and the politics of free knowledge. In Hirsch B. (Ed.), Digital humanities pedagogy: Practices, principles and politics (pp. 389–405). Open Book.
Konieczny P. (2009). Wikipedia: Community or social movement? Interface: A Journal for and about Social Movements, 1(2), 212–232.
Konieczny P. (2012, September). Wikis and Wikipedia as a teaching tool: Five years later. First Monday, 17(9). http://firstmonday.org/article/view/3583/3313
Kräenbring J., Monzon Penza T., Gutmann J., Muehlich S., Zolk O., Wojnowski L., . . . Sarikas A. (2014). Accuracy and completeness of drug information in Wikipedia: A comparison with standard textbooks of pharmacology. PLOS ONE, 9(9), Article e106930. https://doi.org/10.1371/journal.pone.0106930
Kuhne M., Creel G. (2012). Wikipedia, “The people formerly known as the audience.” Teaching English in the Two-Year College, 40(2), 177–189.
Lakoff G. P., Duran G. (2018, June 13). Trump has turned words into weapons. And he’s winning the linguistic war. The Guardian. https://www.theguardian.com/commentisfree/2018/jun/13/how-to-report-trump-media-manipulation-language
MacKenzie A., Bhatt I. (2020). Lies, bullshit and fake news: Some epistemological concerns. Postdigital Science and Education, 2, 9–13.
Maher K. (2018, December 6). Wikipedia is a mirror of the world’s gender biases. Wikimedia Foundation. https://wikimediafoundation.org/news/2018/10/18/wikipedia-mirror-world-gender-biases/
Marche S. (2018, April 23). How we solved fake news the first time. The New Yorker. https://www.newyorker.com/culture/cultural-comment/how-we-solved-fake-news-the-first-time
Martin B. (2018). Persistent bias on Wikipedia: Methods and responses. Social Science Computer Review, 36(3), 379–388.
McCracken H. (2018, March 13). YouTube will use Wikipedia to fact-check Internet hoaxes. Fast Company. https://www.fastcompany.com/40543971/youtube-will-use-wikipedia-to-fact-check-internet-hoaxes
McGrady R. (2013). Ethos [edit]: Procedural rhetoric and the Wikipedia project. In Folk M., Apostel S. (Eds.), Online credibility and digital ethos: Evaluating computer-mediated communication (pp. 114–130). IGI Global.
Menking A., Erickson I. (2015, April 18–20). The heart work of Wikipedia: Gendered, emotional labor in the world’s largest online encyclopedia [Conference session]. Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Korea.
Menking A., Erickson I., Pratt W. (2019, May). People who can take it: How women Wikipedians negotiate and navigate safety [Conference session]. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK.
Miller R. (2004, July 28). Wikipedia founder Jimmy Wales responds. Slashdot. https://slashdot.org/story/04/07/28/1351230/wikipedia-founder-jimmy-wales-responds
Nelson J. D. (2018). Rhetorical interventions: A project design for composing and editing Wikipedia articles. In Blair K. L., Nickoson L. (Eds.), Composing feminist inventions: Activism, engagement, praxis (pp. 489–503). University Press of Colorado.
Nguyen T. C. (2018). Echo chambers and epistemic bubbles. Episteme, 17, 141–161. https://doi.org/10.1017/epi.2018.32
Noble S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
Oliver J. T. (2015). One-shot Wikipedia: An edit-sprint toward information literacy. Reference Services Review, 43(1), 81–97.
O’Neil C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. Crown.
Patch P. (2010). Meeting student writers where they are: Using Wikipedia to teach responsible scholarship. Teaching English in the Two-Year College, 37(3), 278–285.
Prabhala A. (2011). Research:Oral citations. Meta-Wiki. https://meta.wikimedia.org/wiki/Research:Oral_Citations
Purdy J. P. (2009). When the tenets of composition go public: A study of writing in Wikipedia. College Composition and Communication, 61(2), 351–373.
Raval N. (2014). The encyclopedia must fail!—Notes on queering Wikipedia. ADA: A Journal of Gender, New Media, & Technology. https://adanewmedia.org/2014/07/issue5-raval/
Reilly C. A. (2011, January). Teaching Wikipedia as a mirrored technology. First Monday, 16(1). http://firstmonday.org/article/view/2824/2746
Reliability of Wikipedia. (2019, February 4). Wikipedia, the Free Encyclopedia. Retrieved February 4, 2020, from https://en.wikipedia.org/w/index.php?title=Reliability_of_Wikipedia&oldid=938793197
Rijshouwer E. A. (2019). Organizing democracy: Power concentration and self-organization in the evolution of Wikipedia. Erasmus University Rotterdam. http://hdl.handle.net/1765/113937
Roth A., Davis R., Carver B. (2013, June). Assigning Wikipedia editing: Triangulation toward understanding university student engagement. First Monday, 18(6). https://firstmonday.org/ojs/index.php/fm/article/view/4340/3687
Shaw A., Hill B. M. (2014). Laboratories of oligarchy? How the Iron Law extends to peer production. Journal of Communication, 64(2), 215–238. https://doi.org/10.1111/jcom.12082
Simonite T. (2019, February 15). Inside the Alexa-friendly world of Wikidata. Wired. https://www.wired.com/story/inside-the-alexa-friendly-world-of-wikidata/
Soha M., McDowell Z. J. (2016). Monetizing a meme: YouTube, content ID, and the Harlem shake. Social Media + Society, 2(1). https://doi.org/10.1177/2056305115623801
Soll J. (2016, December 18). The long and brutal history of fake news. POLITICO Magazine. https://www.politico.com/magazine/story/2016/12/fake-news-history-long-violent-214535
Standing against hate. (2019, March 27). Facebook Newsroom. https://newsroom.fb.com/news/2019/03/standing-against-hate/
Stanford History Education Group. (2016). Evaluating information: The cornerstone of civic online reasoning. Executive summary [PDF file]. https://stacks.stanford.edu/file/druid:fv751yt5934/SHEG%20Evaluating%20Information%20Online.pdf
Sweeney M. (2012). The Wikipedia project: Changing students from consumers to producers. Teaching English in the Two-Year College, 39(3), 256–267.
Tardy C. M. (2010). Writing for the world: Wikipedia as an introduction to academic writing. English Teaching Forum, 48(1), 12–27.
Vetter M. A. (2013). Composing with Wikipedia: A classroom study of online writing. Computers and Composition Online. http://candcblog.org/mvetter/public_html/composingwithwikipedia/
Vetter M. A. (2014). What composition students and academic libraries can gain from digital- collaborative pedagogies. Composition Studies, 42(1), 35–53.
Vetter M. A. (2018). Teaching Wikipedia: Appalachian rhetoric and the encyclopedic politics of representation. College English, 80(5), 397–422.
Vetter M. A., McDowell Z., Stewart M. (2019). From opportunities to outcomes: The Wikipedia-based writing assignment. Computers and Composition: An International Journal, 52, 53–64.
Wadewitz A. (2013, April 9). Wikipedia is pushing the boundaries of scholarly practice but the gender gap must be addressed. Impact of Social Sciences Blog. https://blogs.lse.ac.uk/impactofsocialsciences/2013/04/09/change-the-world-edit-wikipedia/
Web 2.0. (2019, September 19). Wikipedia, the Free Encyclopedia. Retrieved September 19, 2019, from https://en.wikipedia.org/w/index.php?title=Web_2.0&oldid=916488578
Wikimedia Foundation. (2019, September 19). Wikipedia, the Free Encyclopedia. Retrieved September 24, 2019, from https://en.wikipedia.org/w/index.php?title=Wikimedia_Foundation&oldid=916474342
Wikipedia:Administration. (2019, September 19). Wikipedia, the Free Encyclopedia. Retrieved September 24, 2019, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Administration&oldid=933559818
Wikipedia:Bot policy. (2019, September 9). Wikipedia, the Free Encyclopedia. Retrieved September 24, 2019, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Bot_policy&oldid=928961433
Wikipedia:Bots. (2019, June 7). Wikipedia, the Free Encyclopedia. Retrieved April 10, 2019, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Bots&oldid=934892560
Wikipedia:Bots/requests for approval/ClueBotNG. (2019, March 15). Wikipedia, the Free Encyclopedia. Retrieved September 24, 2019, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Bots/Requests_for_approval/ClueBot_NG&oldid=887856776
Wikipedia:Five pillars. (2019, July 31). Wikipedia, the Free Encyclopedia. Retrieved September 24, 2019, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Five_pillars&oldid=935643964
Wikipedia:NPOV dispute. (2020, January 29). Wikipedia, the Free Encyclopedia. Retrieved February 4, 2020, from https://en.wikipedia.org/w/index.php?title=Wikipedia:NPOV_dispute&oldid=938133795
Wikipedia.org is more popular than . . . . (2018, March 30). Meta-Wiki. Retrieved January 28, 2020, from https://meta.wikimedia.org/w/index.php?title=Wikipedia.org_is_more_popular_than…&oldid=17884685
Wikipedia:Statistics. (2019, September 9). Wikipedia, the Free Encyclopedia. Retrieved January 24, 2020, from https://en.wikipedia.org/w/index.php?title=Wikipedia:Statistics&oldid=958184495
Wikipedia:User access levels. (2020, June 23). Wikipedia, the Free Encyclopedia. Retrieved June 23, 2020, from https://en.wikipedia.org/w/index.php?title=Wikipedia:User_access_levels&oldid=964005194

Biographies

Zachary J. McDowell (PhD, University of Massachusetts, Amherst) is an assistant professor of Communication at the University of Illinois at Chicago. His research interests include digital literacy, information policy, access, knowledge equity, and knowledge cultures.
Matthew A. Vetter (PhD, Ohio University) is an assistant professor of Composition and Applied Linguistics at Indiana University of Pennsylvania. His research interests include digital culture, writing studies, and critical media literacy.

Keywords

Wikipedia, disinformation, community practices, platform studies

Rights and permissions

© The Author(s) 2020.
This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (https://creativecommons.org/licenses/by-nc/4.0/) which permits non-commercial use, reproduction and distribution of the work without further permission provided the original work is attributed as specified on the SAGE and Open Access page (https://us.sagepub.com/en-us/nam/open-access-at-sage).

Corresponding Author

Zachary J. McDowell, University of Illinois at Chicago, 1007 W Harrison St, Chicago, IL 60607, USA. Email: zjm@uic.edu
