Introduction
A recent study by the
Stanford History Education Group (2016) came to the frightening conclusion that “young people’s ability to reason about the information on the Internet can be summed up in one word: bleak” (p. 4). Contemporary global issues have highlighted that “disinformation” and “fake news” remain major concerns facing modern democratic society, and our current tools and efforts for teaching and communicating effective information literacy require updating (
Jack, 2017). Furthermore, in “It’s Complicated: The Social Lives of Networked Teens,”
danah boyd (2014) points out that students are being told to “avoid Wikipedia” and do their own research. Noting that the students “heard that Google was trustworthy, but Wikipedia was not,”
boyd (2017) wonders if media literacy might have “backfired” and questions whether the critical lens that we tried to instill in students may have instead confused their sense of information value.
As online information becomes increasingly complex and laden with misinformation (
Jack, 2017), information literacy practices that actively combat misinformation, disinformation, and propaganda remain imperative to study and implement. Part of the answer may lie in the community-driven space that we have been told to avoid: Wikipedia. Wikipedia is a decentralized, commons-based peer production community that both advocates for the “don’t trust, do research” mantra of the potentially problematic “media literacy” we have participated in and follows a set of rules, or policies, that relies on, understands, and engages with traditional epistemological foundations. Numerous studies have illustrated that Wikipedia’s community produces and maintains an encyclopedia that is as accurate as (or more accurate than) “traditional” encyclopedias (
“Reliability of Wikipedia,” 2019), notwithstanding doubts regarding its reliability (
Taraborelli, 2012). Despite dealing with a daily onslaught of misinformation, advertisements, and other forms of false editing and authorship, Wikipedia’s community has maintained this reliability for nearly two decades.
In this article, we bridge contemporary education research that addresses the experiential epistemology of learning to use Wikipedia with an understanding of how the inception and design of the platform fights disinformation and fake news via its framework of community-mediated policies. To accomplish this, we review and analyze relevant community policies of Wikipedia that govern decisions about information representation and inclusion, as well as how such decisions are shaped through community procedures. When discussing “procedures,” we refer to examples of the enactment of socially mediated policies. We ultimately argue that Wikipedia has become one of the few places on the internet dedicated to combating problematic information. Furthermore, we make recommendations on how to leverage Wikipedia practices and policies for information literacy policy and education beyond higher education classroom applications.
Of course, Wikipedia has not been without its issues. The encyclopedia community acknowledges challenges related to systemic social biases regarding gender and race (
“AfroCrowd,” 2019;
Glott et al., 2010;
Wadewitz, 2013) and its reliance on print-centric epistemologies (
Graham, 2011;
Prabhala, 2011;
Raval, 2014). Researchers have also examined how harassment of women and trans-identified editors is normalized in the community (
Menking & Erickson, 2015;
Menking et al., 2019). Marginalized (gender) identities often take on extra emotional (
Menking & Erickson, 2015) and identity-related labor to navigate a “spectrum of safe and unsafe spaces [in Wikipedia]” (
Menking et al., 2019) and productively contribute to the community. In terms of community governance, while some researchers have suggested that peer production communities “follow [Robert Michels’s] iron law of oligarchy” rather than more “democratic organizational forms” (
Shaw & Hill, 2014), others have suggested its resilience against such an evolution (
Konieczny, 2009). Indeed, it is Wikipedia’s participatory affordances that guard against oligarchy: the “high level of empowerment of individual Wikipedia editors with regard to policy making, the ease of communication, and the high dedication to ideals of contributors succeed in making Wikipedia an atypical organization, quite resilient to the Iron Law” (
Konieczny, 2009, p. 189). However, Wikipedia is not completely immune from a slide into oligarchy and requires continued efforts from multiple communities to sustain an active and vibrant volunteer base. Through analysis of relevant policy, this article strives to support such work, while acknowledging the problematic issues described above.
From the Classroom to the World
While the majority of studies on Wikipedia, information literacy, and knowledge production have focused on student learning outcomes and experience, this body of literature can also be read as a set of controlled experiments on the experiential effects of engaging with Wikipedia as a knowledge community. What has been gleaned through research on student learning, for instance, can help shed light on the experience of Wikipedia in general, particularly for those who decide to learn to edit and participate in the Wikipedia knowledge community.
Recent research (
Dawe & Robinson, 2017;
Kamenetz, 2017;
Oliver, 2015;
Vetter et al., 2019) illustrates that students who engage with Wikipedia and its community experience information literacy in much more effective ways, learning the skills necessary to combat misinformation and recognize valid information sources. While many students reported having perceived the space as unreliable prior to editing Wikipedia (having been told “don’t use it”), their perception shifted through interaction with Wikipedia and its community, and they showed more trust in the reliability of Wikipedia as an information source. This trust was particularly due to their understanding of the Wikipedia community’s responsive and effective practices for combating misinformation and disinformation, in addition to a better understanding of the structures of Wikipedia (and of information in general).
Wikipedia allows for direct and transparent observation of the practices and concepts integral to combating misinformation, especially practices and concepts related to the writing process, research, social collaboration, and digital rhetoric (
Gruwell, 2015;
Hood, 2007;
Kill, 2012;
Kuhne & Creel, 2012;
Patch, 2010;
Purdy, 2009;
Tardy, 2010). Furthermore, the encyclopedia also provides opportunities for learning these skills and for community integration through a public writing experience with a tangible, authentic audience (
Cummings, 2009;
Sweeney, 2012;
Vetter, 2013), and often results in increased motivation levels for completing assignments (
Cummings, 2009;
Vetter, 2014). Not only does participating in the Wikipedia community assist in learning digital/information literacy, critical research, teamwork, and technology skills, but students also reported greater pride in their work, more time spent, and more satisfaction with their class assignments than with traditional writing assignments. More than just learning topics and skills, they were motivated by their participation in a self-policing community dedicated to representing valid and verifiable information (
Vetter et al., 2019). While such research has primarily focused on applications of Wikipedia-based assignments in formal educational contexts, such opportunities are also available to the broader public as people visit, read, interact with, and edit the encyclopedia. Whether editing on their own or in a group setting such as an editathon, being a “student” does not require a formalized academic setting, and it is our position that anyone who learns how to participate in Wikipedia could benefit in a similar manner.
Moving from the classroom to the larger global Wikipedia user base (over 21 billion pageviews per month), we see that Wikipedia’s community is far different from those of many other digital media sites. Participants are motivated by a sense of belonging to a larger volunteer network seeking to, as Wikipedia founder Jimmy Wales describes it, “[i]magine a world in which every single person on the planet is given free access to the sum of all human knowledge” (
Miller, 2004). This vision manifests in participation within the Wikipedia community: participants are motivated by a grand goal, understand (and continue to learn anew about) their global and diverse audience, and participate in and develop new ways of achieving that goal, particularly in how to represent knowledge equitably, accurately, and verifiably. Essentially, the experience of those who learn to edit Wikipedia and participate in the community is one of combating disinformation and systemic inequalities within the representation of that information. Furthermore, this experience is rooted in the community-generated policies, Wikipedia’s system of decision making, and the procedures that occur as a result of the enactment of policies.
Once a Child, Now Grown Up: How Wikipedia Helps Combat the Disinformation Crisis
As Wikipedia’s 20th anniversary approaches, many of the site’s earlier peers have fallen by the wayside, while the free encyclopedia founded on radical collaboration and reliable sourcing continues to persist. However, the online encyclopedia was often dismissed and disparaged during its first years: although some saw it as a vanguard, many thought the project was destined to fail (
Black, 2010;
Kamm, 2007). Some considered Wikipedia to be inaccurate or even dangerously open. Wikipedia did not fail, but it also did not become the utopian template for the web. In fact, as
Hill (2013) points out, Wikipedia’s policies that we discuss here may have helped its rise in popularity and assisted in both attracting contributors and staving off the “demise” that took many of its progenitors and competitors (
Waldman, 2004). What was once seen by the public as a utopian experiment has become the world’s largest and most popular reference work, supported by a growing and stable foundation. The
“Wikimedia Foundation” (2019), the organization that hosts and runs Wikipedia, has also grown exponentially, from only US$80,000 in revenue in 2003 to over US$100 million in revenue in 2017.
Despite what naysayers claim about Wikipedia’s inaccuracies or dangerous openness, numerous studies have compared Wikipedia’s accuracy favorably with that of “traditional” encyclopedias (
A. Brown, 2011;
Giles, 2005;
Hwang et al., 2014;
Kräenbring et al., 2014;
Taraborelli, 2012). That does not mean the encyclopedia does not continue to battle misinformation and inaccuracies, but that, on major topics, it has remained as reliable as, or more reliable than, other “more trustworthy” publishers. As we will discuss in this article, this is due both to relentless, ongoing efforts by volunteer editors and to the design of the platform, community policies, and the enactment of those policies.
Pete Forsyth, the architect of the Wikimedia Foundation’s Public Policy Initiative, which grew into the Wiki Education Foundation, stated, “Wikipedia exists to battle fake news. That’s the whole point” (
Forsyth, 2018), a fairly bold statement that deserves some unpacking in a climate where disinformation and fake news are rampant. Wikipedia’s battle against fake news, misinformation, and disinformation is waged within and through community-mediated practices and policies put into place in the encyclopedia to verify and validate information, ensure accuracy and neutrality, and guard against bias and misinformation.
Beyond just these community practices, Wikipedia functions differently from most other websites today. Among the world’s most-visited websites, Wikipedia is the only one run by a nonprofit (
“Wikipedia.org Is More Popular Than . . .,” 2018). Furthermore, Wikipedia does not try to predict what you encounter online and does not capture or analyze user data for advertising or content prediction. Unlike Google, Facebook, or other internet giants, Wikipedia does not curate, market, or algorithmically determine information in any way that restructures the results for users (
Hill, 2013). By sticking with these comparatively ancient, pre-tracking technologies, Wikipedia effectively combats fake news by disincentivizing what has been incentivized on nearly every other major platform: advertising revenue, pay-per-click schemes, and other techniques.
Relying on an explication and analysis of Wikipedia policy, this article explores links between (1) the policy structure of Wikipedia, (2) the Wikipedia community, and (3) how participation in the encyclopedia aids in users’ development of critical information literacies. These links help engender what
boyd (2014) calls “antibodies to help people not be deceived.” By illustrating how Wikipedia policies combat misinformation and disinformation, this article connects engagement with the Wikipedia community to a pedagogical practice, suggesting not only that we can learn from Wikipedia but also that Wikipedia’s successes can provide perspective on our current climate of problematic information.
Wikipedia Policy as Pedagogy
As the largest open educational resource in the world, Wikipedia is inherently pedagogical. Research on Wikipedia-based education has already demonstrated that academics can leverage the encyclopedia to teach toward outcomes related to information literacy, research, writing, and digital literacy, among others (
Cummings & DiLauro, 2017;
Garrison, 2015;
Konieczny, 2012;
Reilly, 2011;
Roth et al., 2013;
Vetter et al., 2019). Yet Wikipedia also serves the public beyond the classroom as readers and editors interact with its policies for information analysis and knowledge curation. In one way or another, everyone who comes into contact with Wikipedia is a student of its content, policies, and procedures.
In an examination of credibility in Wikipedia,
Ryan McGrady (2013) recognizes and explores the encyclopedia’s complex processes of ethos creation by attending to the rhetorical processes of “content-creation practices . . . that train new editors” (p. 120). McGrady argues that “the ethos of Wikipedia can be found in its community, and their system of rules that lead to the creation of content, rather than the content itself” (
McGrady, 2013, p. 121). His framework for examining this system of rules, which analyzes both the MediaWiki platform (the open-source software that runs Wikipedia) and Wikipedia’s “Five Pillars,” adapts and broadens
Ian Bogost’s (2008,
2010) theory of procedural rhetoric to frame rule-based processes in the encyclopedia as intrinsically persuasive. By focusing on policy and practice rather than on the accuracy of content itself, McGrady acknowledges how the encyclopedia’s ethos has developed to the point where we no longer need to question or prove its reliability. Indeed, as discussed previously, numerous studies have already shown the encyclopedia’s accuracy (
A. Brown, 2011;
Giles, 2005;
Hwang et al., 2014;
Kräenbring et al., 2014;
Taraborelli, 2012). While we do not see procedural rhetoric as a necessary framework for understanding Wikipedia’s capability to combat problematic information, we agree with McGrady on the fundamental argument that the Wikipedia community itself has created a series of policies that work toward credibility. Accordingly, this article strives to build on McGrady’s work by engaging in explication and analysis of Wikipedia policy as a pedagogy of information literacy. The Wikipedia community creates policies through socially mediated structures and processes, and these policies, in turn, shape encyclopedic content.
Identifying the policies of Wikipedia as a community affords a number of possibilities: not only the coded rules of the MediaWiki software and Wikipedia’s “Five Pillars” (as McGrady has examined), but also a number of other substantial rules more relevant to Wikipedia’s social practices. Our analysis, accordingly, extends the work of McGrady as misinformation, fake news, and other crises of authenticity become increasingly pervasive. In the following discussion and analysis, we trace the construction of credibility in the encyclopedia by examining (1) initial barriers to vandalism (spam edits, such as vulgar language), misinformation (adding incorrect or misleading information, often unsourced or from disreputable sources), and disinformation (intentionally misleading misinformation), including Wikipedia bots and auto-confirmation “rules”; (2) Wikipedia’s policy of verifiability, which governs editors’ selection of and engagement with secondary sources to create mainspace article content; (3) the policy of neutral point of view (NPOV), which influences the use of sources and encourages balanced coverage of topics; and (4) the policy of notability, which dictates how and when article content should be covered in the encyclopedia. As a whole, these systems reconstruct traditional models of authority (even ones that academia relies on) to retain Wikipedia’s credibility, even in the face of the current “fake news” crisis, by creating an open community of gatekeepers who enforce and police content in ways that rely on traditional authoritarian knowledge hierarchies and values. They also provide a framework for understanding how Wikipedia can act pedagogically in regard to information literacy.
Governance: How Wikipedia’s Rules Are Ruled
Wikipedia’s governance has been the subject of an increasing amount of research in information science, sociology, and computer science. As discussed previously, scholars have questioned whether Wikipedia (and wikis in general) is susceptible to oligarchic tendencies as well as challenged Wikipedia’s rhetoric of “participatory and open” (
Shaw & Hill, 2014) and the constant and dynamic evolution of the encyclopedia’s rules as both stabilizing and limiting in terms of governance (
Keegan & Fiesler, 2017). Our own analysis is optimistic in acknowledging Wikipedia as “an atypical organization, quite resilient to the Iron Law” (
Konieczny, 2009, p. 189). Following Konieczny, we acknowledge Wikipedia’s continued success as something of an anomaly when compared to other systems of mass peer production. And while the encyclopedia will continue to require constant support from a diverse set of volunteer communities, its policies, especially those that are more stable, provide an important framework for both shared governance and information processing.
Despite an increasing reliance on automation, Wikipedia is a fundamentally social project; its policies and procedures emerge from social, democratic operations of governance and administration, carried out by elected volunteers from the Wikipedia community. In one of the most extensive examinations to date,
Rijshouwer (2019) contends that Wikipedia demonstrates a process of “self-organizing bureaucratization” through three distinct features of the self-organizing community. First, the Wikipedia community is “transient,” in that its organizations are dynamic and respond to new needs and challenges as the encyclopedia increases in size and complexity. Second, increasing bureaucratization in the community is deployed to mediate conflicts between (a) political differences regarding “community members’ autonomy and self-organizing character” and (b) other members’ introduction of “formal structures to pragmatically meet the project’s challenges and objectives” (
Rijshouwer, 2019, p. 237). Essentially, bureaucratization mediates conflicts between conservative/pragmatic impulses, on the one hand, and anti-authoritarian impulses, on the other, doing double duty to assist in the self-organization of a volunteer system. Finally, the self-organizing principle of the Wikipedia community is rooted in the “inclination to meet the ideal to organize themselves and their work as democratically as possible,” as the community self-polices its democratic organization (
Rijshouwer, 2019, p. 237). All told, the social project of Wikipedia relies on the prevailing ideology of democratic and transparent peer production through consensus and conflict mediation.
Rijshouwer’s identification of these features—transience, conflict mediation, and democratic peer production—helps to explain Wikipedia’s self-organizing bureaucratization as a method for general oversight and governance. Such bureaucratization also informs the ongoing development and revision of policies and guidelines in the encyclopedia. These are not seen as stable rules but “principles” developed through a social process requiring “discussion and a high level of community-wide consensus” (
“Wikipedia:Administration,” 2019). In fact, and as will be discussed later, one of the “pillars” of Wikipedia is that “there are no firm rules.” Most of the content policies we will discuss “have been accepted as fundamental since Wikipedia’s inception” (
“Wikipedia:Policies and Guidelines,” 2019). These policies are established through a variety of methods, including reorganizing existing policies as well as proposing new policies, but protocols always emerge through “strong community support” and are “seldom established without precedent” (
“Wikipedia:Policies and Guidelines,” 2019).
Some of these policies are fairly simple and straightforward and can be policed by automated systems, helping to create more rigid barriers against misinformation. However, many of these policies remain complex enough that both they and their implementation require consistent renegotiation and interpretation by the community. Each of these content policies helps to frame how the encyclopedia “works” in a distributed, open, volunteer-driven space and demonstrates how Wikipedia protocols act to regulate and process information, providing a pedagogy that promotes information literacy.
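To make that distinction concrete, consider the auto-confirmation “rules” referenced above: whether an account is old enough, and has made enough edits, to change a semi-protected page is a check that requires no human judgment. The following sketch is a hypothetical Python illustration of such a mechanical barrier, not Wikipedia’s actual AbuseFilter or bot code; the thresholds mirror the commonly cited auto-confirmation requirements (roughly four days and ten edits) and are assumptions made for the example.

```python
# Minimal, hypothetical sketch of an automated barrier of the kind described
# above: a mechanical check that an account is "auto-confirmed" before it may
# edit a semi-protected page. Thresholds approximate the commonly cited
# auto-confirmation rule (account age and edit count); this is an illustration,
# not Wikipedia's actual AbuseFilter or bot code.

from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Account:
    registered: datetime
    edit_count: int


# Assumed thresholds for illustration.
MIN_AGE = timedelta(days=4)
MIN_EDITS = 10


def is_autoconfirmed(account: Account, now: datetime) -> bool:
    """A rule that can be enforced mechanically, with no human judgment."""
    return (now - account.registered) >= MIN_AGE and account.edit_count >= MIN_EDITS


def may_edit(account: Account, page_is_semiprotected: bool, now: datetime) -> bool:
    """Semi-protected pages require auto-confirmed accounts; others do not."""
    return is_autoconfirmed(account, now) or not page_is_semiprotected


if __name__ == "__main__":
    now = datetime(2020, 3, 1)
    newcomer = Account(registered=datetime(2020, 2, 29), edit_count=2)
    veteran = Account(registered=datetime(2015, 6, 1), edit_count=5400)
    print(may_edit(newcomer, page_is_semiprotected=True, now=now))  # False
    print(may_edit(veteran, page_is_semiprotected=True, now=now))   # True
```

Rules of this kind can be automated precisely because they need no interpretation; the content policies discussed next depend instead on community deliberation and consensus.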
Neutral Point of View (WP:NPOV)
Both the “no original research” and “verifiability” policies have their origins in NPOV. When secondary sources conflict, editors are encouraged to balance coverage by following NPOV, yet another policy that aids editors in validating and verifying information accuracy and controlling bias. One of the oldest policies in Wikipedia (appearing in 2001), NPOV attempts to provide balanced coverage of actual sources and, in doing so, potentially combats amplification and opinion biases. According to “Wikipedia:Neutral Point of View” (2020),
[a]ll encyclopedic content on Wikipedia must be written from a neutral point of view (NPOV), which means representing fairly, proportionately, and, as far as possible, without editorial bias, all of the significant views that have been published by reliable sources on a topic.
Furthermore, NPOV asserts that articles should explain opposing viewpoints rather than favor one or the other, and that such favoring can happen in both the structure and the content of an article. NPOV forwards an epistemology in which editors are requested to “describe disputes” rather than “engage” them. Finally, editors are expected to provide complete information from multiple reliable sources to best represent controversial subjects. The policy article on NPOV offers the following “principles” to help “achieve the level of neutrality that is appropriate for the encyclopedia”:
• Avoid stating opinions as facts.
• Avoid stating seriously contested assertions as facts.
• Avoid stating facts as opinions.
• Prefer nonjudgmental language.
The policy goes beyond content to also suggest how an article’s structure might be carefully safeguarded against biases:
[p]ay attention to headers, footnotes, or other formatting elements that might unduly favor one point of view, and watch out for structural or stylistic aspects that make it difficult for a reader to fairly and equally assess the credibility of all relevant and related viewpoints. (
“Wikipedia:Neutral Point of View,” 2020)
NPOV also requires appropriate weighting when citing ideas: an article should represent different aspects of a topic, but only insofar as those sides are weighted in a neutral manner. The part of the NPOV policy that deals with “due” or “undue” weighting provides careful guidelines on how to weight articles appropriately, warning that “Wikipedia policy does not state or imply that every minority view or extraordinary claim needs to be presented along with commonly accepted mainstream scholarship as if they were of equal validity” (
“Wikipedia:Neutral Point of View,” 2020). This policy helps to combat the “all sides are valid” claim that plagues many fringe political arguments built on spurious claims and beliefs. It also helps to properly weight articles such as “Climate Change” so that they accurately represent mainstream scholarship’s overwhelming consensus on the matter, while giving very little space to competing claims, as scholarship supporting those claims is few and far between.
The policy of NPOV is enacted in Wikipedia in numerous ways, such as pointing out problems on the article’s talk page or the editor’s user page, annotating the page with a “[POV]” tag, and filing a request for comment or a report on the NPOV noticeboard. However, neutrality is an ongoing conversation that relies on consensus; hence, it is not as cut and dried as verifiability. Editors are encouraged to use the noticeboard to bring in other editors to discuss the neutrality of an article, in hopes of finding a balance in both language and representation. Editors are also encouraged to discuss their disputes over the neutrality of an article rather than simply reverting content, and to document disputes over controversial subjects (
“Wikipedia:NPOV Dispute,” 2020). Instead of taking sides in the argument, editors are encouraged to document the different sides (balanced with sources, of course). Ultimately, NPOV helps to generate discussion around facts and representation, which helps ensure that information remains accurate and representative of what is available to summarize. Furthermore, in the enactment of NPOV policy toward a public information literacy pedagogy, Wikipedia also encourages and facilitates critical discussion of information neutrality.
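Because the “[POV]” tag mentioned above is an ordinary, publicly visible template, neutrality disputes are transparent to any reader, or to any program. The short sketch below is offered as an illustration rather than as official tooling: it asks the public MediaWiki Action API which mainspace articles currently transclude the POV template on English Wikipedia. The endpoint and query parameters are standard API features; the template name follows the tag discussed above.

```python
# Illustrative query: list a few English Wikipedia articles currently carrying
# the neutrality-dispute tag by asking the public MediaWiki Action API which
# pages transclude the POV template. This is an example, not official tooling.

import requests

API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "list": "embeddedin",        # pages that transclude a given page
    "eititle": "Template:POV",   # the neutrality-dispute maintenance tag
    "einamespace": 0,            # restrict results to mainspace articles
    "eilimit": 10,
    "format": "json",
}

response = requests.get(API, params=params, timeout=10)
response.raise_for_status()

for page in response.json()["query"]["embeddedin"]:
    print(page["title"])
```

In other words, a neutrality dispute is not hidden moderation metadata: anyone can see which articles are currently contested and follow the associated talk page discussion.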
Conclusion
Researchers have already argued that Wikipedia community-driven practices can be leveraged for educational purposes—especially in terms of critical media literacy (
Cummings, 2009;
Jiang & Vetter, 2019;
Nelson, 2018;
Vetter, 2018;
Vetter et al., 2019). What we suggest here is that community-driven policies, and the procedures that follow from the enactment of those policies, need to be better understood and explored beyond traditional models of education. Recognizing these policies as the construction of a particular credibility, an ethos, should prompt academic researchers and teachers, as well as stakeholders in public policy and information literacy, to seek out methods and outlets for expanding and promoting Wikipedia literacy and participation.
In particular, by analyzing Wikipedia’s policies and how they value particular types of information and language, we have illustrated that one of the core strengths of the Wikipedia community is a strict adherence to traditional information value hierarchies as found in academia. Independent, verifiable sources are key to Wikipedia: peer-reviewed journals take precedence, followed by academic book publishers and then high-quality national and international journalism. Information comes from multiple sources, and in Wikipedia it is seen as a representation of the “conversation” of what is out there, much like an academic literature review.
As information has become more easily accessible, so has the ability to appear credible. This is not the first time we have fought disinformation, nor will it be the last (
Marche, 2018). For better or worse, Wikipedia has taken up the mantle of arbiter of truth and knowledge, and, for the most part, has done an excellent job of it—so much so that other platforms, especially personal assistants such as Alexa and Siri, use Wikipedia’s information to police and contribute data to their own systems (
Bariso, 2018;
Dickey, 2019;
Farokhmanesh, 2018;
McCracken, 2018;
Simonite, 2019).
In light of Wikipedia’s communicative and community-driven resilience, we recommend three ways in which we can learn from Wikipedia when addressing other information platforms. First, by recognizing these community-mediated policies, we see an opportunity to contrast Wikipedia’s participatory social platform with the more commercialized digital media sites from which users seek information. Problems related to disinformation and the fake news crisis are exacerbated in mainstream digital media sites (e.g., Facebook) by their inherent commercialization. The incentivization of disinformation is, in many ways, tied to systems of advertising in which revenue from click-throughs and pageviews serves as a dominant motive for content creation. Unlike other digital media sites, Wikipedia does not use cookies to track, collect data on, or predict the behavior of its users. This is a radical departure from nearly
every other space on the internet. Perhaps this is because Wikipedia has remained a veritable dinosaur while most spaces on the internet have moved away from the logics of Web 2.0, away from a participatory web and toward a predictive web based on advertising revenue. For other digital media platforms, however, the war with disinformation is more difficult, as their business models rely on user data to sell advertising and services. Platforms wishing to combat this type of disinformation have constructed a system in which they are perpetually in an adversarial position against those seeking to profit from advertising revenue, locked into an ongoing (and ultimately losing) game. In short, those who wish to exploit the system will always be one step ahead, as there will always be new ways to generate revenue in this manner. However, policies and procedures enacted by large-scale platforms are a first step toward disincentivizing particular types of fake news and disinformation, as we have seen recently with Facebook’s policies around anti-vaccination groups and white nationalism (
Ingber, 2019;
“Standing Against Hate,” 2019).
In short, it would behoove platforms to tread carefully when it comes to utilizing and selling data, as well as to build more robust policies that disincentivize fake news and disinformation. Platforms such as Facebook have paid much lip service to the security and ownership of user data but continue to allow sponsored spam advertisements and problematic paid and targeted content. We understand that the current economic model of the internet makes this tricky, but polluting users’ experiences with large quantities of disinformation does more than just spread it around: it actively discourages users from thinking of the platform as a community.
Second, Wikipedia assists in learning and experiencing information literacy in more effective and non-exploitative ways, and this stems from the design of the platform, its policies, and the community’s dedication. The question is not whether students, young people, and everyone else are using Wikipedia, but whether and how people trust that information, and how they make decisions about the information they encounter using these frameworks. It is clear that Wikipedia’s community is on to something with its commons-based peer production method of information production, as it reconfigures authoritarian knowledge structures while doubling down on many “traditional” sources of knowledge. Wikipedia illustrates that (1) returning to valuing traditional knowledge hierarchies can be incredibly helpful to “make sense” of information, (2) policies and practices are part of the “fake news” solution, but issues will always be baked into a system that monetizes and incentivizes clicks and advertising, and (3) these issues will always be problematic in controlled digital media platforms that are not community driven.
Wikipedia’s socially mediated policies and procedures help to reconstruct more traditional models of authority to uphold the credibility of the encyclopedia and protect against problematic information (
Jack, 2017). These policies and procedures also provide pedagogical opportunities for those interacting with the encyclopedia beyond higher education institutions. While user-editors may engage more with Wikipedia policy as pedagogy, and thus learn more about the complex process of information validation in networked environments, casual visitors and readers can also benefit from being exposed to information vetting in the encyclopedia.
Platforms wishing to learn from Wikipedia should take note of Wikipedia’s commitment to secondary, independent, and reliable information as a basis for inclusion in the encyclopedia. Acknowledging that not all information is equally reliable illustrates the need for a knowledge hierarchy, and such a hierarchy should be apparent in other platforms’ designs. Whether this comes in the form of simply tagging information as “news” or “opinion,” or goes so far as greylisting and blacklisting certain sites, making visible that a knowledge hierarchy exists, especially when it comes to disinformation and misinformation, can help counter platform designs that encourage the belief that all information is equal.
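As a rough sketch of what such a visible hierarchy might look like in a platform’s design, the hypothetical snippet below maps source domains to labels that a reader would see alongside a shared link. The tiers, labels, and example domains are invented for illustration and do not reflect Wikipedia’s lists or any platform’s actual implementation.

```python
# Hypothetical sketch of surfacing a knowledge hierarchy in a platform's design:
# map a link's source domain to a visible label instead of presenting all links
# as equal. Tiers, labels, and example domains are invented for illustration.

from enum import Enum


class SourceTier(Enum):
    PEER_REVIEWED = "peer-reviewed research"
    QUALITY_JOURNALISM = "news reporting"
    OPINION = "opinion / commentary"
    UNCLASSIFIED = "unclassified source"


# Invented example mapping; a real platform would maintain and publish
# community-reviewed lists rather than hard-coding them.
DOMAIN_TIERS = {
    "example-journal.org": SourceTier.PEER_REVIEWED,
    "example-newspaper.com": SourceTier.QUALITY_JOURNALISM,
    "example-blog.net": SourceTier.OPINION,
}


def label_for(domain: str) -> str:
    """Return the label a platform might display next to a shared link."""
    tier = DOMAIN_TIERS.get(domain, SourceTier.UNCLASSIFIED)
    return f"[{tier.value}]"


if __name__ == "__main__":
    for domain in ("example-journal.org", "example-blog.net", "unknown-site.example"):
        print(domain, label_for(domain))
```

Even a coarse scheme like this makes the existence of a hierarchy legible to users, which is the point of the recommendation above.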
Third, Wikipedia’s functioning as a community of practice, in contrast to other digital media platforms, highlights how Web 2.0 has all but disappeared in its most traditional sense. Digital spaces that were traditionally crowdsourced have been largely overtaken by commercial enterprises and platforms. For example, the YouTube “bargain” of contributing to a community in exchange for some advertising revenue has been replaced by the “new bargain” (
Soha & McDowell, 2016), which expects different types of contributions and uses of data. As internet and technology giants such as Facebook, Google, and Apple continue to accumulate and consolidate power over more of the web, this trend will only worsen. In contrast to Web 2.0, which emphasized “user-generated content, ease of use, participatory culture and interoperability
for end user” (
“Web 2.0,” 2019, emphasis ours), the internet after Web 2.0 (which has been variously termed the Semantic Web, Web 3.0, and Web 4.0) commodifies and capitalizes on users as data rather than as contributors or participants.
Wikipedia remains an outlier in these popular digital media platforms, of course, as it is specifically an encyclopedia project and not a networking site, search engine, or other type of commercialized platform. In short, Wikipedia has a stated community purpose, while the others are shells to host a variety of information. That being said, Wikipedia is a community, and like all communities, it has rules, expectations, and norms. So, while comparing Wikipedia and Facebook or Google is not apples to apples (or even apples to oranges), they are all extremely popular spaces where people get information and that information is, to some extent, curated and annotated by the community.
Platforms wishing to learn from Wikipedia here must take note of Wikipedia’s investment in its community, as well as that of others in the past (e.g., YouTube before the “new bargain”). The participatory web was successful in many ways and engendered feelings of community among participants. Wikipedians rally around their space as a massive group project, but others have been successful in building spaces for a variety of projects and voices. Communities find ways to self-police when they care about the space they occupy and build community, reducing the overall policing load on the platform, but only once the community feels it has a space to defend.
All of these lessons dovetail into the final recommendation: platforms cannot simply police themselves and will always fail if their communities do not care about the space. From this, we can see how a platform’s selling of data incentivizes the spread of disinformation through targeting, creating a space that cannot be trusted, and how designing an information space to respect knowledge hierarchies helps ensure that communities experience information in a way that prioritizes investigative journalism and peer-reviewed science and de-prioritizes (or at least clearly distinguishes) partisan opinion pieces and other potentially problematic information. These and other related policies and practices can help to foster community spaces that prioritize, share, and celebrate good information, as well as recognize when misinformation and disinformation sneak in.
Through examining Wikipedia policies related to bots and auto-confirmation “rules,” verifiability, NPOV, and notability, we can see not only how Wikipedia helps to combat mis/disinformation but also how these policies support Wikipedia’s role as an open educational resource. Furthermore, as students both inside and outside the classroom interact with Wikipedia, we can understand how such interaction teaches strategies for identifying and combating problematic information. That being said, the English Wikipedia is accessed by an average of 858 million unique devices per month, while only about 66,000 editors are considered active (5 or more edits in a given month), roughly 0.008% of that audience (
“Wikipedia:Statistics,” 2019). Considering only a very small percentage of users are active editors on Wikipedia, both academics and Wikipedia community members need to continue to encourage more participation in the encyclopedia.
Ultimately, this acknowledgment of the changing web prompts us to encourage and explore more participatory digital practices that wiki platforms and other forms of media afford while attending to Wikipedia’s community-driven practices. Furthermore, the recommendations offered here open new avenues for learning from and with Wikipedia about how we might better investigate, research, and maintain a critical lens on the larger issues in representations of “truth”—especially in the current crisis of fake news and disinformation.