Iron Law of Bureaucracy: the downwards deletionism spiral discourages contribution and is how Wikipedia will die.
English Wikipedia is in decline. As a long-time editor & former admin, I was deeply dismayed by the process. Here, I discuss UI principles, changes in Wikipedian culture, the large-scale statistical evidence of decline, run small-scale experiments demonstrating the harm, and conclude with parting thoughts.
Wikipedia is declining, fundamentally, because of its increasingly narrow attitude as to what topics are acceptable and to what depth they can be explored, combined with a narrowing attitude as to what sources are acceptable, where academic & media coverage trumps any other consideration. This discourages contributors—the prerequisite for any content whatsoever—and cuts off growth; perversely, the lack of contributors becomes its own excuse for discouraging further contribution (since who will maintain it?), a self-fulfilling norm (we focus on quality over quantity here!), and drives away those with dissenting views, since, unsurprisingly, those who advocate more content also tend to contribute content and to be driven away when their content is deleted. One bad editor can destroy in seconds what took many years to create. The inclusionists founded Wikipedia, but the deletionists froze it.
I started as an anon, making occasional small edits after I learned of WP from Slashdot in 2004. I happened to be a contributor to Everything2 at the time, and when one of my more encyclopedic articles was rejected, I decided it might as well go on Wikipedia, so I registered an account in 2005 and slowly got more serious about editing as I became more comfortable with WP and excited about its potential. Before I wound down my editing activity, dismayed by the cultural changes, I had done scores of articles & scores of thousands of edits. And old Wikipedia was exciting.
You can see this stark difference between old Wikipedia and modern Wikipedia: in the early days you could have things like articles on each chapter of Atlas Shrugged or each Pokemon. Even if you personally did not like Objectivism or Pokemon, you knew that you could go into just as much detail about the topics you liked best—Wikipedia was not paper! We talked idealistically about how Wikipedia could become an encyclopedia of specialist encyclopedias, the superset of encyclopedias. “would you expect to see a Bulbasaur article in a Pokemon encyclopedia? yes? then let’s have a Bulbasaur article”. The potential was that Wikipedia would be the summary of the Internet and books/media. Instead of punching in a keyword to a search engine and getting 100 pages dealing with tiny fragments of the topic (in however much detail), you would get a coherent overview summarizing everything worth knowing about the topic, for almost all topics.
But now Wikipedia’s narrowing focus means, only some of what is worth knowing, about some topics. Respectable topics. Mainstream topics. Unimpeachably Encyclopedic topics.
These days, that ideal is completely gone. If you try to write niche articles on certain topics, people will tell you to save it for Wikia. I am not excited or interested in such a parochial project which excludes so many of my interests, and which does not want me to go into great depth even about the interests it deems meritorious—and a great many other people are not excited either, especially as they begin to realize that even if you navigate the culture correctly and get your material into Wikipedia, there is no guarantee that your contributions will be respected, improved, and not deleted. As for the amateurs and experts who wrote Wikipedia: why would they want to contribute to a place that doesn’t want them?
The Wikimedia Foundation (WMF) seems unable to address this issue. I read their plans and projections, and I predicted well in advance that they would totally fail, as they have. Their ‘solutions’ were band-aids which didn’t get at what I or others were diagnosing as the underlying problems. The “barriers to entry” like the complex markup are not the true issue. They are problems, certainly, but not the core problem—if they were resolved, Wikipedia’s decline would continue. The WMF seems to think that a little more lipstick on the pig will fix everything. Barriers to entry are a problem for non-technical new users, yes, but they do not explain why technical new users are also not appearing. Where are all the young programmers? They can easily learn the markup and handle the other barriers—if those barriers were the only barriers, Wikipedia should be having no problems. Plenty of potential editors in that sea. But if you go to programmer hangouts like Hacker News, you’re not going to find everyone going “I don’t know what people are complaining about, editing Wikipedia works just great for me!”, because they’re just as embittered and jaded as the other groups.
What is to be done? Hard to say. Wikipedia has already exiled hundreds of subject-area communities to Wikia, and I’d say the narrowing began in 2007, so there’s been a good 6 years of inertia and time for the rot to set in. And I haven’t thought much about it because too many people deny that there is any problem, and when they admit there is a problem, they focus on trivial issues like the MediaWiki markup. Nothing I can do about it, anyway. Once the problem has been diagnosed, time to move on to other activities.
Wikipedia will still exist. The corpus is too huge and valuable to rot easily. A system can decline without dying. MySpace still exists, and there is no reason Wikipedia cannot be MySpace—useful for some purposes, a shell of its former glory, a major breakthrough in its time, but fundamentally bypassed by other sources of information. I don’t know what the Facebook to Wikipedia’s MySpace is, but the Internet survived for decades without Wikipedia, we’ll get along without a live Wikipedia. Even though it is a huge loss of potential.
Friction
A perennial lure of technology is its promise to let us do things that we couldn’t do before, and in ways we wouldn’t before.
An example here would be Wikipedia and wikis in general: by lowering the ‘cost’ of changing a page, and by using software that makes undoing most vandalism far easier than doing it, they send participation through the roof. It’s not the technology itself that really matters, but how easy and comfortable it is to contribute. Benjamin Mako Hill has been investigating why Wikipedia, out of 8 comparable attempts to write an online encyclopedia, succeeded; his conclusion seems to be that Wikipedia succeeded by focusing on developing content and making contribution easy. From “The contribution conundrum: Why did Wikipedia succeed while other encyclopedias failed?”:
One answer, which seems obvious only in retrospect: Wikipedia attracted contributors because it was built around a familiar product—the encyclopedia. Encyclopedias aren’t just artifacts; they’re also epistemic frames. They employ a particular—and, yet, universal—approach to organizing information. Prior to Wikipedia, online encyclopedias tried to do what we tend to think is a good thing when it comes to the web: challenging old metaphors, exploding analog traditions, inventing entirely new forms…Another intriguing finding: Wikipedia focused on substantive content development instead of technology. Wikipedia was the only project in the entire sample, Hill noted, that didn’t build its own technology. (It was, in fact, generally seen as technologically unsophisticated by other encyclopedias’ founders, who saw themselves more as technologists than as content providers.) GNUpedia, for example, had several people dedicated to building its infrastructure, but none devoted to building its articles. It was all very if you build it, they will come…There are two other key contributors to Wikipedia’s success with attracting contributors, Hill’s research suggests: Wikipedia offered low transaction costs to participation, and it de-emphasized the social ownership of content. Editing Wikipedia is easy, and instant, and virtually commitment-free. “You can come along and do a drive-by edit and never make a contribution again,” Hill pointed out. And the fact that it’s difficult to tell who wrote an article, or who edited it—rather than discouraging contribution, as you might assume—actually encouraged contributions, Hill found. “Low textual ownership resulted in more collaboration,” he put it. And that could well be because Wikipedia’s authorless structure lowers the pressure some might feel to contribute something stellar. The pull of reputation can discourage contributions even as it can also encourage them. So Wikipedia “took advantage of marginal contributions,” Hill noted—a sentence here, a graf there—which, added up, turned into articles. Which, added up, turned into an encyclopedia.
I’ve often thought that if ‘barriers to entry’ were charted against ‘contributed effort’, one would see an inverse exponential relation. An entire essay could likely be written on how the Wikipedia community’s accumulation of small barriers—each individually reasonable, and not too onerous even in the aggregate—of referencing requirements, of banning anonymous page creation, etc., led to the first sustained drop in contributors and contribution. The effect is nonlinear.
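As a toy illustration (my own, not a figure from any of the studies quoted below): if each of n small barriers independently deters a fixed fraction d of would-be contributors, the surviving contribution decays exponentially in n.

```latex
% Toy model: contribution remaining after n barriers, each deterring a fraction d of contributors.
C(n) \approx C_0 \, (1 - d)^n
```

With the 30% per-step dropoff quoted below, seven barriers already cut contribution by more than tenfold (0.7^7 ≈ 0.08).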
New Regimes
The best rule of thumb here is perhaps the one cited by Stewart Brand in The Clock of the Long Now:
According to a rule of thumb among engineers, any tenfold quantitative change is a qualitative change1, a fundamentally new situation rather than a simple extrapolation.
Clear as mud, eh? Let’s try more quotes, then:
The human longing for freedom of information is a terrible and wonderful thing. It delineates a pivotal difference between mental emancipation and slavery. It has launched protests, rebellions, and revolutions. Thousands have devoted their lives to it, thousands of others have even died for it. And it can be stopped dead in its tracks by requiring people to search for “how to set up proxy” before viewing their anti-government website.
I was reminded of this recently by Eliezer’s Less Wrong Progress Report. He mentioned how surprised he was that so many people were posting so much stuff on Less Wrong, when very few people had ever taken advantage of Overcoming Bias’ policy of accepting contributions if you emailed them to a moderator and the moderator approved. Apparently all us folk brimming with ideas for posts didn’t want to deal with the aggravation.2
We examine open access articles from three journals at the University of Georgia School of Law and confirm that legal scholarship freely available via open access improves an article’s research impact. Open access legal scholarship—which today appears to account for almost half of the output of law faculties—can expect to receive 50% more citations than non-open access writings of similar age from the same venue.34
There are tools to just say, “Give me your social security number, give me your address and your mother’s maiden name, and we send you a physical piece of paper and you sign it and send it back to us.” By the time that’s all accomplished, you are a very safe user. But by then you are also not an user, because for every step you have to take, the dropoff rate is probably 30%. If you take ten steps, and each time you lose one-third of the users, you’ll have no users by the time you’re done with the fourth step.5
For example, usability theory holds that if you make a task 10% easier, you double the number of people that can accomplish it. I’ve always felt that if you can make it 10% easier to fill in a bug report, you’ll get twice as many bug reports. (When I removed two questions from the Joel On Software signup page, the rate of new signups went up dramatically).6
Think of these barriers as an obstacle course that people have to run before you can count them as your customers. If you start out with a field of 1000 runners, about half of them will trip on the tires; half of the survivors won’t be strong enough to jump the wall; half of those survivors will fall off the rope ladder into the mud, and so on, until only 1 or 2 people actually overcome all the hurdles. With 8 or 9 barriers, everybody will have one non-negotiable deal killer…By incessant pounding on eliminating barriers, [Microsoft] slowly pried some market share away from Lotus.7
The vast majority of raters were previously only readers of Wikipedia. Of the registered users that rated an article, 66% had no prior editing activity. For these registered users, rating an article represents their first participatory activity on Wikipedia. These initial results show that we are starting to engage these users beyond just passive reading, and they seem to like it…Once users have successfully submitted a rating, a randomly selected subset of them are shown an invitation to edit the page. Of the users that were invited to edit, 17% attempted to edit the page. 15% of those ended up successfully completing an edit. These results strongly suggest that a feedback tool could successfully convert passive readers into active contributors of Wikipedia. A rich text editor could make this path to editing even more promising.8
Toeing the Precipice
It may take only a few restrictions before one has inched far enough along the ‘barriers’ axis that ‘contributions’ do in fact fall tenfold. One sees Wikipedia slowly adding restrictions:
- 2005: we ban anonymous page creation;
- 2006: anonymous users must solve CAPTCHAs if they wish to add URLs;
- 2007: use of {{fact}} templates institutionalized, and tougher referencing guidelines;
- 2008: harsher AfDs mean a banner year for deletionists such as User:TTN;
- 2009: flagged revisions on some wikis, in some areas of English Wikipedia. The end of live changes, prior restraint on publication;
- 201?: new editors banned from article creation (for their own good, of course), with additional measures like ‘pending changes’.
As predicted, this change was eventually pushed through in 2017, supposedly as a trial, and made permanent in 2018.
Each of these steps seems harmless enough, perhaps, because we can’t see the things which do not happen as a result (this is a version of Frédéric Bastiat’s fallacy of the invisible). The legalistic motto “that which is not explicitly permitted is forbidden” has the virtue of being easy to apply, at least.
Few objected to the banning of anonymous page creation by Jimbo Wales during the Seigenthaler incident (we had to destroy the wiki to save it), and most of those objections were unprincipled ones. The objector was all for a tougher War on Drugs—er, I mean Terror, or was that Vandalism? (maybe Poverty)—but they didn’t want to be stampeded into it by some bad PR. Too, few objected to CAPTCHAs: ‘take that, you scumbag spammers!’ The ironic thing is, as a fraction of edits, vandalism shrank from 2003–2008 (remaining roughly similar since), and similarly, users specializing in vandal-fighting and their workload of edits have shrunk; graphing new contributions by size, one finds that for both registered and anonymous users the apogee was 2007, and vandalism has been decreasing ever since. (A more ambiguous statistic is the reduced number of actions by new page patrollers.)
Falling
Who alive can say,
“Thou art no Poet—may’st not tell thy dreams?”
Since every man whose soul is not a clod
Hath visions, and would speak, if he had loved,
And been well nurtured in his mother tongue.
But by 2007 the water had become hot enough to be felt by devotees of modern fiction (that is, anime & manga franchises, video games, novels, etc.), and even the great Jimbo could not expect to see his articles go un-AfD’d.
But who really cares about what some nerds like? What matters is Notability with a capital N, and the fact that our feelings were hurt by some Wikigroaning! After all, clearly the proper way to respond to the observation that Lightsaber combat was longer than Sabre is to delete its contents and have people read the short, scrawny—but serious!—Lightsaber article instead.
If it doesn’t appear in Encarta or Encyclopedia Britannica, or isn’t treated at the same (proportional) length, then it must go!
By the Numbers
Imagine a world in which every single person on the planet is given free access to the sum of all human knowledge. That’s what we’re doing.
Jimmy Wales, 2004
…inclusionism generally is toxic. It lets a huge volume of garbage pile up. Deletionism just takes out the trash. We did it with damn Pokemon, and we’ll eventually do it with junk football ‘biographies’, with ‘football’ in the sense of American and otherwise. We’ll sooner or later get it done with ‘populated places’ and the like too.
Todd Allen, 2019-07-05 (WP editor 2004–, admin 2007–, 2014–2016)
Deleting articles based on notability (fiction articles in particular) doesn’t merely ill-serve our readers (who are numerous; note how many of Wikipedia’s most popular pages are fiction-related, both now and in 2007 or 2011, or how many Internet searches lead to Wikipedia for cultural content9), but it also damages the community.
We can see it indirectly in the global statistics. The analyses (2007, 2008) show it. We are seeing fewer new editors, fewer new articles, fewer new images; less of everything, except tedium & bureaucracy.
Worse, it is not merely that Wikipedia’s growth has stopped accelerating on important metrics: the rate of increase has in some cases started dropping outright!
“…the size of the active editing community of the English Wikipedia peaked in early 2007 and has declined somewhat since then. Like Wikipedia’s article count, the number of active editors grew exponentially during the early years of the project. The article creation rate (which is tracked at Wikipedia:Size of Wikipedia) peaked around August 2006 at about 2400 net new articles per day and has fallen since then, to around under 1400 in recent months. [The graph is mirrored at Andrew Lih’s “Wikipedia Plateau?”.]
…User:MBisanz has charted the number of new accounts registered per month, which tells a very similar story: March 2007 recorded the largest number of new accounts, and the rate of new account creation has fallen significantly since then. Declines in activity have also been noted, and fretted about, at Wikipedia:Requests for adminship…”
This has been noted in multiple sources, such as Felipe Ortega’s 2009 thesis, “Wikipedia: A Quantitative Analysis”:
So far, our empirical analysis of the top ten Wikipedias has revealed that the stabilization of the number of contributions from logged authors in Wikipedia during 2007 has influenced the evolution of the project, breaking down the steady growing rate of previous years…
Unfortunately, this results raise several important concerns for the Wikipedia project. Though we do not have empirical data from 2008, the change in the trend of births and deaths [new & inactive editors] will clearly decrease the number of available logged authors in all language versions, thus cutting out the capacity of the project to effectively undertake revisions and improve contents. Even more serious is the slightly decreasing trend that is starting to appear in the monthly number of births of most versions. The rate of deaths, on the contrary, does not seem to leave its ascending tendency. Evaluating the results for 2008 will be a key aspect to validate the hypothesis that this trend has changed indeed, and that the Wikipedia project needs to put in practice more aggressive measures to attract new users, if they do not want to see the monthly effort decrease in due course, as a result of the lack of human authors.10
Ortega notes indications that this is a pathology unique to En:
“In the first place, we note the remarkable difference between the English and the German language versions. The first one presents one of the worst survival curves in this series, along with the Portuguese Wikipedia, whereas the German version shows the best results until approximately 800 days. From that point on, the Japanese language version is the best one. In fact, the German, French, Japanese and Polish Wikipedias exhibits some of the best survival curves in the set, and only the English version clearly deviates from this general trend. The most probable explanation for this difference, taking into account that we are considering only logged authors in this analysis, is that the English Wikipedia receives contributions from too many casual users, who never come back again after performing just a few revisions.”11
Erik Moeller of the WMF tried to wave away the results in November 2009 by pointing out that “The number of people writing Wikipedia peaked about two and a half years ago, declined slightly for a brief period, and has remained stable since then”, but he also shoots himself in the foot by pointing out that the number of articles keeps growing. That is not a sustainable disparity. Worse, as the original writers leave, their articles become legacy code—on which later editors must engage in archaeology, trying to retrieve the original references or understand why something was omitted, or must simply remove content because they do not understand the larger context or are ignorant. (I have had considerable difficulty answering some straightforward questions about errors in articles I researched and wrote entirely on my own; how well could a later editor have handled the questions?)
The numbers have been depressing ever since, from the 2010 informal & Foundation study12 on editor demographics to 2011 article contributions; the WSJ’s statistician Carl Bialik wrote in September 2011 that “the number of editors is dwindling. Just 35,844 registered editors made five or more edits in June, down 34% from the March 2007 peak. Just a small share of Wikipedia editors—about 3%—account for 85% of the site’s activity, a potential problem, since participation by these heavy users has fallen even more sharply.”
Only in 2010 and 2011 has the Foundation seemed to wake up and see what the numbers were saying all along; while Wales says some of the right things like “A lot of editorial guidelines…are impenetrable to new users”, he also back-handedly dismisses it—“We are not replenishing our ranks. It is not a crisis, but I consider it to be important.” By December 2011, Sue Gardner seems to reflect a more realistic view in the WMF, calling it the “holy-shit slide”; I think she is worth quoting at length to emphasize the issue. From the 2011-12-19 “The Gardner interview”:
Much of the interview concerned the issues she raised in a landmark address in November to the board of Wikimedia UK, in which she said the slide showing a graph of declining editor retention (below) is what the Foundation calls “the holy-shit slide”. This is a huge, “really really bad” problem, she told Wikimedia UK, and is worst on the English and German Wikipedias.
A prominent issue on the English Wikipedia is whether attempts to achieve high quality in articles—and perceptions that this is entangled with unfriendly treatment of newbies by the community—are associated with low rates of attracting and retaining new editors. Although Gardner believes that high quality and attracting new editors are both critical goals, her view is that quality has not been the problem, although she didn’t define exactly what article quality is. What we didn’t know in 2007, she said, was that “quality was doing fine, whereas participation was in serious trouble. The English Wikipedia was at the tail end of a significant drop in the retention of new editors: people were giving up the editing process more quickly than ever before.”
Participation matters because it drives quality. People come and go naturally, and that means we need to continually bring in and successfully orient new people. If we don’t, the community will shrink over time and quality will suffer. That’s why participation is our top priority right now.
…Deletions and reversions might be distasteful to new editors, but how can we, for instance, maintain strict standards about biographies of living people (BLP) without reverting problematic edits and deleting inappropriate articles? Gardner rejected the premise:
I don’t believe that quality and openness are inherently opposed to each other. Openness is what enables and motivates people to show up in the first place. It also means we’ll get some bad faith contributors and some who don’t have the basic competence to contribute well. But that’s a reasonable price to pay for the overall effectiveness of an open system, and it doesn’t invalidate the basic premise of Wikipedia: that openness will lead to quality.
…While staking the Foundation’s claim to the more technical side of the equation, Gardner doesn’t shrink from providing advice on how we can fix the cultural problem.
If you look at new editors’ talk pages, they can be pretty depressing—they’re often an uninterrupted stream of warnings and criticisms. Experienced editors put those warnings there because they want to make Wikipedia better: their intent is good. But the overall effect, we know, is that the new editors get discouraged. They feel like they’re making mistakes, that they’re getting in trouble, people don’t want their help. And so they leave, and who can blame them? We can mitigate some of that by toning down the intimidation factor of the warnings: making them simpler and friendlier. We can also help by adding some praise and thanks into the mix. When the Foundation surveys current editors, they tell us one of the things they enjoy most about editing Wikipedia is when someone they respect tells them they’re doing a good job. Praise and thanks are powerful.
…[Around the time of the Seigenthaler and Essjay controversies] Jimmy went to Wikimedia and said “quality … we need to do better”, [and through the distortions of the ripple-effect in the projects] there was this moral panic created around quality … what Jimmy said gave a whole lot of people the license to be jerks. … Folks are playing Wikipedia like it’s a video game and their job is to kill vandals … every now and again a nun or a tourist wanders in front of the AK47 and gets murdered …
Many people have complained that Wikipedia patrollers and administrators have become insular and taken on a bunker mentality, driving new contributors away. Do you agree, and if so, how can this attitude be combated without alienating the current core contributors?
I wouldn’t characterize it as bunker mentality at all. It’s just a system that’s currently optimized for combating bad edits, while being insufficiently concerned with the well-being of new editors who are, in good faith, trying to help the projects. That’s understandable, because it’s a lot easier to optimize for one thing (no bad edit should survive for very long) than for many things (good edits should be preserved and built upon, new editors should be welcomed and coached, etc.). So I don’t think it’s an attitudinal problem, but more an issue of focusing energy now on re-balancing to ensure our processes for patrolling edits, deleting content, etc. are also designed to be encouraging and supportive of new people.
How can a culture that has a heavy status quo bias be changed? How can the community be persuaded to become less risk-averse?
My hope is that the community will become less risk-averse as the Foundation makes successful, useful interventions. I believe the Vector usability improvements are generally seen as successful, although they of course haven’t gone far enough yet. Wikilove is a small feature, but it’s been adopted by 13 Wikipedia language-versions, plus Commons. The article feedback tool is on the English Wikipedia and is currently being used in seven other projects. The new-editor feedback dashboard is live on the English and Dutch Wikipedias. New warning templates are being tested on the English and Portuguese Wikipedias. And the first opt-in user-facing prototype of the visual editor will be available within a few weeks. My hope is all this will create a virtuous circle: support for openness will begin to increase openness, which will begin to increase new editor retention, which will begin to relieve the workload of experienced editors, which will enable everyone to relax a little and allow for more experimentation and playfulness.
Regaining our sense of openness will be hard work: it flies in the face of some of our strongest and least healthy instincts as human beings. People find it difficult to assume good faith and to devolve power. We naturally put up walls and our brains fall into us-versus-them patterns. That’s normal. But we need to resist it. The Wikimedia projects are a triumph of human achievement, and they’re built on a belief that human beings are generally well-intentioned and want to help. We need to remember that and to behave consistently with it.
I am skeptical that Gardner’s initiatives will change the curves (although they are not bad ideas); my general belief is that deleting pages, and the omnipresent threat of deletion, are far more harmful than complex markup. (I should note that Gardner has read and praised this essay, but also that much of this essay is based on my feelings and may not generalize.)
Regardless of whether the WMF really understands the issue, it is almost unintentionally hilarious to look at the proposed solutions—for example, one amounts to restoring early Wikipedia culture & practices in private sandboxes, protected from the regulars & their guidelines! Band-aids like Wikilove or article rating buttons are not getting at the core of the problem; a community does not live on high-quality rating tools (Everything2) or die on poor ones (YouTube). The Foundation/developers sometimes do the right thing, like striking down an English Wikipedia ‘consensus’ to restrict article creation even further, but will it be enough? To quote Carl Bialik again:
Adding more editors “is one of our top priorities for the year,” says Howie Fung, senior product manager for the Wikimedia Foundation, which aims to increase the number of editors across all languages of Wikipedia to 95,000 from 81,450 by June of next year.
The subsequent research has in some respects vindicated my views: some have tried to argue that the declines are due to picking all the low-hanging fruit, whether in articles or in available editors, or that lower-quality new editors merited the additional procedures. But what we see is not that new editors are worse or lower-quality; they are as high-quality and useful as they have been since 2006. Nor is the decline due to a dwindling supply of new editors combined with better procedures for winnowing them out. From “Kids these days: the quality of new Wikipedia editors over time” (“Research:Newcomer quality”):
What we found was encouraging: the quality of new editors has not substantially changed since 2006. Moreover, both in the early days of Wikipedia and now, the majority of new editors are not out to obviously harm the encyclopedia (~80%), and many of them are leaving valuable contributions to the project in their first editing session (~40%). However, the rate of rejection of all good-faith new editors’ first contributions has been rising steadily, and, accordingly, retention rates have fallen. What this means is that while just as many productive contributors enter the project today as in 2006, they are entering an environment that is increasingly challenging, critical, and/or hostile to their work. These latter findings have also been confirmed through previous research.
(I am struck by the fall in newbie survival rates for the highest-quality—‘golden’—editors in 2006–2007. The Seigenthaler affair was, recollect, November–December 2005.)
I suspected that Fung’s objective would not be reached, as indeed it was not.13
Remember, most measures are directed against casual users. Power users can navigate the endless processes, or call in powerful friends, or simply wait a few years.14 The most powerful predictor of whether an editor will stop editing is… how much they are editing.15 User:Resident Mario (joined 2008) points in his December 2011 essay “Openness versus quality: why we’re doing it wrong, and how to fix it”16 to a dramatic graph of editor counts.17
And it’s casual users who matter. We lost the credentialed experts years ago, if we ever had them. Surveys asking why are almost otiose; experts will contribute only if they are exceptional or if they are managing PR around a discovery. But Wikipedia is not Long Content; why would they contribute if they can get the traffic they desire just by inserting links18? Why would they build their intellectual houses on sand?19 They get the best of both worlds—gaining traffic and avoiding the toxic deletionists.
And we can see this quite directly: when the general population of editors is solicited to contribute to AfD, their !votes are different from the AfD regulars’, and in particular, when keep !voters spread the word about an AfD, their recruits are much more likely to !vote keep as well, while would-be deleters do their cause no favor with publicity20. Can there be any more convincing proof that deletionism and its manifestations are a cancer on the Wikipedia corpus?
The Editing Community Is Dead; Who Killed It?
Having discussed the broad trend of deletionism and problems with editors, let’s look at one specific deletionist practice which has, as far as I know, never been examined before, despite being a classic one—and, like most deletionist practices, one that by the numbers turns out to badly ill-serve both editors and readers: the practice of moving links from the External Links section to the Talk page.
The reason for my interest in this minor deletionist practice is that I no longer edit as much as I used to, so when I find an excellent citation (article, review, interview, etc.), I will often just copy it into the External Links section or (if I am feeling especially energetic) excerpt the important bits onto the article’s Talk page. I realized that this constitutes what one might call a “natural experiment”: I could go back and see how often the excerpts were copied by another editor into the article. This is better than just looking at “how often anime editors edit” or “how often anime articles are edited” because it is less related to outside events—perhaps anime news was simply boring over that period, or perhaps some new bots or scripts were rolled out. Whereas if there are no anime editors who will edit even when presented with gift-wrapped RSs (links & excerpts specifically called out for their attention, and trivially copy-pasted into the article), then that’s pretty convincing evidence that there is no longer a ‘there’ there—that the editors are no longer active.
Sins of Omission: Experiment 1
On at least two articles (Talk:Gurren Lagann#Interviews & Talk:Royal Space Force: The Wings of Honnêamise#Sources), I have been strenuously opposed by editors who object to having more than a handful of links in the designated External Links section; they acknowledged the links were (mostly) all undoubted RSs and relevant to the article—but they refused to incorporate the links into the article. This is bad from every angle, yet few other editors were interested in helping me.
So I’ve begun going through my old mainspace Talk edits using Special:Contributions, starting all the way back in April 2007 (>4 years ago, more than enough time for editors to have made use of my gifts!), looking for cases where I’ve dumped such references. I compiled two lists, of 146 anime-related edits, and 102 non-anime-related edits.
Before going any further, it’s worth asking—to avoid hindsight bias and post hoc rationalization—what you expect my results to be.
When asking yourself, remember that these edits, and a larger set of edits we’ll soon examine, are selected edits; they are high-quality edits, ones where I thought the relevant article needed to cover the material. They are not low-quality dumps of text or links by a passing anonymous editor, or done out of idle amusement. What percentage would you expect to have been used after a week, enough time that most article-watchlisting editors will have seen the diff and had leisure to deal with a task more complex than reverting vandalism? 50% doesn’t seem like a bad starting point. How about after a year? Or two? Maybe 70% or 90%? After that, if it hasn’t been dealt with, it’s probably never going to be dealt with (even assuming the section hasn’t been stuffed into an archive page). Hold onto your estimate.
Once the lists were compiled and weeded, I wrote a Haskell program to do the analysis. The program loads the specified Talk page URLs and extracts all URLs from the Talk diff so it can check whether any of them were linked in the Article (which, incidentally, leads to false positives and an overestimation21).
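For concreteness, here is a minimal sketch of the core check; it is not the original program (which also fetched the pages from Wikipedia), and the file names are hypothetical stand-ins for a saved Talk diff and a saved article. As noted above, matching raw URL substrings overestimates successes:

```haskell
-- Minimal sketch of the analysis check (not the original program): given a saved
-- copy of a Talk-page diff and of the current article, did any URL posted on the
-- Talk page later make it into the article? Substring matching is crude and, as
-- noted, produces false positives (an overestimate of successes).
import Data.Char (isSpace)
import Data.List (isInfixOf, isPrefixOf, nub, tails)

-- Pull out anything that looks like an external URL.
extractUrls :: String -> [String]
extractUrls s = nub [ takeWhile urlChar t
                    | t <- tails s
                    , "http://" `isPrefixOf` t || "https://" `isPrefixOf` t ]
  where urlChar c = not (isSpace c) && c `notElem` "\"'<>[]|"

-- An edit counts as 'used' if any of its URLs now appears in the article.
used :: String -> String -> Bool
used talkDiff article = any (`isInfixOf` article) (extractUrls talkDiff)

main :: IO ()
main = do
  talk    <- readFile "talk-diff.html" -- hypothetical saved Talk diff
  article <- readFile "article.html"   -- hypothetical saved article
  print (used talk article)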
Results
The results for my edits when run on the two lists:
- anime: of 146 edits, 11 were used, or <8%
- non-anime: 102 edits, 3 used, or <3%
For comparison, we can look at an editor who has devoted much of her time to finding references for anime articles—but made the colossal mistake of believing the EL partisans when they said external links should either be incorporated into article text or listed on the talk page. User:KrebMarkt has made perhaps thousands of such edits from impeccable RSs; it is possible that my own contributions are skewed downwards, say, by a congenital inability to select good references. Hence, looking at her reference-edits will provide a cross-check.
I compiled her most recent 1000 edits to the article talk space with a quick download: elinks -dump 'https://en.wikipedia.org/w/index.php?title=Special:Contributions&limit=500&contribs=user&target=KrebMarkt&namespace=1' 'https://en.wikipedia.org/w/index.php?title=Special:Contributions&offset=20110227162151&limit=500&contribs=user&target=KrebMarkt&namespace=1' | grep '&diff='. Then I manually removed edits which were minor or did not seem to be her usual reference-edits, resulting in the following list of 958 edits from December 2010 to December 2011. (KrebMarkt almost exclusively adds anime-related references, so I did not prepare a non-anime list.) The results:
- Of the 958 edits adding references, 36 were used in the article, or <4%
- Combining my anime & non-anime with KrebMarkt’s edits, we have 1206 edits adding references, of which less than 50 were used in the article, or <4.15%
Besides it being surprising that KrebMarkt (not a particularly committed inclusionist, if she be an inclusionist at all) had a success rate half mine, <4.15% is shockingly low.
1156 ignored edits represents a staggering waste of editor-time22. This cannot be explained as our faults: we are both experienced editors (I began editing in 2004, and KrebMarkt in 2008), who know what good RSs are. And all of the edits contain good RSs. (The reader is invited to check edits and see for himself whether they are solid and valuable RSs, like reviews by the Anime News Network.) That perhaps 1⁄10 of our suggested references are included is due solely to the apathy or nonexistence of other editors. (If such a rate is a ‘success’, may the Almighty preserve us from a failure!)
Since that will not soon change for the better, this leads to one conclusion: the idea that references hidden on Talk pages will one day be used is false.
Sins of Omission: Experiment 2
Somebody remarked: ‘I can tell by my own reaction to it that this book is harmful.’ But let him only wait and perhaps one day he will admit to himself that this same book has done him a great service by bringing out the hidden sickness of his heart and making it visible.
We have looked at what suggesting additions results in: abject failure. The Wikipedia community is failing at incorporating new links. Some attempted to explain away the results of the experiment above: it’s OK because at least the existing External Links sections are quality sections. This is desperate special pleading, but we should test it. How is the editing community at the flip side of the coin—retaining old links? If inclusionists’ suggestions are being ignored, is this at least fairly applied, with deletionists’ edits also futile?
Unfortunately, testing this requires destructive editing. (We can’t simply suggest on talk pages that external links be removed because that is both not how deletionists operate and likely will result in no changes, per the previous experiment demonstrating inaction on the part of editors.)
The procedure: remove random links and record whether they are restored to obtain a restoration rate.
- Editors might defer to other editors, so I will remove links as an anonymous IP user from multiple proxies; the restoration rate will naturally be an underestimate of the damage a registered editor would be able to commit, much less a tendentious deletionist.
- To avoid issues with cherry-picking or biased selection of links23, I will remove only the final external link on pages selected by Special:Random#External_links which have at least 2 external links in an ‘External links’ section, and where the final external link is neither an ‘official’ link nor template-generated. (This avoids issues where pages might have 5 or 10 ‘official’ external links to various versions or localizations, all of which an editor could confidently and blindly revert the removal of; template-generated links also carry imprimaturs of authority. See the sketch after this list for the selection rule in code.)
- The edit summary for each edit will be “rm external link per [[WP:EL]]”—which has the nice property of being meaningless to anyone capable of critical thought (by definition, a link removal should be per one of WP:EL’s criteria—but which criterion?) while also looking official, like many deletionist edit summaries. This point is very important. We are not interested in “vandalism in general”, nor “all possible forms of external link vandalism” (like adding spam links, inserting gibberish, breaking syntax), but in bad edits which mimic how a deletionist would edit. A deletionist would avoid certain links, and would be sure to make some allusion to policy. (Shades of Poe’s law: it is impossible to distinguish an actual deletionist’s edits from random deletions accompanied by repetitive jargon.) If our experiment does not mimic these traits, our final measurement of the bad-edit reversion rate will simply not be measuring what we hoped to measure.
- To avoid flooding issues and be less noticeable, no more than 5 or 10 links a day will be removed, with at least 1 minute between each edit.
- To avoid building up credibility, I will not make any real edits with the anonymous IPs.
- After the last of the 100 links has been removed, I will wait 1 month (long enough for the edits to drop off all watchlists and for reversion rates to become close to nonexistent24) and restore all links. I predict at least half will not be restored, and certainly not more than 90%.
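To make the selection rule concrete, here is a minimal sketch (my own illustration, not a tool used in the experiment) that checks whether a locally saved copy of a page’s wikitext qualifies under the criteria above; the file name is a hypothetical stand-in:

```haskell
-- Minimal sketch of the selection rule: does the page have an 'External links'
-- section with at least 2 bulleted links, whose last link is neither an
-- 'official' site nor generated by a template?
import Data.Char (toLower)
import Data.List (isInfixOf, isPrefixOf)

-- Everything between the 'External links' heading and the next heading.
elSection :: String -> [String]
elSection = takeWhile (not . isHeading) . drop 1 . dropWhile (not . isElHeading) . lines
  where isHeading l   = "==" `isPrefixOf` dropWhile (== ' ') l
        isElHeading l = isHeading l && "external links" `isInfixOf` map toLower l

-- Bulleted external links conventionally look like '* [http...' or '* {{...'.
isLinkLine :: String -> Bool
isLinkLine l = "*" `isPrefixOf` l && ("[http" `isInfixOf` l || "{{" `isInfixOf` l)

qualifies :: String -> Bool
qualifies wikitext = length ls >= 2 && not (official lastL) && not (templated lastL)
  where ls        = filter isLinkLine (elSection wikitext)
        lastL     = last ls
        official  = isInfixOf "official" . map toLower
        templated = isInfixOf "{{"

main :: IO ()
main = readFile "page.wikitext" >>= print . qualifies -- hypothetical saved page
```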
The full list of URL diffs is available as an appendix.
After finishing the link removals, I briefly looked over the contribution pages, checking each edit for the (top) marker, which specifies whether an edit is still the latest edit for that page (all reverted removals will by definition no longer be the latest edit, but some non-reverted edits will have unrelated later edits stealing that status, so the number gives an upper bound on how many removals were reverted). It looked like <10%.
I was also struck during the process of going through Special:Random by how many ‘External Links’ sections have been, in wretched subterfuges, renamed ‘Sources’, ‘References’, ‘Further reading’, or the article has a long References section stuffed with external links which are used once; perhaps editors collectively know that putting a link into a section named ‘External Links’ is painting a cross-hair on its forehead. Too, I was struck by the general quality of the links: of the 100, I would have assented to the removal of no more than 5 (10 at the most). In general, articles err far on the side of including too few external links rather than too many.
How many readers were affected by my experiment over the course of the month of waiting? Feel free to estimate or give a range—1,000 or 10,000 or maybe 100,000 readers? The articles are randomly picked, so it seems highly unlikely that there is significant overlap. But my best estimate, based on stats.grok.se data for the 100 articles’ traffic in March 2012, is that somewhere around >~335,000 readers were affected25.
How many editors were affected? The 100 articles edited were watchlisted by a median of 5 editors each; unfortunately, in the absence of technologies like Patrolled Revisions, we cannot estimate how many times each edit was actually checked by a human (many of those editors are no doubt inactive or do not monitor their watchlists closely).
What was the early reaction when I mentioned this experiment? Ian Woollard said:
…if you’d have picked something other than external links, that might, or might not have been a good test.
Last time I checked (which admittedly was a while ago) Wikipedia had a noticeboard whose entire purpose, was essentially to delete as many external links as possible, they’d even added a policy that said they could do that in every single case unless you could get a majority in a poll to keep individual links; oh and in practice they pretty much !vote-stuffed those polls too by announcing the polls on the noticeboard, so the chances of a clear majority was low. Oh, and there was a bunch of shady anonymous IPs involved as well that swing around after the fact to edit war them away anyway if an external link they didn’t favor gets through all that.
Basically, external links are one of the most hated parts of Wikipedia, and if hardly any of them got fixed it wouldn’t surprise me, and wouldn’t prove anything very much.
Exaggeration? Well, consider what the active administrator User:Future Perfect at Sunrise wrote in the WP:AN/I discussion:
Hmm, strange experiment. Given the huge number of inappropriate external links we have, I really wonder: wouldn’t a random removal of a hundred links catch so many bad links objectively worthy of removal that the net effect of the “vandalism” might be more benefit than harm? If the experiment is meant to measure how good the community is at reverting vandalism, I can’t see how they can do that without having a measure for these random beneficial hits.
None of the commenters rose to my challenge to estimate what the reversion rate should be, with the exception of the administrator User:Horologium (who identifies as a transwiki-ing exclusionist26, which in practice means deletionism), who looked at 19 articles and estimated that ~30% of ELs were bad by his standards (so we can infer that a reversion rate of anything but ~70% will very likely either be allowing good links to be deleted or be defending bad links, by his standards).
Results
- Neither IP address was contacted at any point in the experiment, blocked, or banned.
- One article was deleted; my edit was not reverted before deletion (according to the admin Toby Bartel).
- Of the 100 edits, 3 were reverted.
3% is far worse than I had predicted, and statistically suggests that the true rate is no higher than 7%27. This leads to one conclusion: external links are highly vulnerable to deletionism.
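For anyone who wants to check that bound, here is one rough way to compute it (an exact one-sided binomial bound; the footnote may well have used a different method): find the largest true reversion probability under which 3 or fewer reversions in 100 removals would still be unsurprising.

```haskell
-- A rough check of the statistical claim (not necessarily the method used in the
-- footnote): with 3 reversions observed out of 100 removals, find the largest
-- true reversion probability p for which seeing 3 or fewer reversions would
-- still have at least a 5% chance.
binomCdf :: Int -> Int -> Double -> Double
binomCdf n k p = sum [ fromIntegral (choose n i) * p ^ i * (1 - p) ^ (n - i) | i <- [0 .. k] ]
  where choose m r = product [m - r + 1 .. m] `div` product [1 .. r]

upperBound :: Int -> Int -> Double
upperBound n k = last (takeWhile (\p -> binomCdf n k p >= 0.05) [0, 0.001 .. 1])

main :: IO ()
main = print (upperBound 100 3) -- comes out around 0.07-0.08
```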
Followup
A month after this experiment, I resurveyed the 100 edits to see how many restorations had been reverted. 4 had been reverted:
- Castell Dinas Bran
- Protector (2009 film) (no explanation)
- Osprey Publishing (part of a wholesale deletion of links)
- Marilyn vos Savant (this one is questionable as well; linking to the Parade homepage seems distinctly less useful to the reader than linking to Parade’s back-archives where vos Savant’s columns are…)
Those who think that 3% was the correct reversion rate for the removals are invited to explain how 4% could be the correct reversion rate for the re-adding of the same links—if it was acceptable for 97% to be removed in the first place, how could it also be acceptable for 96% to then be restored?
Tallying the Damage
Ignoti, Sed Non Occulti
One might try to defend this wasteful practice by claiming that some editors and readers will go to the Talk page, and there might notice and visit the deleted links. This could only ameliorate the problem slightly, but it’s worth investigating just how rarely Talk pages are visited, so we can explode this particular instance of the ‘fallacy of the invisible’. How many of our readers actually look at the talk page as well? (Do a quick estimate, as before, so you can know if you were right or wrong, and by how much.) I know some writers writing about Wikipedia have mentioned or rhapsodized at length on the interest of article talk pages, but they are rare birds and statistically irrelevant.
It might be enough simply to know how much traffic to talk pages there is, period. I doubt editors make up much of Wikipedia’s traffic, with the shriveling of the editing population, which never kept pace with the growth into a top 10/20 website, so that would give a good upper bound. It would seem to be very small; there’s not a single Talk page in the top 1000 of stats.grok.se’s top articles. We can look at individual articles; Talk:Anime has 273 hits over one month while the article Anime has 128,657 hits (a factor of 471); or Talk:Barack Obama with 1800 over that month compared to Barack Obama with its 504,827 hits (a factor of 280).
The raw stats used by stats.grok.se are available for download, so we can look at all page hits, sum all article and all Talk hits, and see what the ratio is for the entire English Wikipedia on one day. (Each file seems to be an hour of the day, so I downloaded 24 and gunzipped them all.) We do some quick shell scripting. To find the aggregate hits for just talk pages:
grep -e '^en Talk:' -e '^en talk:' pagecounts-* | cut -d ' ' -f 3 | paste -sd + | bc
582771
To find aggregate hits for non-talk pages:
grep -e '^en ' pagecounts-* | grep -v -e '^en Talk:' -e '^en talk:' | cut -d ' ' -f 3 | paste -sd + | bc
202680742
The numbers look sane—582,771 for all talk page hits versus 202,680,742 for all non-talk page hits. A factor of 347 is pretty much around where I was expecting based on those previous 2 pages. The traffic-data developer, Domas, says the statistics exclude API hits but include logged-in editor hits, so we can safely say that anonymous users made far fewer than those 582,771 talk page views that day, and hence the true ratios are worse than our previous ratios of 471/280/347. To put the relative numbers into proper perspective, we can convert into percentages (a quick recomputation follows the list):
- If we take the absolutely most favorable ratio, Obama’s at 280, and then further assume the talk page was looked at by 0 logged-in users (yeah right), then that implies something posted on its talk page will be seen by <0.36% of interested readers (1,800⁄504,827 ≈ 0.0036).
- If we use the aggregate statistic and say, generously, that registered users make up only 90% of the talk page views, then something on the talk page will be seen by <0.03% of interested readers (10% × 582,771⁄202,680,742 ≈ 0.0003).
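For the record, here is the arithmetic behind those ratios and percentages, recomputed from the quoted hit counts:

```haskell
-- Quick recomputation of the ratios and percentages above, from the quoted hit counts.
main :: IO ()
main = do
  let ratio talk article = article / talk :: Double
  print (ratio 273 128657)                          -- Talk:Anime vs Anime: ~471
  print (ratio 1800 504827)                         -- Talk:Barack Obama vs Barack Obama: ~280
  print (ratio 582771 202680742)                    -- all talk vs all non-talk pages: ~347.8
  print (100 * 1800 / 504827 :: Double)             -- best case: ~0.36% of readers see the Talk page
  print (100 * 0.10 * 582771 / 202680742 :: Double) -- aggregate, 10% anonymous: ~0.029%
```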
Measuring Talk Page Clicks: Dual N-back Experiment
Page views don’t tell us the most interesting thing, how many people would have clicked on the link if it had been on the article and not the Talk page. It’s impossible to answer this question in general, unfortunately, since Wikipedia does not track clicks.
However, I have approximated the ratio for at least one article: the dual n-back article links to my DNB FAQ. Google Analytics tells me there are a few dozen visitors each day from Wikipedia. What will happen if the link is moved from the article to the Talk page? The article and general interest in n-back haven’t changed—those variables are still the same. The same sort of people will be visiting the article and (not) visiting the Talk page. The visitor count will dramatically fall, probably to less than 1 a day. The link was in the article for perhaps half a year, since ~2011-07-14; on 2012-02-09, I shifted it to the Talk page with a fake message praising the contents, to mimic how an editor might genuinely post the link on the Talk page (asking the forbearance & cooperation of my fellow editors in hidden comments). I then scheduled a followup for 100 days later: 2012-05-19.
It ought to be trivial and pointless—everyone should acknowledge that essentially no readers also read Talk pages, but it’s still worth precommitting: I predict that Talk click-throughs will average <5% of Article click-throughs, and the difference between the 2 datasets will be statistically-significant at p < 0.05.
As promised, on 2012-05-20 I restored my FAQ link and began analysis:
- Before: between 2011-07-14 and 2012-02-08 (a longer period), the totals were 31,454/23,538 (pageviews/unique pageviews), with 1,910/1,412 from the English Wikipedia and, as one would expect, a lesser 740/618 from the German Wikipedia28. n = 209 days, so the daily average click-through from the English Wikipedia is 1,910⁄209 ≈ 9.1.
- After: between 2012-02-10 and 12:50 PM 2012-05-20, my DNB FAQ received from all sources 21,803/16,899 page views (raw/unique). 327/164 page views were from the German Wikipedia, and there were 161/155 page views from the English Wikipedia. n = 100 days, so the daily average is 161⁄100 ≈ 1.6.
Dividing the two averages shows that Talk-page click-throughs in this period ran at ~17.6% of the earlier Article click-throughs (1.61⁄9.14), not <5% as I had predicted. This difference between the two groups is statistically-significant at p < 0.001, needless to say29.
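As a quick sanity check on that significance claim (one simple test of my own choosing, not necessarily the one behind the footnote), we can ask how surprising 161 ‘after’ clicks would be if the link had kept drawing clicks at its old daily rate:

```haskell
-- One crude way to check the significance claim: under the null hypothesis that
-- the link drew clicks at the same daily rate before and after the move, each of
-- the 1,910 + 161 observed clicks independently lands in the 100-day 'after'
-- window with probability 100/309. A normal approximation gives the z-score.
main :: IO ()
main = do
  let before = 1910 :: Double  -- English-Wikipedia clicks over 209 days (link in the article)
      after  = 161  :: Double  -- English-Wikipedia clicks over 100 days (link on the Talk page)
      total  = before + after
      p0     = 100 / (209 + 100)                            -- expected 'after' share under the null
      z      = (after - total * p0) / sqrt (total * p0 * (1 - p0))
  print z                                                   -- about -24, i.e. p far below 0.001
```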
So, Talk page click-throughs are indeed lower than Article click-throughs, but more than 3 times larger than I expected. What happened? We know this can’t be the general case from looking at the stats.grok.se data—there just isn’t enough traffic to Talk pages for any reason.
My best guess is that the dual n-back article is simply a bad example. If we look at the April 2012 data, we see that it gets something like 15 page views a day with occasional spikes and troughs: 568 visits over 30 days, averaging 19 visits a day. There were ~9 click-throughs a day on average during the earlier sample—suggesting that something like half the readers were clicking through to one external link! This does not sound like “normal” article behavior, and suggests to me that the very short and incomplete nature of the dual n-back Wikipedia article is causing readers to look for further and better information like my FAQ, which might also cause them to resort to checking the talk page for information (where they would run into my glowing fake blurb, visible on the first screen). Unfortunately, I cannot check this theory because currently only one article links to my site where I can gather Google Analytics information.
The Forgotten Reader
More instructive is estimating how many readers have been deprived of the chance to use the references for just the subset of 1206 edits we have already looked at above. We can reuse stats.grok.se with a little more programming; we will ask it how many hits/page-views, in total, there were in November 2011 of the 472 unique articles covered by those 1206 edits.
The total: 8,480,394.
Extrapolating backwards to 2007/2008 is left as an exercise for the reader.
When we consider how false the idea is that this practice serves editors, and how many readers are ill-served, the conclusion suggests itself: the common practice of ‘moving a reference/link to the Talk page’ should be named for what it is: a subtle form of deletion.
It would be a service to our readers to end this practice entirely: if a link is good enough to be hidden on a Talk page (supposedly in the interests of incorporating it in the future, which we have seen is an empty promissory note), then it is good enough to put at the end of External Links or a Further Reading section, and the literally millions of affected readers will not be deprived of the chance to make use of them.
I fully expect to see this practice for years to come.
No Club That Would Have Me
Elaborate euphemisms may conceal your intent to kill, but behind any use of power over another the ultimate assumption remains: ‘I feed on your energy.’
Frank Herbert’s Dune Messiah (“Addenda to Orders in Council—The Emperor Paul Muad’dib”)
This result will come as no surprise to longtime inclusionists. The deletion process deletes most articles which enter it, and has long been complained about by outsiders. Entire communities (such as the web comics30 or MUD online communities31) have been alienated by purges of articles—purges which not infrequently result in abuse of process, much newbie biting, and comical spectacles like AfD regulars (usually deletionists) insisting a given article is absolutely non-notable and experts in the relevant field demurring; a particularly good AfD may see statements of experts dismissed on speciously procedural grounds such as having been made in the expert’s blog (and so failing WP:RS, or perhaps simply being dismissed as WP:OR) and not a traditional medium (despite the accelerating abandonment of ‘traditional’ RSs by experts in many fields32). The trend has been clear. Andrew Lih, who has been editing Wikipedia even longer than myself (since 2003) and who wrote a book on Wikipedia, writes in “Unwanted: New articles in Wikipedia”:
’It’s incredible to me that the community in Wikipedia has come to this, that articles so obviously “keep” just a year ago, are being challenged and locked out. When I was active back on the mailing lists in 2004, I was a well known deletionist. “Wiki isn’t paper, but it isn’t an attic,” I would say. Selectivity matters for a quality encyclopedia. But it’s a whole different mood in 2007. Today, I’d be labeled a wild eyed inclusionist. I suspect most veteran Wikipedians would be labeled a bleeding heart inclusionist too. How did we raise a new generation of folks who want to wipe out so much, who would shoot first, and not ask questions whatsoever? [If Lih can write this in 2007, you can imagine how people who identified as inclusionists in 2004, such as myself or The Cunctator, look to Wikipedians who recently joined.]
It’s as if there is a Soup Nazi culture now in Wikipedia. There are throngs of deletion happy users, like grumpy old gatekeepers, tossing out customers and articles if they don’t comply to some new prickly hard-nosed standard. It used to be if an article was short, someone would add to it. If there was spam, someone would remove it. If facts were questionable, someone would research it. The beauty of Wikipedia was the human factor—reasonable people interacting and collaborating, building off each other’s work. It was important to start stuff, even if it wasn’t complete. Assume good faith, neutral point of view and if it’s not right, {{sofixit}}. Things would grow.’
I was particularly depressed to read in the comments things from administrators whose names I recognize due to their long tenure on Wikipedia, like Llywrch (joined 2002):
“I’m sorry that you encountered that, Andrew—but not surprised. I had my own encounter with the new generation of “quote policy, not reasoning” deletionists; I feel as if I encountered (to quote from the song) “the forces of evil from a bozo nightmare.” No one—including me—looked good after that exchange. (I keep thinking that I should have said something different, but the surrealism of the situation multiplied with the square of my frustration kept me from my best.)”
Or Stbalbach:
“I’m a long time editor, since 2003, ranked in the top 300 by number of edits (most in article space). On May 11th 2007 I mostly gave up on Wikipedia—there is something wrong with the community, in particular people deleting content. I’d never seen anything like it prior to late 2006 and 2007. Further, the use of “nag tags” at the top of articles is out of hand. It’s easier to nag and delete than it is to research and fix. Too many know-nothings who want to “help” have found a powerful niche by nagging and deleting without engaging in dialog and simply citing 3 letter rules. If an user is unwilling or incapable of working to improve an article they should not be placing nag tags or deleting content.”
Also interesting is Ta bu shi da yu’s comment, inasmuch as Ta bu invented the infamous {{fact}}:
“I have also seen this happening. It’s incredible that those who are so incredibly stupid can get away with misusing the speedy deletion tag! As for DRV… don’t make me laugh. It seems to be slanted to keep articles deleted. I can’t agree more with your sentiments that if you know all the codes to WP:AFD, then you are a menace to Wikipedia.”
Why is this culture changing? In part because article writing seems to get no more respect. A review article summarizes the findings of Burke & Kraut 200833:
…it is proving increasingly hard to become a Wikipedia administrator: 2,700 candidates were nominated between 2001 and 2008, with a success rate of 53%. The rate has dropped from 75.5% until 2005 to 42% in 2006 and 2007. Article contribution was not a strong predictor of success. The most successful candidates were those who edited the Wikipedia policy or project space; such an edit is worth ten article edits.
What sort of editor, with a universe of fascinating topics to write upon, would choose to spend most of his time on the policy namespace? What sort of editor would choose to stop writing articles?34 Administrators with minimal experience in creating content—and much experience in destroying it and rewriting the rules to permit the destruction of even more. Is this not almost the opposite of what one wants? And imagine how the authors must feel! An article is not a trivial undertaking; sometime, sit down, select a random subject, and try to write a well-organized, fluent, comprehensive, and accurate encyclopedia article on it. It’s not as easy as it looks, and it’s even harder to write a well-referenced and correctly formatted one. To have an article deleted is bad enough; I can’t imagine any neophyte editors wanting to have anything to do with Wikipedia if an article of theirs got railroaded through AfD. It is easier to destroy than to create, and destruction is infectious. (In the et al 2012 study of 3.3 years of the online SF game Pardus, players were found to ‘pay it forward’ when on the receiving end of negative actions; the community was only saved from an epidemic of attacks by the high mortality & quitting rate of negative editors—I mean, negative players35.)
Deleting articles and piling on policy after guideline after policy are both directly opposed to why Wikipedians contribute! When surveyed in 2011:
The two most frequently selected reasons for continuing to edit Wikipedia were “I like the idea of volunteering to share knowledge” (71%) and “I believe that information should be freely available to everyone” (69%), followed by “I like to contribute to subject matters in which I have expertise” (63%) and “It’s fun” (60%).
And ironically, the more effort an editor pours into a topic and the longer & more detailed the article becomes, the more blind hatred it inspires in deletionists. If you look at AfDs for small articles or stubs, the deletionists seem positively lucid & rational; but make the article 50kB long, and watch the rhetoric fly. I call this the fancruft effect: deletionists are mentally allergic to information they do not care about or like.
If a deletionist sees an article on “Lightsaber combat”36 and it’s just a page long, then he has little problem with it. It may strike him as too big, but reasonable. But if the article dares to be comprehensive, if it is clearly the product of many hours’ labor on the part of multiple editors, if there are touches like references and quotes—then something is wrong on the Internet, the very universe is out of joint that this article has been so well-developed when so many more deserving topics languish, it is a cosmic injustice. A dirty beggar is parading around acting like an emperor. The article does not know its place. It needs to be smacked down and hard. And who better than the deletionist?
What is the ultimate status-lowering action which one can do to an editor, short of actually banning or blocking them? Deleting their articles.
In a particular subject area, who is most likely to work on obscurer articles? The experts and high-value editors—they have the resources, they have the interest, they have the competency. Anyone who grew up in America post-1980 can work on [[Darth Vader]]; many fewer can work on [[Grand Admiral Thrawn]]. Anyone can work on [[Basho]]; few can work on [[Fujiwara no Teika]].
What has Wikipedia been most likely to delete in its deletionist shift over the years? Those obscurer articles.
The proof is in the pudding: all the high-value/status Star Wars editors have decamped for somewhere they are valued; all the high-value/status Star Trek editors, the Lost editors… the list goes on. They left for communities that respected them and their work more; these specific examples are striking because the editors had to build those communities themselves, but one should not suppose such departures are limited to fiction-related articles. There may be evaporative cooling of the community, but it is not the obsessive fans who remain.
The greatest pleasure is to vanquish your enemies and chase them before you, to rob them of their wealth and see those dear to them bathed in tears, to ride their horses and clasp to your bosom their wives and daughters.
Attributed to Genghis Khan
Outsiders! I realize it might sound like a stretch that anyone enjoys the power of nominating articles, that being a deletionist could be a joyful role. You say you understand how administrators (with their ability to directly delete, to ban, to rollback etc.) could grow drunk on power, but how could AfD nominations lead to such a feeling?
But I know from personal experience that there is power exercised in nominating for deletion. Well do I know the dark arts of gaming the system: of the clever use of templates, of the process of deleting the article by carefully challenging and removing piece after piece, of invoking the appropriate guidelines and policies to demolish arguments and references.
I have seen the wails and groans in the edit summaries & comments of my opponents, and exulted in their defeat. It’s very real, the temptation of exercising this power. It’s easy to convince yourself that you are doing the right thing, and merely enforcing the policies/guidelines as the larger community set them down. (Were all my nominations just? No, but I have succeeded in fooling myself so well that I can no longer tell which ones truly did deserve deletion and which ones were deleted just because I disliked them or their authors.)
Who can say how many authors take it personally? The deletion process is inherently insulting: “Out of 2.5 million articles, yours stands out as sucking so badly that it is irredeemable and must be obliterated.” And it is ultimately sad37—life is short but must that be true of articles as well as men?
A Personal Look Back
Once more and they think to thank you.
As mentioned I have ~100k edits on the English Wikipedia, so I think I can speak from first-hand experience here.
The problem with devoting this much effort to Wikipedia is not that your time is wasted. If you get this far, you’ve absorbed enough that you know how to make edits that will last and how to defend your material—and this guy in particular makes his edits in particularly academic areas, safe from deletionists—and your articles will receive hundreds or thousands of visits a month (see stats.grok.se—I was a little shocked at how many page hits my articles collectively represent each month).
The problem is that the benefits are going entirely to your readers. It’s a case-study in positive externalities. Unlike FLOSS or other forms of creation which build a portfolio, you don’t even get intangibles like reputation—to the extent any reader thinks about it, they’ll just mentally thank the Wikipedia collective. When you make 10,000 edits to your personal wiki, you will probably have written some pretty decent stuff, you will have established a personal brand, etc. Maybe it’ll turn out great, maybe it’ll turn out to be worth nothing. But when you make 10,000 edits to Wikipedia, you are guaranteed to get nothing.
No doubt one can point to the occasional Wikipedia editor who has benefited with a book contract or a job or something. But what about all the other editors in Wikipedia:List of Wikipedians by number of edits?
To again turn to myself; when I was pouring much of my free energy and research interest into improving Wikipedia, I got nothing back except satisfaction and being able to point people at better articles during discussions. I began writing things that didn’t fit on Wikipedia and got a personal website because I didn’t want to use some flaky free service, and the world didn’t end. I now have an actual reputation among some people; on occasion, people even email me with job offers to write things, having learned of me from my website. I owe my current (very modest) living to my writings being clearly mine, and not “random stuff on Wikipedia”. I’m not saying any of this is very impressive, but I am saying that these are all benefits I would not have received had I continued my editing on Wikipedia. Now I occasionally add external links, and I try to defend articles I previously wrote. Once in a blue moon I post some highly technical or factual material I believe will be safe against even hardcore deletionists. But my glory days are long over. The game is no longer worth the candle.
Wikipedia is wonderful, but it’s sad to see people sacrificing so much of themselves for it.
What Is To Be Done?
Wikipedia was enabled by software. It enabled a community to form. This community did truly great work; it’s often said Wikipedia is historic, but I think most people have lost sight of how historic Wikipedia is as it fades into the background of modern life; perhaps only scholars of the future will have enough perspective on this leviathan, in the same way that Diderot’s encyclopedia was—for all the controversy and banning—not given its full due at publication. (But how could it be? Encyclopedias are more processes than finished works, and of no encyclopedia is this more true than Wikipedia.)
That community did great work, astonishing in breadth and depth, I said. But that community is also responsible for misusing the tools. If vandalism is easier to remove than to create, then it will tend to disappear. But AfD is not vandalism. There are no technical fixes for deletionist editors. As long as most editors hold weak views, are willing to stand by while ‘nerdy’ topics feel the ax, and think ‘deletionists mostly get it right’, the situation will not change.
Could deletion be a positive feedback cycle? Will the waves of deletion continue to encourage editors to leave, to not sign up, to let the deletionists continue their grisly work unopposed, until Wikipedia is a shell of what it was?
Like the cooling dwarf star left by a supernova—its lost brilliance traveling onwards to eternity.
See Also
External Links
-
- “A Group Is Its Own Worst Enemy”, 2003
- Cognitive Surplus, 2010 (talk transcript)
-
Discussion:
- “The great decline in Wikipedia pageviews (full version)” (March 2015)
- “Revisiting ‘The Rise and Decline’ in a Population of Peer Production Projects [769 wikis]”, et al 2018 (discussion)
- “A Large-Scale Characterization of How Readers Browse Wikipedia”, et al 2021
-
From “Digital Filters II” in The Art of Doing Science and Engineering, Richard W. Hamming 1997:
This is exactly the same mistake which was made endlessly by people in the early days of computers. I was told repeatedly, until I was sick of hearing it, computers were nothing more than large, fast desk calculators. “Anything you can do by a machine you can do by hand.”, so they said. This simply ignores the speed, accuracy, reliability, and lower costs of the machines vs. humans. Typically a single order of magnitude change (a factor of 10) produces fundamentally new effects, and computers are many, many times faster than hand computations. Those who claimed there was no essential difference never made any significant contributions to the development of computers. Those who did make significant contributions viewed computers as something new to be studied on their own merits and not as merely more of the same old desk calculators, perhaps souped up a bit.
-
Yvain, “Beware Trivial Inconveniences”. The connection to Wikipedia is obvious.
-
Abstract of “Citation Advantage of Open Access Legal Scholarship”, 2011. Are legal scholars lazy? Are law libraries ill-funded? Do legal scholars have little incentive to write well-researched papers? And yet, making papers a little easier to access results in a dramatic difference in citation.
2010 surveyed 31 studies and found 27 showing benefits to OA. For example, benefits to open access were found in biology by Gunther Eysenbach, and Steve Lawrence found similar results for computer science articles online vs. offline:
The mean number of citations to offline articles is 2.74, and the mean number of citations to online articles is 7.03, or 2.6 times greater than the number for offline articles. These numbers mask variations over time—in particular, older articles have more citations on average, and older articles are less likely to be online. When considering articles within each year, and averaging across all years 1990–2000, we find that online articles are cited 4.5 times more often than offline articles.
We also analyzed differences within each publication venue, where multiple years for the same conference are considered as separate venues. We computed the percentage increase in the average number of citations to online articles compared to offline articles. When offline articles were more highly cited, we used the negative of the percentage increase for offline articles. For example, if the average number of citations for offline articles is 2, and the average for online articles is 4, the percentage increase would be 100%. For the opposite situation, the percentage increase would be -100%. Figure 2 shows the results. Averaging the percentage increase across 1,494 venues containing at least five offline and five online articles results in an average of 336% more citations to online articles compared to offline articles published in the same venue [the first, second (median), and third quartiles of the distribution are 58%, 158%, and 361%].
-
On the other hand, one economics study showed no benefit, and et al 2007 found no benefit in one physics subfield:
Three non-exclusive postulates have been proposed to account for the observed citation differences between OA and non-OA articles: an open access postulate, a selection bias postulate, and an early view postulate. The most rigorous study to date (in condensed matter physics) showed that, after controlling for the early view postulate, the remaining difference in citation counts between OA and non-OA articles is explained by the selection bias postulate. No evidence was found to support the OA postulate per se; i.e. article OA status alone has little or no effect on citations. Further studies using a similarly rigorous approach are required to determine the generality of this finding.
-
Max Levchin, PayPal co-founder; pg 11, Founders at Work
-
Joel on Software, “FogBugz”
-
Joel on Software, “Strategy Letter III: Let Me Go Back!”
-
“Rate this Page” is Coming to the English Wikipedia, WMF blog
-
“The search queries that took Australian Internet users to Wikipedia”, 2011:
This exploratory study analyses the content of the search queries that led Australian Internet users from a search engine to a Wikipedia entry. The study used transaction logs from Hitwise that matched search queries with data on the lifestyle of the searcher. A total sample of 1760 search terms, stratified by search term frequency and lifestyle, was drawn…The results of the study suggest that Wikipedia is used more for lighter topics than for those of a more academic or serious nature. Significant differences among the various lifestyle segments were observed in the use of Wikipedia for queries on popular culture, cultural practice and science.
-
pg 136, “4.4 Demographic Analysis of the Wikipedia Community”
-
The Kaggle background information on the “Wikipedia’s Participation Challenge” includes an interesting extract from the WMF report:
“Between 2005 and 2007, newbies started having real trouble successfully joining the Wikimedia community. Before 2005 in the English Wikipedia, nearly 40% of new editors would still be active a year after their first edit. After 2007, only about 12-15% of new editors were still active a year after their first edit. Post-2007, lots of people were still trying to become Wikipedia editors. What had changed, though, is that they were increasingly failing to integrate into the Wikipedia community, and failing increasingly quickly.”
-
Rather than reaching 95k editors, the actual March-July 2012 numbers were 76,274/75,141/76,956/74,402/76,400. In retrospect, my pessimistic 75% prediction that 95k would not be reached was actually ludicrously optimistic, given that the 95k editor mark has never been reached: the high-water mark seems to have been March 2007 with 90,618 editors making >5 edits that month. So we have been shrinking by ~2.8k editors a year: ((91 - 77) / (2012 - 2007)) = 2.8.
-
The successful recreation of Mzoli’s article and the endless deletion debates about Daniel Brandt (crowned in success for the deletionists) again come to mind.
-
In June 2011, Kaggle and the WMF announced a “Wikipedia’s Participation Challenge” to develop a better statistical model for predicting editor retention; while the training data was biased, the results are not too surprising: the single best predictor is the frequency of any edits prior to the cutoff. See 2nd place, Ernest Shackleton or contestant Keith T. Herring:
A randomly selected Wikipedia editor that has been active in the past year has approximately an 85% probability of being inactive (no new edits) in the next 5 months. The most informative features (w/r/t the features I considered) captured both the edit timing and volume of an editor. More specifically the exponentially weighted edit volume of an user (edit weight decreases exponentially with increased time between the edit and the end of the observation period) with a half-life of 80 days provided the most predictive capability among the 206 features included in the model.
Other attributes of the edit history, such as uniqueness of articles, article creation, comment behavior, etc. provided some additional useful information, although roughly an order of magnitude or less than the edit timing and volume when measured as global impact across the full non-conditioned editor universe.
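A minimal sketch of such an exponentially-decayed edit-volume feature (my own illustration rather than Herring’s code; the editor’s edit dates and the cutoff below are hypothetical):

edit_weight <- function(edit_dates, cutoff, half_life=80) {
    age <- as.numeric(cutoff - edit_dates)   # days between each edit and the cutoff
    sum(0.5 ^ (age / half_life))             # recent edits count ~1, older edits decay towards 0
}
# hypothetical editor: three recent edits and one old one
edit_weight(as.Date(c("2010-08-01", "2010-08-20", "2010-08-30", "2009-01-15")), cutoff=as.Date("2010-09-01"))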
-
I disagree with parts of Mario’s essay; for example, his first example is wrong as there are countless articles to write from the sister wikis (Emijrp estimates in 2021 that WP has 6.2m articles out of 104.7m possible articles), and many specialist sources like The Encyclopedia of Science Fiction have hundreds or thousands of entries that Wikipedia does not (I counted a dozen or so just linking to the articles written by Jonathan Clements—eg. many of the biography redlinks in “Seiun Award” or “Nihon SF Taisho Award”.) And every day, sites like the Anime News Network or New York Times post dozens of reviews or other references that can be easily & profitably worked into articles—but aren’t.
One comment makes the good point that the theory of completeness would not predict any flatlining in the smaller and less complete wikis, yet we seem to observe a general flatlining.
However, his reason 2 is similar to my own theory about the Seigenthaler affair and the BLP reaction, and his reason 3 is my previous point about process & the invisible/broken-window fallacy.
-
Gardner’s December UK address contained other graphs worth looking at.
-
Which, as links to credentialed sources, will be uncontroversial and require little defense, vastly improving the ROI of editing Wikipedia. Wikipedia gets a great deal of traffic, and even highly obscure articles exert surprising influence; one can look at the traffic rates on specific pages with stats.grok.se.
-
To quote the great computer scientist Donald Knuth in 2006:
I think that Wikipedia’s enormously successful, but it’s so brittle, you know, if I was, if I spent a lot of time writing an article for the Wikipedia, and I wanted to make sure nobody screwed it up, I would have to check that article every day to make sure that it was still okay, and you know, after I’ve done that I want to move on and go on to other, other things in my life. With TeX, I wanted stability especially urgently because people are depending on it to be a fixed point that they can build on, so in that respect, I differ from the GNU Public License.
(The GPL contains clauses permitting users of GPLed code to use the terms of later versions of the GPL, which may fix any legal vulnerabilities or exploits discovered. This is a common practice among copyleft licenses; indeed, the WMF itself cross-licensed the entire set of Wikipedias and other projects from the GFDL to Creative Commons as well, based on a one-time provision GNU added at WMF’s request.)
-
“The Effects of Group Composition on Decision Quality in a Social Production Community”, et al2010, pg 7:
“We also found that there have been two bots (computer programs that edit Wikipedia)—BJBot and Jayden54Bot—that automatically notified article editors about AfD discussions and recruited them to participate per the established policy. These bots performed AfD notifications for several months, and offer us an opportunity to study the effect of recruitment that is purely policy driven. We use a process like one described above to detect successful instances of bot-initiated recruitment: if a recruitment bot edited an user’s talk page, and that user !voted in an AfD within two days, then we consider that user to have been recruited by the bot. Using the above processes, we identified 8,464 instances of successful recruiting. Table 2 shows a summary of who did the recruiting, and how their recruits !voted. We see large differences in !voting behavior, which suggests that there is bias in who people choose to recruit. (From these data we cannot tell whether the bias is an intentional effort to influence consensus, or the result of social network homophily [14].) Participants recruited by keep !voters were about four times less likely to support deletion as those recruited by delete !voters. The participants that bots recruited also appear unlikely to support deletion, which reflects the policy bias we observed earlier.
To see what effect participant recruitment has on decision quality, we introduce four binary variables: BotRecruit, NomRecruit, DeleteRecruit, and KeepRecruit. These variables indicate whether a bot, the AfD nominator, a delete !voter, or a keep !voter successfully recruited somebody to the group, respectively. Looking back to table 1, we find that regardless of the decision, none of the first three variables has a statistically-significant effect. On the other hand, when a keep !voter recruited someone to the discussion, we see a significant effect: delete decisions are more likely to be reversed. We offer two possible explanations: the first is that recruitment by keep !voters, biased as it may appear, is a sign of positive community interest, and suggests that the article should be kept. If the community decides otherwise and deletes the article, then decision quality suffers. An alternative explanation is that keep !voter recruitment is a sign of activism among those who prefer to keep the article. These proponents may be especially persistent in maintaining the article’s existence in Wikipedia, even if it requires working to reverse a delete decision.”
-
It was too hard to extract only the URL(s) being added by a diff, so the script simply extracts all URLs it can find in the diff part of the HTML; so if an editor made 4 edits adding URLs A, B, C, and D, and only A was actually incorporated into the article, then the script would, for each of the 4 edits, extract A-D, spot A in the article, and declare victory. This may account for KrebMarkt’s increased success rate compared to my edits, because she is accustomed to piling up her suggested links in one tidy section.
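A rough R re-creation of that check (not the original script; the diff ID and article title below are hypothetical):

urls_in <- function(url) {                                   # harvest every http(s) URL found in a page's HTML
    html <- paste(readLines(url, warn=FALSE), collapse="\n")
    unique(regmatches(html, gregexpr("https?://[^\"'<> ]+", html))[[1]])
}
diff_urls    <- urls_in("https://en.wikipedia.org/w/index.php?diff=123456789")   # all URLs visible in the diff
article_urls <- urls_in("https://en.wikipedia.org/wiki/Example_article")
any(diff_urls %in% article_urls)                             # any overlap at all gets counted as success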
-
I added a few links to Talk pages to time how long a KrebMarkt-style edit takes: going from the ANN page to a saved and reloaded page, checked by eye that the edit was correct, took upwards of 30 seconds. >30 seconds times 958 edits is >479 minutes, or >8 hours; my excerpting edits take at least 5 minutes to do, so those 248 edits represent >20 hours of work.
-
Someone might object that picking the last link in an External Link section is not random at all. I am reminded of an anecdote describing a court case over the draft lottery back during the Vietnam War, where the plaintiff’s lawyer argued that the little cage-and-balls method was not random and was unfair because the balls on top were much more likely to be selected. The judge asked, “Unfair to whom?”
As well, this methodology, while being quite as random as most methods, carries the usual advantages of determinism: anyone will be able to check whether I did in fact remove only last links which are not official or template-generated in External Link sections. This is evidence that I did not simply cherry pick the links that I thought were worst and so least likely to be restored.
(If I were going to cherry pick under this procedure, I would have had to invest a great deal more effort: for each removal, I would have to find multiple candidates each of which satisfied the criteria and only then could I pick the worst final link; and then I would have to start over for the next removal, and since I had to check ~10 random articles for a possible final link, this implies for every removal, I’d be looking at something like 40+ random articles to do one removal or 200+ random articles a day! And this deception would have to be deliberate & planned—while most cases of bias are unconscious.)
-
Some editors pride themselves on detecting vandalism weeks or months after creation; they are highly unusual. When I was spending time reading academic publications on Wikipedia a few years ago, a number of them dealt with quantifying vandalism and reversions; almost all vandalism was reverted within days, and reversions which took longer than a month were very rare (0-10%, to be very generous). This was why I chose to wait a month, because waiting longer added nothing. A week would have been adequate.
Relevant research on quantifying reversion rates over time:
- “Creating, Destroying, and Restoring Value in Wikipedia”, et al 2007
- “Measuring Wikipedia”, 2005
- “Studying Cooperation and Conflict between Authors with history flow Visualizations”, et al 2003
- “Detecting Wikipedia vandalism via spatio-temporal analysis of revision metadata?”, 2010
- “User Contribution and Trust in Wikipedia”, et al 2009
- “He says, she says: conflict and coordination in Wikipedia”, et al 2007
-
It’s not hard to estimate. Take the list of 100 diffs, and use an editor macro or a shell tool like sed to strip it down to a list of URL-encoded article names like so:

Castell_Dinas_Bran
Ron_O%27Neal
HUD_(video_gaming)
Protector_(2009_film)
...
Then, loop over the list to download the March 2012 summary page for that article, and filter out the total monthly hit-count (since we don’t care about dailies); example code:
$ for URL in `cat articles.txt`
  do
    elinks -dump "http://stats.grok.se/en/201203/$URL" | grep -F " has been viewed "
  done
[1]Castell_Dinas_Bran has been viewed 914 times in 201203.
[1]Ron_O'Neal has been viewed 7446 times in 201203.
[1]HUD_(video_gaming) has been viewed 7579 times in 201203.
...
This output is also easy to process with a macro or regexp, and once we have the monthly number for each article, all that remains is totaling them:
sum [914,7446,7579,542,3103,91,1665,5291,2452,102,272,3344,16214,32268,863,10307,476,
     3825,310,205,441,3028,187,94,115,211,207,522,269,182,1324,950,25660,162,14457,
     3881,200,3510,606,430,2048,164,214,136,77,8075,99,255,278,148,525,192,108,295,
     61,597,180,3491,753,527,766,113,1405,770,3683,288,873,26811,131,6625,93,212,
     538,313,7119,212,76,1130,7741,2136,179,263,632,870,714,338,2517,456,90,621,
     1323,316,1125,413,73223,122,12707,6573]
-- 335445
Note that this is probably an underestimate. It took weeks to remove all the links, doing it just 5 or 10 at a time, and the 30 day timer only started when link #100 was removed. So for link #1, something closer to 2 months passed…
-
His user page states as of 2012-05-19 under “My activities on Wikipedia” that
…My Wikipedia philosophy is quite complex, and defies easy categorization. My ideal for a more perfect Wikipedia would be to create many wikis for pop culture topics and transwiki many of the related articles on Wikipedia to them. (Some of these already exist in a fairly substantial format, such as Memory Alpha and Wookieepedia). I see no reason why all 703 episodes of the live-action Star Trek (and 17 of the 22 animated episodes) should have articles on Wikipedia, when Memory Alpha exists. (Before you go hating on me for that, note that I own all 720 episodes on DVD, as well as all but three of the movies.) This does not make me a deletionist, however. I also believe in structurism, and a combination of two opposing philosophies mergism and seperatism; merging in small articles rather than deleting them and separating large articles rather than deleting content. I also agree with the tenets of exclusionism, although that also leads back to transwikism again.
-
Since no one noticed the 100 removals were connected, we can assume each removal was statistically independent; this lets us calculate a binomial proportion confidence interval. Specifically, with 3 successes and 100 samples, the 99% confidence interval is 0-7%. We can derive this from Wolfram Alpha or one’s favorite statistical package if one doesn’t want to crunch the formula oneself.
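For example, in R (an exact Clopper-Pearson interval; the bounds may differ slightly from the 0-7% figure depending on the interval method used):

binom.test(3, 100, conf.level=0.99)$conf.int   # 99% CI for 3 successes out of 100 trials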
(Incidentally, Wikipedia has 3,960,143 articles as of 2012-06-01 according to Special:Statistics, and I went through perhaps 10 pages for each removal, so the total possible sample size is ~396,014. That 100 samples can give such a good estimate—as long as they are independent—is the same magic that makes things like opinion polls work; at least, as a child I found it magical that a sample of <1000 voters could predict so accurately the election results in a population of >300 million people.)
-
One might wonder why I had so much traffic to an English page; do that many Germans really know English? No, it turns out my link on their page didn’t come with an “English” warning. I added this warning on 2012-05-20, and while there was a major traffic spike after that and then a long outage June-September 2012 where the link was broken due to my own carelessness, the warning seems to have substantially reduced click-throughs according to my analytics.
-
It’s actually closer to p = 0.00000000000000022. Assuming one has cleaned up the two CSVs by removing the initial summary data and the final total line, the statistical analysis goes like this:
before <- read.table("https://gwern.net/doc/wikipedia/2012-gwern-dnb-wikipedia-before.csv", header=TRUE, sep=",")
after <- read.table("https://gwern.net/doc/wikipedia/2012-gwern-dnb-wikipedia-after.csv", header=TRUE, sep=",")
before$Pageviews
[1] 1 0 2 3 12 3 9 3 3 2 3 0 3 1 9 11 5 6 7 7 7 5 5 7 0
[26] 1 9 21 3 6 6 12 5 9 7 13 11 11 11 10 12 5 12 16 13 4 14 14 9 3
[51] 9 11 4 10 5 11 4 21 15 3 7 1 7 4 5 2 4 7 4 5 5 12 14 9 5
[76] 7 3 3 16 9 6 15 12 6 7 4 14 5 13 5 11 3 2 12 2 19 5 5 9 14
[101] 6 6 14 11 17 5 3 2 3 6 8 26 5 8 5 10 9 3 7 11 7 7 17 14 16
[126] 7 3 4 5 13 8 7 11 3 6 7 8 6 11 16 13 15 11 9 5 6 3 11 7 7
[151] 6 7 6 9 11 6 8 16 10 4 5 9 10 3 6 5 11 25 9 9 17 17 23 21 23
[176] 34 8 15 10 21 20 10 12 21 17 11 30 17 6 7 9 17 12 19 6 7 13 12 12 10
[201] 14 11 13 14 13 9 10 6 10 8
after$Pageviews
[1] 7 5 3 5 2 2 3 2 1 2 2 2 0 1 1 3 1 2 3 2 1 5 1 0 1 2 1 0 3 2 0 2 1 0 1 1 4
[38] 2 1 1 1 0 1 0 1 4 0 1 1 0 0 0 0 2 0 0 2 1 0 0 0 1 1 0 1 0 1 1 2 1 2 1 1 3
[75] 2 4 1 3 2 3 3 2 2 1 1 1 1 1 1 3 2 4 1 2 4 2 2 2 2 1 1
wilcox.test(before$Pageviews, after$Pageviews)

    Wilcoxon rank sum test with continuity correction
data: before$Pageviews and after$Pageviews
W = 20084, p-value < 2.2e-16
I’m not sure why R is reporting slightly different means than I listed previously, but the final result is not too surprising when you eyeball the data—this is a very large effect size. Specifically, the effect size as Cohen’s d is 1.28 (where 0.5 is described as “medium”, and >0.8 is “large”):
(mean(before$Pageviews) - mean(after$Pageviews)) / sd(append(before$Pageviews, after$Pageviews))
1.275841
-
See Slashdot’s “Call For Halt To Wikipedia Webcomic Deletions” for an overview.
-
Anime and manga are particularly bad. The American and Japanese anime bubbles of the 2000s popped, and with them went a flood of magazines and books—the economic reality has set in that they are simply not sustainable in a modern environment, which of course is very useful to deletionists who want to apply rigid universal norms to articles sans any context. This leads to odd situations like experts self-publishing; from Brian Ruh’s ANN column “The Ghost with the Most”:
This time, though, instead of a fictional book about the supernatural I’m going to be examining a nonfiction book about Japanese ghosts—Patrick Drazen’s A Gathering of Spirits: Japan’s Ghost Story Tradition: From Folklore and Kabuki to Anime and Manga, which was recently self-published through the iUniverse service. This is Drazen’s second book; the first one, Anime Explosion! The What? Why? & Wow! of Japanese Animation, came out in 2002 from Stone Bridge Press and was an introduction to many of the genres and themes that can be found in anime.
I think the switch from a commercial press to self-publication may indicate the direction English-language anime and manga scholarship may be heading in. A few years ago, when Japanese popular culture seemed like the Next Big Thing, there were more publishers that seemed like they were willing to take a chance on books about anime and manga. Unfortunately, as I know firsthand (and as I’ve heard from other authors, confirming that it’s not just me) these books didn’t sell nearly as well as anyone was hoping, which in turn meant that these publishers didn’t want to take risks with additional books along these lines. After all, all publishers need to make money in one way or another to stay afloat. In the last few years, the majority of books on anime and manga have been published by university presses, perhaps most notably the University of Minnesota Press. But I already gushed about them in my last column, so I’ll spare you from any additional public displays of affection.
However, this puts books like Drazen’s in an odd predicament. It’s not really an academic book, since it lacks the references and theories something like that would entail, which means it’s not a good candidate for an university press. However, since few popular presses have seen their books on anime and manga reflect positively on their bottom lines, there aren’t many other options these days other than self-publishing. Of course, these days publishing a book on your own doesn’t have nearly the same connotations it did decades ago, when vanity presses were the domain of those with more money (and ego) than sense. These days you can self-publish a quality product, get it up on Amazon for all to see, and (if you’re savvy about these things) perhaps even make a tidy profit.
-
‘Taking Up the Mop: Identifying Future Wikipedia Administrators’, Moira Burke and Robert Kraut, in Proceedings of the Conference on Human Factors in Computing Systems, Florence, Italy, 5-10. April 2008, pp. 3441-6
-
From “Cultural Transformations in Wikipedia or ‘From Emancipation to Product Ideology’: An Interview with Christian Stegbauer”, collected in A Wikipedia Reader:
“Our 2006 research [Christian Stegbauer, ‘Wikipedia. Das Rätsel der Kooperation’ (‘Wikipedia: the mystery behind the cooperation’), Wiesbaden: VS, 2009, p. 279 et seq.] compared content on user pages from their original starting date to the present. We noticed a transformation from emancipation to product ideology among those who had reached leadership status, but not for ones less integrated. Typical statements from an user site’s first days would be: ‘Wikipedia is a great idea’; ‘[a] never-ending encyclopedia created by many different authors’; ‘everyone should be able to exchange their knowledge for free’; ‘Wikipedia is like fulfilling a dream—a book in which everyone can write what they want’; ‘the Internet shouldn’t be regarded as a goldmine’; ‘Making information available free of charge is an important task’; ‘the project’s concept is fantastic’; ‘the idea behind Wikipedia is well worth supporting’.
Six out of seven users who changed their ideological statements were core users, and five of these were administrators. Half of them deleted their opinion on emancipation ideology in the same instance they became administrators. In five out of nine cases, they expressed the product ideology, including remarks about ‘unreasonable’ people damaging the project, about endless discussions that should not take place when energy should be invested in the articles instead, and about ‘difficult’ people who are not welcome at Wikipedia. We also found phrasing such as ‘certain level of expertise is necessary for writing the articles’ or that liberal processing is the reason behind low quality contributions.”
-
From pg 5 of et al 2012 (or see popular coverage in eg. Technology Review):
Transition rates of actions of individuals show that positive actions strongly induces positive reactions. Negative behavior on the other hand has a high tendency of being repeated instead of being reciprocated, showing the ‘propulsive’ nature of negative actions. However, if we consider only reactions to negative actions, we find that negative reactions are highly overrepresented. The probability of acting out negative actions is about 10 times higher if a person received a negative action at the previous timestep than if she received a positive action.
…The analysis of binary timeseries of players (good-bad) shows that the behavior of almost all players is ‘good’ almost all the time. Negative actions are balanced to a large extent by good ones. Players with a high fraction of negative actions tend to have a significantly shorter life. This may be due to two reasons: First because they are hunted down by others and give up playing, second because they are unable to maintain a social life and quit the game because of loneliness or frustration. We interpret these findings as empirical evidence for self organization towards reciprocal, good conduct within a human society. Note that the game allows bad behavior in the same way as good behavior but the extent of punishment of bad behavior is freely decided by the players.
It’s worth noting the distinction between ‘reciprocation’ and ‘repeated’; otherwise this phenomenon might have an explanation as a statistical artifact resulting from an ordinary game activity like 1-on-1 fights or duels.
-
I bring up the ‘Lightsaber combat’ article because I did substantial work referencing it before its wiki-deletion; but because it was redirected, the original page history still survives. It is worthwhile comparing the original page with its replacement section in the ‘Lightsaber’ article.
I am chuffed to note that the merge has resulted in inferior references! eg. the Nick Gillard quote in paragraph 2 is unsourced and has a {{fact}} template, but was referenced in the original. Further, that quote is trivially re-referenced (#3 hit in Google). My standards may be too high, but I can’t help but think that it takes real incompetence to not only lose a reference, but be unable to re-find such an easily found quote.
-
The pathos has, at times, moved me to verse. To quote one of mine from WP:HAIKU (a homage to Basho’s famous verse in The Narrow Road to Oku):
Summer AFD -
the sole remnant of many
editors' hard work.
It is not a coincidence that I put that haiku before the final haiku on the page—a haiku commenting on editors who have abandoned or left the project:
The summer grasses.
I edit my user page
One last time - really.
-
One could also avoid compilation and run it much more slowly as cat urls.txt | runghc script.