Also known as "the institutional imperative." Quoting Warren Buffett's 1989 letter to shareholders:[1]
"My most surprising discovery: the overwhelming importance in business of an unseen force that we might call 'the institutional imperative.' In business school, I was given no hint of the imperative's existence and I did not intuitively understand it when I entered the business world. I thought then that decent, intelligent, and experienced managers would automatically make rational business decisions. But I learned over time that isn't so. Instead, rationality frequently wilts when the institutional imperative comes into play.
For example: (1) As if governed by Newton's First Law of Motion, an institution will resist any change in its current direction; (2) Just as work expands to fill available time, corporate projects or acquisitions will materialize to soak up available funds; (3) Any business craving of the leader, however foolish, will be quickly supported by detailed rate-of-return and strategic studies prepared by his troops; and (4) The behavior of peer companies, whether they are expanding, acquiring, setting executive compensation or whatever, will be mindlessly imitated.
Institutional dynamics, not venality or stupidity, set businesses on these courses, which are too often misguided. After making some expensive mistakes because I ignored the power of the imperative, I have tried to organize and manage Berkshire in ways that minimize its influence. Furthermore, Charlie and I have attempted to concentrate our investments in companies that appear alert to the problem."
Jeff Bezos talked about something very similar in his most recent letter to shareholders[0]:
"As companies get larger and more complex, there’s a tendency to manage to proxies. This comes in many shapes and sizes, and it’s dangerous, subtle, and very Day 2.
A common example is process as proxy. Good process serves you so you can serve customers. But if you’re not watchful, the process can become the thing. This can happen very easily in large organizations. The process becomes the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s not that rare to hear a junior leader defend a bad outcome with something like, “Well, we followed the process.” A more experienced leader will use it as an opportunity to investigate and improve the process. The process is not the thing. It’s always worth asking, do we own the process or does the process own us? In a Day 2 company, you might find it’s the second."
I wonder how much of this could be counteracted simply by capping profits, such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.
Sure, such a company would have no chance of succeeding in the stock market, but what if it never plans on IPOing in the first place?
> I wonder how much of this could be counteracted simply by capping profits
Buffet's core complaint is about companies not making enough profits due to their profligate spending on other things. A profit cap would not have the impact you're hoping for here. :)
> such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.
By definition, only profits can be paid out as dividends, so again, a profit cap would prevent the thing you're you're trying to boost.
> By definition, only profits can be paid out as dividends, so again, a profit cap would prevent the thing you're you're trying to boost.
Er, sorry, I rather meant that profits after dividends would be capped. You can do anything you like with the money—other than keep it in the corporate coffers (or in commercial paper or anything else that's still liquid assets on the balance sheet.)
But yes, you're still right, it wouldn't have the correct effect.
How about a sliding window cap on non-compensation spending, together with a sliding window cap on headcount? So, in a year with record 10x net revenue, you would be allowed to 10x your salaries/bonuses/dividends, but you wouldn't be allowed to multiply your capital costs, or "new" labor costs, by more than, say, 1.3x. (And keep the fixed profit-after-dividend cap, because otherwise the corporation would just hold all its money in the bank until the sliding window grew enough to let it spend it.)
I mean, you took a simple rule, got on objection, and made it a lot more complex. That's usually a sign that the underlying issue is not all that simple, and a simple rule will not remedy it.
With all the thousands of corporations that get created and destroyed all the time, it seems like parent's proposed rule could probably work for one or two of them? If not, maybe we could come up with something more substantial than, "this could be hard!"
That guy gave solid advice. OP is welcome to implement OP's rule but adding epicycles isn't going to get meaningful responses. When you start adding epicycles it's worth reinvestigating if your original theory has a meaningful basis.
Profits don't generally have a simple/strict meaning in business terms, AFAIK. Maybe you mean EBITDA? (earnings before interest, taxes, depreciation and amortization) I think even that is maybe not what you mean/wouldn't acheive your goals, even if it's theoretically unambiguous, because of what is invested in R&D/aquisitions/etc.
Also, presumably you're talking about new laws in the US here? What stops businesses from just moving their HQ overseas to places with different laws they like better?
All the ways organisations waste money Buffet describes would be categorised as expenses or investments, so occur before profit is even calculated. Setting arbitrary goals independently of the cost structure of the business or its competitive environment is like setting a flight plan for a plane using just a map and then flying it blind without instruments or a weather report.
Profits are revenue minus expenses. In a company ruled by the model Buffet describes, "[c]orporate projects or acquisitions will materialize to soak up available funds" means that expensive boondoggles shall arise to stop any pesky excess profits from appearing in the first place.
> I wonder how much of this could be counteracted simply by capping profits, such that all net revenue in excess of some fixed amount is forced by the corporate charter to be turned into either employee compensation or stockholder dividends.
That seems likely to cause the exact _opposite_ effect of what you intend. Apple is sitting on a pretty big cash pile right now. If the only legal options they had were to invest it or return it as dividends, the original article's premise is that they would find ways to invest it (buying factories in China, bringing app development in-house, etc) rather than risk the shrinking of the bureaucracy.
The problem seems doubly bad with Wikipedia and other non-profits. The problem isn't that they have too much profit. It's that they are growing their expenses to match income (rather than capping expenses at some "rational" level and trying to maintain enough income above it). In theory, in for-profit companies shareholders will start to complain if expenses increase too high -- it's not clear who would make the same complaint at Wikipedia (other than the original article).
To achieve what you're describing you should cap market capitalization. No point in trying to grow past a point if you'll have to just give away all that growth in the form of dividends.
> To achieve what you're describing you should cap market capitalization.
That's not really it either. If you have a business which is generating high profits, it will have a high market cap. You can reduce it by transferring the corporation's liquid assets to shareholders, but if you've transferred the liquid assets and you're still above the limit, transferring non-liquid assets to shareholders is clearly the wrong direction.
What you really want is for separate businesses to be separate companies. Conglomerates are inefficient. Don't expand into new markets, just give the shareholders their money. The shareholders can invest it in the new markets if they want to.
The main impediment to this is the tax code, because if you give the shareholders the money it gets taxed -- twice -- but if you spend it internally or leave it inside the corporation in an offshore subsidiary, that doesn't happen.
Wikipedia gets to the same place via a different route. A non-profit doesn't have shareholders to pay dividends to, so instead of paying dividends being discouraged by the tax code, it just isn't possible, and you get the same results. The website generates more revenue than it actually needs to operate and the rest of the money has to go somewhere, so it goes somewhere inefficient as determined by internal politics.
It's interesting that you say that on a subthread discussing something Warren Buffet said. Berkshire Hathaway became an efficient conglomerate by acquiring and operating companies that eschewed the very principle we're discussing.
Relative to other public companies, it basically would have have zero money to re-invest into growth. So there'd be no expectation of returns on investment—no reason to buy the stock.
Dividends are nice, but dividends would stay small if the company itself stays small (because it's constrained from growing.)
My point was that all publicly-traded companies are. Speculation on corporate growth's geometric effects on corporate long-term profitability is what the stock market is.
Look at utilities and regional banks, which are usually priced based on return on assets or operating margins.
Growth oriented companies have advantages and disadvantages. Microsoft is a great example -- they have a business that's a monopoly cash cow, but they also need to hit high growth rates to prosper. They squandered a decade trying to both grow and milk the cow, and are now growing again, while breaking and eventually losing parts of the legacy business.
I think you've spent too much time focusing on tech companies. Many companies have a growth ceiling (utilities, etc) that investors are happy with as long as it pays a dividend.
That is the whole idea here, though. That Wikipedia's money, which is being "reinvested in growth" is not having an impact on the mission. Stopping businesses from reinvesting in growth -- at least, stopping them doing it the traditional way -- is the point that Buffett is trying to make.
(I'm not equipped to evaluate whether or not that point is true, especially in the case of any specific company. Just trying to point out that the argument you're making here kind of begs the question.)
When you interfere with price communication and incentivization mechanisms, you always break things.
Trying to come up with piecemeal regulatory "solutions" like this is like trying to invent a perpetual motion machine by coming up with increasingly more complicated contraptions. There are fundamental, local interactions that you have to contend with that basically rule out accomplishing what you want.
Your comment brings me joy. Society would benefit from inventing better ways of communicating the properties of large social/economic systems, so that this misconception - that top down economic planning, through cookie cutter mandates, can improve social welfare - becomes less common.
Everything from the growth of the welfare state, to populist revolutions to put socialists into power, can be blamed on how common this misconception is.
Wow, this really hits the nail on the head in regards to the small company I work for. Expanded rapidly from 30-40 to 120 in two years. The higher-ups have failed to address some core issues (with some refused on principle, such as an aversion to stronger organizational structure due to not wanting to seem "corporate") while still pursuing pet projects and spending freely (of money and others' time). Communication is also a massive issue - half the company (within the current structure) knows what's going on, the other half have to either do their own investigative work or wait. Would fine with me if you're a much larger company but at 120 or so it seems pretty silly and hypocritical to their "anti-corporate" ethos. Will be interesting when/if we hit Dunbar's number.
I don't remember the source but I read about there being several inflection points in terms of team size where communication overhead changes suddenly...IIRC the first one is 8: beyond this you start needing to have 'managers' (their title may be different), i.e. people responsible for organization & team communication rather than directly making the product/sales/etc.
Anecdotally I noticed that even with 5 people you usually have someone starting to act as adhoc part-time manager, and this role quickly become too much for a part time responsibility as the team grows.
Agreed. It's for this reason that we have to be as free as possible to create new institutions. All institutions, no matter how well-intentioned and self-aware their founders, succumb to these forces. The only way to prevent perverse misallocation of effort and resources is to allow for a steady turnover of institutions.
I think this problem derives from the nature of and incentives of people. What happens is that certain managers (and founders) have derived their success from pushing certain directions that have added a lot of value.
Usually long term initiative take a lot of domain knowledge and a specific set of connections. When something new becomes more important, they risk losing their personal position since they may not be the best person in terms of knowledge and connection to carry out the new direction. Instead they try to use their accumulated power to keep their direction supported even when it's not in the interest of the business for as long as they can.
To address this, perhaps there's merit in rewarding timely exits and somehow punishing people that have dragged things out. However the most that companies can do is usually just fire someone. And at that point they have probably sucked the company dry of the value they can personally extract.
Not the op, but I think another way of thinking about the problem is related to Tragedy of the Commons[0]. In general, it's an issue that arises out of the different goals of the various actors, which can be orthogonal or counter to the goals of the group. One way of thinking about it intuitively is to imagine multiple painters all trying to create their own artwork on the same canvas, leading to a very messy result. The solution is for the actors to have aligned goals and execute as a team. Achieving that seems to be an interesting and unsolved problem.
Elinor Ostrom received a Nobel Prize for researching into how societies have develop structures to manage commons sustainably. Unfortunately her conclusion was there is no default solution.
I've been thinking about this problem for a long time, but have come to realize that keeping groups small has its own major shortcoming. Namely, it limits the ability of the group to achieve greater things. The larger the group, the more influence it has and generally the more resources it has too.
There are also plenty of counter examples in corporations and governments throughout history where the size of the organization has not affected its ability to achieve its goals or compete with smaller organizations. Therefore, I think the issue and solution lies elsewhere. Somewhere between accountability, culture and trust.
Given the force-multiplying factors of technology, more and more can be accomplished by fewer and fewer people. IBM > Google > Facebook > Instagram
That's not to say I disagree with you completely: large corporations can achieve things small ones cannot (especially as you move out of software) but I've seen plenty of talk around accountability, culture and trust, and very few results.
I agree that more may be accomplished with less in certain circumstances aided by technology. One could go as far as to say one day management and governance may be taken over by machines (fun thought experiment: ponder the ramifications for democracy when machines are capable of making better decisions than humans). I think we're a ways away from that still, but the future is exciting.
Sure there's plenty of talk around accountability and proper management without much substance and I don't claim I have any specific solutions in mind to the problem. I think keeping organizations small (i.e. tribalism) is one hack that may help, but comes with different considerations as I mentioned.
Is this a response suggesting that anarchy is the only way to defeat institutional inertia? Or a link to something that actually addresses the subject constructively and realistically?
I ask because it's a large work, but I'd read it if it was the latter.
I was also curious; here's what Wikipedia says about Kropotkin's "The Conquest of Bread":
> "In this work, Kropotkin points out what he considers to be the defects of the economic systems of feudalism and capitalism, and how he believes they thrive on and maintain poverty and scarcity, as symbol for richness and in spite of being in a time of abundance thanks to technology, while promoting privilege. He goes on to propose a more decentralised economic system based on mutual aid and voluntary cooperation, asserting that the tendencies for this kind of organisation already exist, both in evolution and in human society. He also talks about details of revolution and expropriation in order not to end in a reactionary way."
Of course there's many ways to solve any problem, but GP asked for a resource, and I gave one. Anarchy has been used realistically, and the book was written to be a realistic approach organizing people.
Thanks for the link. He says something else very relevant to OP's point, too:
"We face another obstacle: In a finite world, high growth
rates must self-destruct. If the base from which the growth is
taking place is tiny, this law may not operate for a time. But
when the base balloons, the party ends: A high growth rate
eventually forges its own anchor."
Even in that case it can become very difficult to follow.
For example, there is - in my opinion - a low chance that the Gates Foundation will expend its resources within 20 years post the death of the last of the two.
They are likely to have something total equivalent to (in present dollars) perhaps $250-$300 billion to get rid of in the next 40-50 years. I'd be skeptical they can get rid of it that fast in their model. They're eroding that mass of capital (presently near $190b) so relatively slowly that by the time Gates is 80, they'll probably still be dealing with $200 billion in today's dollar.
A better more original source of pithy organizational rules would be Robert Conquests three laws of politics
Everyone is conservative about what he knows best.
Any organization not explicitly right-wing sooner or later becomes left-wing.
The simplest way to explain the behavior of any bureaucratic organization is to assume that it is controlled by a cabal of its enemies.
If you transform Conquest's rules from politics to running (ruining?) the wikipedia, the transformed rules are
"We can change nothing not even our exponential growth spiral, not our policies, nothing"
"LOL We're not doing the fiscal conservatism thing. I like how the current top discussion is about popularity and the need for a circular firing squad, not something financial. A direct quote of an attempt to avoid working on the issues "I'm reminded of the inflammatory, low-rent campaign of Donald Trump." Yeah buddy that'll fix it, that'll fix it real good."
"We're headed off the financial cliff now get out of the way I'm going gas pedal to the floor as you can see in the exponential graphs. The problem is we're a CRUD app and that's cutting edge CS just like quantum computing so naturally there's no possibility of criticism there. After all, the Egyptians didn't have flush toilets and they built a pyramid, so any criticism of the toilet in my bathroom is either making fun of the entire Egyptian culture or pretending the pyramids were not a logistical challenge."
There is some humor in that the world of paper encyclopedia publishing ran on mostly capitalist operational principles for decades, centuries. It turns out that running an online encyclopedia off donations and extreme hand waving is powerful enough to destroy an industry on its way to its inevitable collapse. Maybe someday in the future we'll have encyclopedias again, but the era after wikipedia and before the next encyclopedia will be a bit of a dark age. That's too bad.
I honestly cannot make any sense of your "translations" of these "laws" (which sound more like assertions to me).
Also, what scenario are you talking about when yo usay "after wikipedia"? There's plenty of copies of it on the internet, so the data won't suddently vanish, and wikis don't suddenly stop existing if the Wikipedia foundation implodes.
Pretty sure this article could have been called "Wikipedia's Costs Growing Unsustainably" instead of the clickbait headline.
But overall this oped is misplaced. Running the leanest possible operation shouldn't be Wikipedia's focus at this stage in its lifecycle, it's improving the quality of its content.
Back in 2005 Wikipedia had 438k articles and the focus was expanding the reach of its content to cover all topics; today the article count is 5.4 million it's quality that matters more. You can't improve quality just based on crowd-sourcing alone (see: Yelp, Reddit, etc), and the bigger it's gotten the more of a target it's become by disinformation activists.
This attitude on budgets over value strikes me as a classic engineer's POV. The OP is nostalgic about a time when the site was run by a single guy in his basement, but could 1 guy handle the assault of an army of political zealots or Russian hackers? DDoS attacks? Fundraising? Wikipedia is arguably one of the most coveted truth sources the world over, protecting and improving its content is more important than an optimal cost-to-profit ratio.
If the OP has specifics, by all means, share them, but this kind of generalized fearmongering about budgets isn't spectacularly useful, IMHO.
> Pretty sure this article could have been called "Wikipedia's Costs Growing Unsustainably" instead of the clickbait headline.
That's not what I got at all. And that's why the article is interesting.
Wikipedia's funding is what's growing unsustainably. It's higher funding that's pushing the costs higher. And that's what makes it interesting (and only a little click-baity.)
It seems, having taken people's money for a charity, you have a moral obligation to spend the money on the charity, whether it needs it or not. And as a manager of said charity, it's very easy to believe (or to convince yourself) it needs the money. Or otherwise why were we making plaintive pleas for money?
(And that happens in a world of good intentions. When fundraisers become cynical, you end up with the US political outrage machine, which operates simply to raise money rather than to effect political change....)
From the OP: "After we burn through our reserves, it seems likely that the next step for the WMF will be going into debt to support continued runaway spending, followed by bankruptcy."
If it was just about wasting donations, they'd never go into debt. It's costs, and specifically costs-to-income ratio he seems perturbed about.
In most non-profit organizations, costs never go down. Once a budget is set for programs, staffing, etc, they don't just go away and in fact, they often continue to grow without bound. See any government budget ever for evidence.
So if donations don't continue to grow to match or at least keep pace, they could start running a deficit to eat away those reserves in no time. And once a non-profit organization starts running at a deficit, some contributors will question their contributions and they may shrink accordingly.
Those reserves could disappear in just a few years.. unless there's a change, two years should show the direction and another couple years, the course will be set one way or another.
> And once a non-profit organization starts running at a deficit, some contributors will question their contributions and they may shrink accordingly.
That's backwards though. It's not unusual for a non-profit to run a deficit. A donor will question a non-profit that's running a surplus - why am I giving you money you don't need.
> If it was just about wasting donations, they'd never go into debt.
But there's no debt. The whole argument is pure conjecture based on imagination. Using it as if it were a fact that proves something makes little sense.
While it's a good warning to WMF, it's not currently a problem. With longevity, they are in for some large endowments. I think those giving a certain amount (?) should have a vote on various directions of the org, like shareholders but, for the common good that supports their mission statements, instead of shareholder value.
> While it's a good warning to WMF, it's not currently a problem.
I believe the author's thesis is that by the time "it's not currently a problem" is no longer an argument that makes sense, it will also no longer be possible to effectively correct the WMF's course in a way that will solve it.
I'm not sure I have any idea how to effectively determine if the author is correct about that, but certainly I don't think "it's not currently a problem" actually contradicts anything he's saying.
Funding will not grow forever, neither would expense. There's no danger it would consume whole world's GDP and would require us to acquire an intergalactic loan from Arcturian Galactic Bank. https://www.xkcd.com/605/
I agree with the thrust of your comment, but the bit about "clickbait" is misplaced.
Is is clearly a poetic metaphor - no-one clicking on the article seriously believed that Wikipedia has a literal biological cancer (the "cancer" metaphor is hardly a new one, if any criticism can be made here it would be that it is almost verging on cliche). Indeed the entire article is structured around this metaphor - the title is hardly false advertising.
I disagree strongly with this new obsession that every article title and headline must be written as pure "Man Bites Dog" factual summary, particularly for opinion pieces like this. Surely there is room for some attempt at poetic flair.
(A hypothetical example of a real "clickbait" style headline for this article might be "Google Will Buy Wikipedia").
> could have been called "Wikipedia's Costs Growing Unsustainably"
The table in the article suggests it's growing sustainably, as assets are increasing and revenues exceed expenses. The whole unsustainability hypothesis seems to be based on one metaphor "if it's growing, then it's cancer, ergo it is deadly, ergo it has to be stopped".
I agree with your post. The OP is a bit fearmongering, however there is some general issues that are definitely true of Wikipedia, but because it seems to be doing well overall (in terms of traffic and ubiquity), nothing will change even if "it has cancer." If what the OP describes as "cancer" is how the top 5 website in the world operates and stays there, then so be it I say.
The thing is that there doesn't really seem to be any REAL alternatives challenging Wikipedia in an honest, high effort way. As a wiki/knowledge fan myself, I've gone to different sites with different takes such as Quora (which is fantastic, can't really say it's a real competitor though), Genius.com (which is comparable only in a very narrow sense for songs/texts and nothing else), and Everipedia (which is the closest thing to a real competitor with all Wikipedia content imported, but is tiny in comparison to the last 2 sites above - Alexa 6k US vs Alexa top 100 for the other 2).
I would say out of everything I've found Everipedia comes closest in a valiant effort and I frequently contribute to it here and there, but at the same time, Wikipedia is just too dominant to see any real necessity to change how it is doing anything, whether that is for good or for bad. And my personal opinion is that maybe that is how it should stay too, given the size and scale that Wikipedia operates and its general continued success across most of its fronts. One thing is for sure: the world is definitely better with Wikipedia continuing onward even if "it has cancer."
I use to manage the IT needs of a cartography lab in a uni. one day i was curious to how much web traffic we got off the old website so I checked the IIS logs and saw we were getting over 200k hits a year. i was like what the?
so I enabled google analytics to see where they were coming from. the source was one of our sub pages on redlining was the 5th external link listed in wikipedia on that topic.
Thank you for this rebuttal. Wikipedia is no longer the small platform it was. It's international, has to expand to developing markets, has multiple sister projects like Wikidata, and tools like VisualEditor that require developers. Sure you can probably rein in costs if you don't believe in any sort of expansion, software improvement, or outreach programs. Lastly, a lot of the claims in the op-ed are simply unfounded like the statement that the Foundation isn't transparent enough, that it's developers are idiots, or that Wikipedia isn't sustainable with its reserves. This just seems like him being overzealous with his consulting experience "rescuing engineering projects that have gone seriously wrong", just as every week a designer will "fix Wikipedia's design".
This. It is worth it for people to figure out how to preserve truth and verifiability. Perhaps community guidelines is the state of the art, but how can software help prevent it from being gamed?
IMHO if Wikipedia loses that, it costs dearly not just to Wikipedia but to society.
I was very actively involved in MediaWiki development & Wikimedia ops (less so though) in 2004-2006 back when IIRC there were just 1-4 paid Wikimedia employees.
It was a very different time, and the whole thing was run much more like a typical open source project.
I think the whole project has gone in completely the wrong direction since then. Wikipedia itself is still awesome, but what's not awesome is that the typical reader / contributor experience is pretty much the same as it was in 2005.
Moreover, because of the growing number of employees & need for revenue the foundation's main goal is still to host a centrally maintained site that must get your pageviews & donations directly.
The goal of Wikipedia should be to spread the content as far & wide as possible, the way OpenStreetMap operates is a much better model. Success should be measured as a function of how likely any given person is to see factually accurate content sourced from Wikipedia, and it shouldn't matter if they're viewing it on some third party site.
Instead it's still run as one massive monolithic website, and it's still hard to get any sort of machine readable data out of it. This IMO should have been the main focus of Wikimedia's development efforts.
> the foundation's main goal is still to host a centrally maintained site
Wikimedia universe is way bigger than one site. There's Wikidata, Commons, Wikisource, Wiktionary, Wikivoyage, Wikibooks and so on. And there's a lot of language versions too - English is not the only way to store knowledge, you know.
> The goal of Wikipedia should be to spread the content as far & wide as possible
The requires a) creating the content and b) presenting the content in the form consumable by the users. Creating tools for this is far from trivial, especially if you want it to be consumable and not just unpalatable infodump accessible only to the most determined.
> Instead it's still run as one massive monolithic website
This is not accurate. A lot has changed since 2004. It's not one monolithic website; it's a constellation of dozens, if not hundreds, of communities. They use common technical infrastructure (datacenters, operations, etc.) and common software (MediaWiki, plus a myriad of extensions for specific projects), but they are separate sites and separate communities, united by the common goal of making knowledge available to everyone.
> it's still hard to get any sort of machine readable data out of it
Please check out Wikidata. This is literally the goal of the project. You can also be interested in "structured Commons" and "structured Wiktionary" projects, both in active development as we speak.
> This IMO should have been the main focus of Wikimedia's development efforts.
It is. One of several focuses - for a project of this size, there are always multiple directions. BTW, right now Wikimedia is in the process of formulating a movement strategy, with active community participation. You are welcome to contribute: https://meta.wikimedia.org/wiki/Strategy/Wikimedia_movement/...
Disclosure: working for WMF, not speaking for anybody but myself.
> The goal of Wikipedia should be to spread the content as far & wide as possible
>> That requires a) creating the content and b) presenting the content in a form consumable by users. Creating tools for this is far from trivial, especially if you want it to be consumable and not just an unpalatable infodump accessible only to the most determined.
Yes, and as emphasized in the article, WMF has done a terrible job at building better tools. For crying out loud, we are still typing in by hand the complete bibliographic information for each cited reference.
Your other comments are similar. The fact that "WMF is trying", or has a named task force whose formal mission includes a complaint, is not enough to justify years of high spending.
> Yes, and as emphasized in the article, WMF has done a terrible job at building better tools.
I respectfully disagree. I think WMF has done a pretty good job. Could it be better? Of course; everything could. Is it "terrible"? Not even close.
> For crying out loud, we are still typing in by hand the complete bibliographic information for each cited reference.
https://www.mediawiki.org/wiki/Citoid ?
In any case "it misses my pet feature" and "the whole multi-year effort is terrible" are not exactly the same thing.
> is not enough to justify years of high spending.
I think the work that has been done and is being done justifies it. All this work is publicly documented. If you think it's too much and have ideas about how to do it better, you're welcome to comment. I cannot argue with your value judgements: you may feel some projects are more valuable than others and should have been done first, and you are entitled to that opinion. There's a process that gets some things done and leaves some things out, and by nature not everybody will be satisfied. I only want to correct completely factually false claims in the op-ed, and I believe I have done so. If I can help with more information, you are welcome to ask. As for value judgements, I think we'd have to agree to disagree here.
> The issues with the Wikipedia editing UI are legion
Any existing UI can be analyzed to find a legion of issues; no UI is ever perfect, especially over time and with changing requirements. The Wikipedia UI is certainly not perfect, and much work is to be done (and is being done), but I would stop very far short of calling the work that was already done "terrible".
> Clean house. Put the people who built Zotero in charge.
Err, I am having a hard time making sense of this advice. Why exactly should the people who built reference management software be running the Wikimedia Foundation?
> I would stop very far short of calling the work that was already done "terrible".
You already declared you weren't going to debate me on this point, so I don't know why you're bringing it up again, especially since you're not saying anything substantive.
> why exactly people who built a reference management software must be running Wikimedia Foundation?
Because they are a philanthropically funded non-profit who build great academic/research software on a small budget while responding rapidly to user feedback.
If your objections center around the fact that WMF does a lot more than develop Wikipedia software, then you are missing the whole point of this thread: that WMF's primary contribution is Wikipedia, and almost everything else is secondary. So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus. Yes, that means the people running Wikipedia conferences and local meetups will have less power.
> If your objections center around the fact that WMF does a lot more than develop Wikipedia software
WMF does a lot more than develop one piece of software to manage citations, yes. Nothing wrong with the software; I'm sure the people who made it are awesome. But it's like discovering that the US federal government didn't fix a faulty light on your street and proposing that the electrician who did should thus be President of the USA. Nothing wrong with the electrician or fixing the light, and maybe he'd even be a great President, but that in no way follows from his ability to fix the light. They're just completely unrelated things.
> that WMF's primary contribution is Wikipedia, and almost everything else is secondary
That hasn't been so for some time. Also, Wikipedia as a project is way bigger than just software.
> So long as it's being funded by private citizens because of the value they get from Wikipedia, then this should be the focus.
It is. I mean the value and improving it (again, if we correct "Wikipedia" to "wiki sites to gather and disseminate knowledge", which are more than just Wikipedia). But opinions on how to improve that value may not be limited to "improve this one particular feature".
> Yes, that means the people running Wikipedia conferences and local meetups will have less power.
Than who? And why? There are processes that decide which directions are prioritized and which are not. Right now two of them are happening as we speak - board elections and strategy consultation. Any decision that happens leaves somebody unsatisfied, because it's not possible to satisfy everyone. That doesn't mean everything is terrible, sorry.
"One massive monolithic website" is, I think, meant to be read as referring to the WMF sites having a shared telecommunications single point of failure: a "choke point" where a given piece of information can only get from a given WMF site to a user by travelling through WMF-managed Internet infrastructure.
Remember Napster, back in the day? It was able to be shut down because it had an SPOF: Napster-the-corporation owned and maintained all the "supernodes" that formed the backbone of the network.
Or consider the Great Firewall of China. If the Great Firewall can block your site/content entirely with a single rule, you have an SPOF.
The answer to such problems isn't simple sharding-by-content-type into "communities" like you're talking about; this is still centralized, in the sense of "centralized allocation."
Instead, to answer such problems, you need true distribution. This can take the form of protocols allowing wiki articles to be accessed and edited in a peer-to-peer fashion with no focal point that can be blocked; it can take the form of Wikipedia "apps" that are offline-first, such that you can "bring Wikipedia with you" to places where state actors would rather you didn't have it; or it can take the form of preloaded "Wikipedia mirror in a box" appliances (plus a syncing logistics solution, a la AWS Snowball) which local libraries in countries with little internet access could use to give people access to Wikipedia.
> WMF sites being a thing that have a shared telecommunications Single Point of Failure
In fact, one of the long-term projects at WMF is making sure the infrastructure is resistant to single-point-of-failure problems, up to a whole data center going down. We are pretty close to it (not sure if 100%, but if not, close to it). Of course, if you consider the existence of WMF itself to be a point of failure, that's another question; by that logic, the existence of Wikipedia can be treated as a single point of failure too. Anybody is welcome to create a new Wikipedia, but that's certainly not a point of criticism towards WMF.
> It was able to be shut down because it had an SPOF: Napster-the-corporation owned and maintained all the "supernodes"
WMF does not own the content or the code; both are openly accessible and extensively mirrored. WMF does own the hardware. I don't think there's a way to do anything about that, unless somebody wants to donate a data center :)
> If the Great Firewall can block your site/content entirely with a single rule, you have an SPOF.
> Instead, to answer such problems, you need true distribution.
I am skeptical about the possibility of making a community work using "true distribution". Even though we have good means to distribute hardware and software, be it production or development code, we still do not have any way to build a community without gathering points. I won't say it is impossible; I'd say I have yet to see anybody do it.
But if somebody wants to try, all power to them. You can read more about Wikimedia discussions on the topic here: https://strategy.wikimedia.org/wiki/Proposal:Distributed_Wik...
> this can take the form of preloaded "Wikipedia mirror in a box" appliances
We are pretty close to this - you can install a working MediaWiki setup very quickly (Vagrant, or I think there are some other containers too; I use Vagrant), and the dumps are there. It won't be a 100% copy of the real site, since there are some complex operational structures that ensure caching, high availability, etc., which are kinda hard to put into a box. They are public (mostly as Puppet recipes), but implementing them is not an out-of-the-box experience. Still, you can make a simple mirror with relatively low effort (probably several hours, excluding the time to actually load the data; that depends on how beefy your hardware is :)
Most of this, btw, is made possible by the work of WMF Engineers :)
> We are pretty close to this ... [things you'd expect ops staff to do]
That doesn't come close at all from the perspective of a librarian who wants a "copy of Wikipedia" for their library, no? It assumes a ton of IT knowledge, starting from the point where you need to combine software with hardware with database dumps.
The average library staff who'd want to set this up in some African village would be less on the side of the knowledge spectrum of "knows what to do with a VM image", and more toward the side of "can plug in and go through the configuration wizard for a NAS/router/streaming box."
Once I can tell such a person to buy some little box with a 4TB hard disk inside it, that you plug in, go to the URL printed on the top, and there Wikipedia is—and then it can keep itself up to date, with a combination of "large patches that get mailed on USB sticks that you plug in, wait, and then drop back into the mail", and critical quick updates to text content for WikiNews et al that it can manage to do using a 20kbps line that's only on for two hours per day—then you'll have something.
I presume you have tried Kiwix? For less than $100, you can install the full Wikipedia (with reduced-size graphics) on a cheap Android tablet with a 64GB card. The installation the first time is a little clumsy, but the experience once it's local is solid: http://www.kiwix.org/downloads/.
I don't think "critical updates" are really that necessary. Swapping SD cards a couple times a year would solve most of it. I think it's pretty incredible (and useful) to be able to have access to all that information for such a low cost, even if it's a few months (or even years) out of date.
> That doesn't come close at all, from the perspective of a librarian who wants a "copy of Wikipedia" for their library, no?
Depends what you mean by copy. If it's just a static data source, any offline project would do. If it has to update, it's trickier, but some offline projects do that too. If you want to run a full clone of the Web's fifth most popular website, yes, that requires some effort. Sorry, no magic here :)
> "can plug in and go through the configuration wizard for a NAS/router/streaming box."
There are boxes that are integrated with one or another of the offline projects. There's also Wikipedia Zero, which, in a world where mobile coverage is becoming more and more widespread even in poor regions, may be an even better alternative.
That gives you Wikitext encapsulated in XML. How do you get at the content of the Wikitext?
I work on a Wikitext parser [1]. So do many other people, in different ways. Wikitext syntax is horrible and it mixes content and presentation indiscriminately (for example, it contains most of HTML as a sub-syntax).
The problem is basically unsolvable, as the result of parsing a Wiki page is defined only by a complete implementation of MediaWiki (with all its recursively-evaluated template pages, PHP code, and Lua code), but if you run that whole stack what you get in the end is HTML -- just the presentation, not the content you presumably wanted.
So people solve various pieces of the problem instead, creating approximate parsers that oversimplify various situations to meet their needs.
One of these solutions is DBPedia [2], but if you use DBPedia you have to watch out for the parts that are false or misleading due to parse errors.
avar: "The goal of Wikipedia should be to spread the content as far & wide as possible, the way OpenStreetMap operates is a better model."
I am confused.
Doesn't OSM data come encapsulated in XML or some binary format?
As for dispersion of content, I could have sworn I have seen Wikipedia content on non-Wikipedia websites. Is there some restriction that prohibits this?
I have seen Wikipedia data offered in DNS TXT records as well.
For each article there is some metadata, but the entire text of an article is just a blob inside one XML element.
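Concretely, pulling those blobs out of a dump is the easy part; interpreting the wikitext inside them is where the trouble starts. A minimal streaming sketch (the tag names follow the standard MediaWiki export schema; `iter_pages` and `_local` are my own illustrative names):

```python
# Stream (title, wikitext) pairs out of a pages-articles XML dump
# without loading the whole multi-gigabyte file into memory.
import xml.etree.ElementTree as ET

def _local(tag):
    """Strip the '{namespace-uri}' prefix ElementTree adds to tags."""
    return tag.rsplit("}", 1)[-1]

def iter_pages(path):
    """Yield (title, wikitext) for each <page> element in the dump."""
    title, text = None, None
    for _event, elem in ET.iterparse(path, events=("end",)):
        tag = _local(elem.tag)
        if tag == "title":
            title = elem.text
        elif tag == "text":          # the wikitext blob inside <revision>
            text = elem.text or ""
        elif tag == "page":
            yield title, text
            elem.clear()             # free the processed subtree
```

Even with this, what you get per page is still one opaque wikitext string, which is exactly the point being made above.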
For anyone who has not worked with the Wikipedia data dumps extensively before, trust us that it is not easily machine-readable and that even solutions like DBPedia / Wikidata are not yet suitable for many purposes.
As someone who contributes to many knowledge projects, including Wikipedia and Wikidata frequently, I'm curious about what you mean that Wikidata is not yet suitable for any purposes. Am I wasting my time contributing to it? I thought that it was helping a lot of machines understand data. Can you please explain further?
Please reread, for many purposes! I love Wikipedia.
The wiki markup is extremely complicated and, being user created, it is also inconsistent and error prone. I believe the MediaWiki parser itself is something like a single 5000-line PHP function! None of the alternate parsers I've tried is perfect. There is a ton of information encoded in the semi-structured markup, but it's still not easy to turn that into actual structured data. That's where the problem lies.
Would there be some particular structure that everyone would agree on?
Alternatively, what is the desired structure you want?
Because the current format is so messy, I just focus on what I believe is most important: titles and externallinks. IMO, often the most interesting content in an article is lifted from content found via the external links. I also would like to capture the talk pages. Maybe just the contributing usernames and IP addresses.
Opinions or explanations that have no supporting reference are inexpensive. One can always find these for free on the web. No problem recruiting "contributors" for that sort of "content".
Back to the question: I am curious what structure would you envision would be best for Wikipedia data? Assume hypothetically that a "perfect" parser has been written for you to do the transformation.
The structure I need for my particular project (ConceptNet) is:
* The definitions from each Wiktionary entry.
* The links between those definitions, whether they are explicit templated links in a Translation or Etymology section, or vaguer links such as words in double-brackets in the definition. (These links carry a lot of information, and they're why I started my own parser instead of using DBnary.)
* The relations being conveyed by those links. (Synonyms? Hypernyms? Part of the definition of the word?)
* The links should clarify the language of the word they are linking to. (This takes some heuristics and some guessing so far, because Wiktionary pages define every word in any language that uses the same string of letters, and often the link target doesn't specify the language.)
* The languages involved should be identified by BCP 47 language codes, not by their names, because names are ambiguous. (Every Wiktionary but the English one is good at this.)
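For readers unfamiliar with this kind of extraction, the double-bracket link case above can be sketched with a simple regex. This is a deliberate oversimplification (real wikitext nests templates and links in ways a regex can't fully handle, which is part of the point); `extract_links` and the pattern are my own illustration, not ConceptNet's actual parser:

```python
import re

# Matches [[target]] and [[target|label]]. Targets with a section
# anchor (#) or pipe are truncated at that character. Nested templates,
# language-section context, and link-target language disambiguation --
# the genuinely hard parts described above -- are not handled here.
LINK_RE = re.compile(r"\[\[([^\]|#]+)(?:\|[^\]]*)?\]\]")

def extract_links(definition_wikitext):
    """Return the link targets found in one definition line."""
    return [m.group(1).strip() for m in LINK_RE.finditer(definition_wikitext)]
```

Determining which language each target belongs to, and what relation the link conveys, is where the heuristics and guessing come in.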
There are probably analogous relations to be extracted from Wikipedia, but it seems like an even bigger task whose fruit is higher-hanging.
Don't get me wrong: Wiktionary is an amazing, world-changing source of multilingual knowledge. Wiktionary plus Games With A Purpose are most of the reason why ConceptNet works so well and is mopping the floor with word2vec. And that's why I'm so desperate to get at what the knowledge is.
I don't think you are using this in the way it was meant to be used. Wikipedia is a user edited, human centered project. Humans are error prone and that's something that you are going to have to live with if you want to re-purpose the data.
The burden of repurposing falls on you, and Wikipedia makes the exact same data that they have at their disposal available to you. To expect it in a more structured format that is usable by you and your project, but that goes beyond what Wikipedia needs in order to function, is asking a bit much, I think.
They make the dumps available, they make the parser they use available, what more could you reasonably ask for that does not exceed the intended use case for Wikipedia?
Afaics, any work they do that increases the burden on Wikipedia contributors just to make your life easier would be too much.
But since you are already so far along with this and you have your parser, what you could do is re-release your own intermediate-format dumps that would make the lives of other researchers easier.
> believe the MediaWiki parser itself is something like a single 5000 line PHP function!
It's not. I'm on mobile so it's not easy to link, but the PHP version of the parser is nothing like a single function. There is also a Node.js version of the parser under active development with the goal of replacing the PHP parser.
The GP said Wikidata isn't suitable for many purposes, different from any.
It's a nice agreed-upon vocabulary for linked data. But you still need the data that the vocabulary refers to. The information you can get without ever leaving the Wikidata representation is still too sparse.
Thanks for working on that! Didn't know it was so bad. The following is a possibly stupid idea, but I'd like to hear your thoughts:
What if you just rendered the content into HTML, "screen scraped" the text, and then converted it into a more useful format (Markdown, JSON, etc.)? Is that plausible?
That would allow a basic UI change on Wikipedia to break your code. Sometimes it is necessary, but not usually the best option in my experience, and it's pretty annoying to do.
You can get what amounts to an HTML dump (which is then indexed and compressed in a single huge archive) from Kiwix. Although they do them basically twice a year or so.
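For what it's worth, the screen-scraping route can lean on the MediaWiki API rather than the rendered page UI, which makes it somewhat less fragile. A rough sketch: the endpoint and the `action=parse` parameters are the standard MediaWiki API, but the `TextExtractor` helper is my own illustration, and as noted above you still end up with presentation, not structured content:

```python
from html.parser import HTMLParser
import json, urllib.parse, urllib.request

API = "https://en.wikipedia.org/w/api.php"

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts, self._skip = [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def html_to_text(html):
    p = TextExtractor()
    p.feed(html)
    return "".join(p.parts)

def fetch_rendered(title):
    """Fetch an article's rendered HTML via the action=parse API."""
    qs = urllib.parse.urlencode({
        "action": "parse", "page": title,
        "prop": "text", "format": "json", "formatversion": "2",
    })
    with urllib.request.urlopen(f"{API}?{qs}") as r:
        return json.load(r)["parse"]["text"]
```

The catch, as the parent comment says, is that flattening to text discards exactly the structure (infoboxes, link semantics, templates) that most reusers actually want.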
Go click through those links. Most of them are hardly maintained. E.g. the last static HTML dump was in 2008. The current enwiki raw data dump is in progress and reads:
There are real logistical challenges in making these dumps and making them _useful_. For all Wikimedia's spending, they have not invested sufficiently in this area.
Years back Wikipedia released HTML dumps of the entire site, which was closer to providing the actual content of Wikipedia as structured data, but that was discontinued.
Random thought: why can't something like Wikipedia be run distributed through a blockchain? Edits are just transactions broadcast over the network. I imagine the total cost of that to individual node operators would be less than the millions they're paying right now.
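For what it's worth, the append-only-log half of that idea is easy to sketch; consensus, replication, and spam control are the genuinely hard parts, and this toy deliberately omits them (all names here are illustrative):

```python
import hashlib, json

# Toy "edits as chained transactions": each edit records the hash of
# the previous one, so tampering with history is detectable by anyone
# replaying the chain.
GENESIS = "0" * 64

def make_edit(prev_hash, page, diff, author):
    """Return (record, digest) for one edit appended after prev_hash."""
    record = {"prev": prev_hash, "page": page, "diff": diff, "author": author}
    payload = json.dumps(record, sort_keys=True).encode()
    return record, hashlib.sha256(payload).hexdigest()

def verify_chain(edits):
    """edits: list of (record, digest) pairs starting from genesis."""
    prev = GENESIS
    for record, digest in edits:
        if record["prev"] != prev:
            return False
        payload = json.dumps(record, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != digest:
            return False
        prev = digest
    return True
```

This shows why the integrity part is cheap; the millions go into hosting, serving, and community support, which a hash chain does nothing to replace.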
Completely with you. The goal of Wikipedia should be to spread the content and allow more new content.
Sadly, it seems the opposite is true: whole parts of Wikipedia are infested by cancer (i.e. corrupt, out-of-touch admins acting in their own world/turf and interest). Have a closer look at certain languages like de.wikipedia.org, where more new and old articles get deleted than the content can grow (source: various German news media, incl. Heise.de, reported on it). And why is Wikia a thing? And why is it from the Wikipedia founder; does he have a conflict of interest? And now he is starting a news offspring as well, something like the Wikipedia front page and WikiNews, just under his own company. And on the other side, Wikipedia banned trivia sections, which made the Wikia offspring possible in the first place (this happened 10 years ago, but you probably remember it; yet Wikipedia deleted/buried the trivia-section history). Why even delete non-corporate-spam articles? Why are fictional manga creatures all over Wikipedia, but info about movie characters all deleted? Many Wikipedia admins seem to be deletionists who care only about their turf; they care about "their own" articles and revert changes to them just for their own sake. Look at the Wikidata project: why is it implemented by a German Wikipedia org that has little to do with the international Wikimedia Foundation? It's not a sister project, they do their own fundraiser, and media news has reported not-so-nice things over the years.
Look at the OpenStreetMap project; it works a lot better. Maybe the Wikipedia project should be transferred over to, or forked by, the OpenStreetMap project. And delete all admin rights and start over without the somewhat toxic community that scares away most normal people, who don't want to engage in childish turf wars and see their contributions deleted and cut down for no reason but admin power play.
> Also what's the difference between WikiData and DBPedia?
Wikidata is a Wikimedia project with the aim of creating a structured knowledge base. It is mostly filled and curated by humans: https://www.wikidata.org
DBPedia is a knowledge base whose content is extracted from Wikipedia (mostly from the infoboxes). It is a project run by researchers: http://dbpedia.org
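As a concrete illustration of the difference: Wikidata's structured content can be queried directly through its public SPARQL endpoint, no wikitext parsing involved. The endpoint URL and the P31 ("instance of") / Q146 ("house cat") identifiers are real; `build_request_url` is an illustrative helper of my own:

```python
import urllib.parse

WDQS = "https://query.wikidata.org/sparql"

def build_request_url(sparql):
    """URL for the Wikidata Query Service, requesting JSON results."""
    return WDQS + "?" + urllib.parse.urlencode({"query": sparql,
                                                "format": "json"})

# Example: English labels of a few items that are instances of house cat.
QUERY = """
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q146 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
"""
```

Fetching `build_request_url(QUERY)` over HTTP returns machine-readable JSON, which is exactly the kind of access the thread is asking for; the open question above is how much of Wikipedia's knowledge has actually made it into this structured form.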
I am the author of this op-ed, which I will prove by posting a comment on my Wikipedia talk page [ https://en.wikipedia.org/wiki/User_talk:Guy_Macon#Hacker_New... ] before saving this. I am open to any questions, criticism, or discussion. BTW, as I noted in the op-ed, at the request of the editors of The Signpost, the version linked at the top of this thread has fewer citations and less information in the graphic. The original version is at [ https://en.wikipedia.org/wiki/User:Guy_Macon/Wikipedia_has_C... ]
The key financial metric I'd look at is "months to cash out". Basically someone reasonable needs to decide "if no other money comes in to this organization, how long should it need to operate?"
From there you can get more specific on what "operate" means (i.e. will layoffs occur before scaling back hosting costs).
The question as I understand it is "if no other money comes in, how long could Wikipedia operate?"
If you assume (and I do) that the Wikimedia foundation (WMF) would keep on the spending path they are on, it would take a year plus or minus a few months to go completely broke. If they were to immediately respond with massive spending cuts they could last a lot longer.
The reason I don't believe that the WMF will react to a revenue decrease with spending cuts is because they really, really, believe that everything they are doing and everything they have planned is absolutely essential. Plus, it is human nature to say "this is temporary. The revenue will go back to increasing next year", all the while greatly expanding the fundraising appeals.
His argument seems to boil down to "growth must be cancer" and "Wikipedia/WMF shouldn't have expanded its scope", with the conclusion "this must fail". But don't most organizations do that? Are non-profits not allowed to? Otherwise, I'd also like some more specific criticism of how money is wasted.
Reproducing the table from the article with one extra column, the ratio of expenses to revenue for clarity, it looks like they're still operating with a very comfortable margin. Yes, the 19% margin is tighter than a 50% margin 12 years ago, and their existence depends on donations now more than ever ($23,463/yr is sustainable by a single engineer's salary, $65,947,465/yr is...not), but Wikipedia and other Wikimedia projects also serve a much wider audience and broader purpose. This isn't scary in and of itself, especially if they've got cash reserves to give them time to tighten the belt later on if it becomes a problem and someone in a leadership position is monitoring their finances to act if their burn rate gets too high... I've seen plenty of nonprofits with tighter margins survive and succeed.
It seems the op-ed followed Wikimedia's financial statements [1]:
Expenses (2016 / 2015)

                                      2016        2015
  Salaries and wages            31,713,961  26,049,224
  Awards and grants             11,354,612   4,522,689
  Internet hosting               2,069,572   1,997,521
  In-kind service expenses       1,065,523     235,570
  Donations processing expenses  3,604,682   2,484,765
  Professional service expenses  6,033,172   7,645,105
  Other operating expenses       4,777,203   4,449,764
  Travel and conferences         2,296,592   2,289,489
  Depreciation and amortization  2,720,835   2,656,103
  Special event expense            311,313     266,552
  Total expenses                65,947,465  52,596,782
$11 million in awards and grants seems like something that you can cut easily in bad times. Also, they are still generating more income than expenses, and the margin is big enough to adapt if there is a sudden decrease.
I thought the essay's concern was that funding is growing too fast, allowing expenses to expand to fill the gap in a way that is unsustainable (because eventually the funding growth must come to an end).
Administrator since 2003 here. I have contributed to Wikipedia in various languages, Wikimedia Commons, Wikibooks, Wiktionary, Wikisource, etc. Three core points, particularly on Wikipedia:
(1) Bad experiences for new and established contributors mean fewer motivated contributors. This is due to factors such as too much bureaucracy, too many subjective guidelines, too much content being deleted (exclusionism), and an overwhelming mess of projects and policies.
(2) Not enough focus. By starting many projects, the foundation has muddied its mission and public identity. In addition, it has broad and potentially mutually conflicting goals such as educating the public about various issues, educating the public about how to work with others to contribute to projects, asking the public for money, agitating governments and corporations for policy change and support, monitoring public behavior for evidence of wrongdoing, and engaging with education. Why not leave education to the educators, politics to the politicians, spying to the government, and fundraising to motivated contributors and donors?
(3) Non-free academic media is hurting the project. Given that only a small number of editors have true access to major academic databases, it is often hard for contributors to equally and fairly balance an article.
Having said that, I still have tremendous respect for the project and comparing its costs to those of the prior systems necessarily incorporating manual preparation, editing, production and distribution of printed matter by 'experts', the opportunity costs for access alone justify the full expenditure. It's not a lot of money in global terms.
> Bad experiences for new and established contributors mean less motivated contributors.
This has nothing to do with the financials of the foundation and is completely a community issue.
> Not enough focus.
This is a valid point but I think you're being too scorched earth with it like saying that Wikipedia shouldn't do any political outreach at all. If its millions of viewers hadn't seen the SOPA blackout, would it have been as successful? If it didn't fight for freedom of panorama and other copyright issues, would it be able to exist in the same form as now? Your suggestion is like telling Japan to go back to isolationism. Sure it might work if you're self-sustaining, but it's no way to run a global project.
> Non-free academic media is hurting the project.
If you are part of a university, you likely have access to such media. Many public libraries also have such access. Lastly, there's the Wikipedia Library. [0] I'm not sure what you want Wikipedia to do here past what it's already doing.
>> Bad experiences for new and established contributors mean less motivated contributors.
> This has nothing to do with the financials of the foundation and is completely a community issue.
I do not contribute financially to wikipedia, despite being very interested in doing so, because of this issue.
I am sick and tired of seeing large amounts of properly formatted, well formed articles, written in good faith, deleted by the little hitlers protecting their precious wikipedia turf.
This "community issue" costs wikipedia several thousand of my dollars per year. I wonder how many other people decline to support them financially due to these "community issues" ?
> This has nothing to do with the financials of the foundation and is completely a community issue.
I don't agree. Choices in how the foundation spends it's money can amplify or diminish these concerns.
E.g. WMF has spent extensively to try to bring in a wider space of borderline editors rather than investing as much in infrastructure to soften the learning curve and retain and boost the participation of middle-tier editors, which exacerbates an us-vs-them siege mentality... and the overuse of blunt tools to stem a rising tide of low-quality edits at the cost of a poor experience for new contributors.
An example I'd cite for this is the extreme investment in "visual editing" -- which only barely manages not to mangle pages when it's used -- over things like syntax highlighting.
Not all the blame in these areas falls on the WMF, for sure. As an example, enwiki community factionalization around deletion blocked the deployment of revision flagging (basically supporting 'release' versions of articles, so that non-contributors are not constantly subjected to the very latest unreviewed revision of an article), which would have allowed radically less aggressive edit patrolling.
It's clear that salaries and awards and grants are driving the increase in cost. Maybe this is damning evidence of a decadent culture, which the author of this op-ed clearly presumes, but I doubt it. I would expect that Wikipedia's employees have been working very hard for a long time to keep the site running, and they've cultivated expertise in governing the site in a way that avoids controversy and maintains credibility. I'd rather Wikipedia spend to retain long-tenured experts who have paid their dues than be an underpaid-college-graduate mill like so many non-profits are. It seems that they've done that, and they waited until the organization was financially stable to do so.
When I say "I want to know where Wikimedia is spending its money", I don't mean "is it on people or on bandwidth or on equipment?"; I mean "is it on Wikipedia or Wiktionary? how much money did they burn on the finally-launched WYSIWYG editor that their own research shows is barely used and solves the wrong problem? how little time is being spent figuring out how to handle a world with decreasingly reliable second-party sources, given their adamant refusal to allow reliance on first-party material? do they have any resources at all dedicated to dealing with deletionism?". I do not care if the people there are being paid a million dollars a year: I want to know their time is being used in ways that make sense, and as far as I can tell almost none of their resources are being spent on anything which seems to actually matter. If they explained "actually, we added an automated model for verifying the value of an edit that our metrics have shown decreased the amount of time moderators have to spend watching the site while having minimal effects on new user retention, a project which used twenty engineers for five years to build", I'd at least shrug and go "huh, OK"; but as of right now I am not seeing it. It isn't that they overpay their staff; it is that they fundamentally don't have anything useful to do with staff, but seem to keep growing their staff and then allocating them towards dumb things while telling everyone that if they stop donating to this cancerous staff growth the site will go offline, which is a situation for which I simply can't attach enough modifiers to the word "lie" to express the level of active deception at play.
Exactly. There's this folk belief that the main risk with non-profits is that they will pay themselves above-market salaries or otherwise embezzle money in outright fraud. And when people criticize the non-profit for inefficiency, they often defend themselves by saying "Look! The salaries for our legions of workers are market rate and we have all these noble-sounding projects."
But this is a red herring, because outright fraud is relatively rare. Rather, the much bigger issue is a terribly managed organization spending resources ineffectively. Non-profits shouldn't be judged on overhead or executive salary (who cares?), they should be judged on what they accomplish for the amount they spend. And WMF does terribly on this metric.
Often staff are taken on simply to fill vacancies, without much regard for skill level. The marginal value of extra employees falls and can dip into the negative. This is the sort of ineffective spending which is invisible to all but their closest colleagues -- who have too little political capital to do anything about it.
Where I feel a lot of the animosity from the Wikipedia community stems from is that the people who have "cultivated expertise in governing" are actually Wikipedia volunteers, not WMF employees.
I'd add: Let's not get caught in the trap of looking at good-paying jobs as a problem. Wikipedia employees shouldn't be expected to work for next-to-nothing or nothing and to make great sacrifices for the rest of us, which is what many open source leaders and contributors must do (a bad thing). Why shouldn't we pay them well?
It just... really bothers me that Wikipedia has grown into this massive thing, with $60 million in cash reserves and $31 million in salaries a year... and the people who aren't getting paid are the ones actually writing an encyclopedia. For that kind of money, you'd think they could actually pay people to write an encyclopedia, like Britannica used to. Now Britannica is circling the drain, Wikipedia is raking in money, and instead of paying the writers, there's this whole bureaucracy slurping up the cash and not giving it to the people doing the actual work. I hate all this digital sharecropping. I hate all these businesses based on paying millions of amateurs nothing or next to nothing for large volumes of low quality labor, making it up on volume, and paying a handful of people large sums of money to "administer" it. You'd think for that kind of money you could pay some writers.
> ...I have never seen any evidence that the WMF has been following standard software engineering principles that were well-known when Mythical Man-Month was first published in 1975. If they had, we would be seeing things like requirements documents and schedules with measurable milestones.
This part of the critique seems a little off, doesn't it? I don't know the state of WMF engineering, it very well may have problems, and a complete lack of documentation or planning is not a good sign, but the particular artifacts (requirements documents, schedules with milestones) mentioned here are more from the pre-Agile waterfall school of thought. Can anyone familiar with WMF engineering comment?
[Former product manager at the Wikimedia Foundation and longtime Wikipedia editor/admin here.]
The author of the op-ed is a devoted editor but seems almost totally ignorant of how development is conducted. The Foundation has been doing transparent quarterly/yearly roadmap planning alongside its annual plan / budget cycle (all of which is shared publicly). On a shorter timeframe, they are pretty serious about Agile/Scrum. You can see on https://wikimediafoundation.org/wiki/Staff_and_contractors that today they even have a team of half a dozen full-time Scrum masters. If what he thinks is missing is serious, detailed planning, he's sorely wrong.
The platform (MediaWiki) is still a FOSS community so you can find project requirements docs/roadmaps all over mediawiki.org, all the bugs on Phabricator, follow along on mailing lists, and even see commits on their Gerrit instance.
Agile isn't my cup of tea personally and I could grok criticisms of the organization's software output (ignoring the fact that they're buried under 10+ years of technical debt...), but it takes minimal digging to find all their plans and timelines. I would venture that the author chose not to dig into this because he, like a lot of entrenched old school editors, viscerally hates some of the past attempts to make MediaWiki a modern collaboration platform, such as building a WYSIWYG editor and a threaded discussion system to replace wiki talk pages.
I am very familiar with Agile and Scrum, and I have seen the advantages over older paradigms such as waterfall. That being said, there are certain basic principles that the old methods and the new methods have in common. One such principle is the basic idea of having some sort of contact with the people who will be using the finished software and understanding their needs. The WMF does not do that. Instead, they build something in secret, throw it over the wall, and watch as the Wikipedia editors reject it as the steaming pile of crap that it is. They have done this again and again. Visual Editor. Flow. Mobile App. Knowledge Engine. All failures. All built without any input from the people who would be using them.
Now I KNOW that the developers are not stupid or ignorant, and I have checked as best I can and it appears that every one of them was able to create high quality software that meets the customer's needs when they were working other places. That leaves me with management as the probable culprit. And I don't think that the problem is product managers like the author of the post above this one. I think the blame is at the very top.
Finally, if it really "takes minimal digging to find all their plans and timelines", I would like to see this demonstrated by providing links to the plans and timelines for the Knowledge Engine. --~~~~
Where can one find out what percentage of Wikimedia's developer staff resources (as opposed to open source contributors) are being allocated towards what projects? They are spending $31m this year on staff: what percentage of that is being spent on what kinds of staff (ex. software engineer vs. community manager), and what percentage of their developer staff is being used to build these aforementioned projects? If that number is extremely low then you can just discount that issue, but if that number is enormous then more questions have to be asked (which would of course involve looking at the success metrics on those projects and what validation was done on them while they were designed). As it stands, Wikimedia is constantly asking for more money using the threat that Wikipedia will collapse, when for all we know most of their staff time is off building stuff like Wiktionary.
TL;DR: the largest chunk of the budget goes to the two departments that do engineering/design/PM/data science.
On your last point ("for all we know most of their staff time is off building stuff like Wiktionary") it's actually a big gripe in the smaller communities that probably 90% of the time and attention goes to Wikipedia.
Ok, so from this I see that $20m/year is going towards staff for "product" and "technology"; but there is no breakdown of what that is being spent on. The point I was making is that if we knew the percentage of effort going towards engineering and multiplied it by the percentage of time being allocated towards some targeted projects, and that value was low, then it would not be relevant... but we only have the first number, and that number is high enough that we have to be concerned about the second number. Spending $20m for a year of engineering effort is a ridiculously large number for a website that fundamentally does as little as Wikipedia does... what was shown for it, and what percentage of that can be allocated towards each outcome?
An Agile approach doesn't mean that there are no requirements documents and no milestones. You're still supposed to write requirements in some form (e.g. user stories and test cases) and plan a few months ahead (while being ready to correct your course based on user feedback after every sprint).
> and a complete lack of documentation or planning is not a good sign
Neither is it true. There is documentation and planning. As with pretty much every software project I've seen over my career, the documentation could use some TLC (and unlike many other pieces of software, anybody can actually help with it[1]), but it's nowhere near a "complete lack". There's a lot of documentation, though some areas are covered less than others.
MediaWiki is a big piece of software, and a long-term, organically grown project, and if you have any experience with those you know what that means. It is known, and regular effort is made to improve it.
Same for planning - not exactly ideal, but "complete lack" is very, very far from the truth - moreover, unlike many other organizations, all the plans and all the internal workboards are public[2] (excluding security issues and sensitive information), so you can check for yourself.
> Can anyone familiar with WMF engineering comment?
Yes, I am familiar with it by virtue of being part of it (still not speaking for anyone but myself, off-the-clock, in a completely personal capacity :) and I say this claim is completely false. Moreover, it is so obviously false and so easily disproven by public documents[3] that I wonder how one could publish it in public media without bothering to do minimal due diligence. I mean, we all panic about "fake news" and stuff - shouldn't that make us at least minimally try to check our claims with an easy search, or a question on a mailing list with dozens of people who could point out where the appropriate documents are? The author of the article seemingly believes it is unnecessary. I do respect his long-time contribution to Wikipedia (much more sizeable than mine) but that still does not entitle him to his own facts.
Looks like he disagrees with some of the projects Wikimedia took on - like making user experience more friendly with Visual Editor and mobile support, both IMO excellent projects, but everybody is entitled to their own take on this. It is fine. What's not fine is claiming that not agreeing with him is equivalent to not having direction at all and wasting money and being cancer. That's way too far and completely untrue.
That's the point; he's arguing that WMF engineering practices are so disorganized that not only don't they qualify as agile, they don't even qualify as waterfall (which predates agile by several decades).
Waterfall methodologies are deeply un-hip today, of course, but when they first coalesced they were a big improvement over what came before them, which was essentially nothing: an absence of any formal project management methodologies, with people cobbling together projects from bits and pieces of expertise learned in other disciplines.
(Note that I have no idea how WMF's software engineering practices work, so I have no idea if this assertion is accurate or not. I'm just trying to clarify what I think Macon is arguing here.)
You are correct. I am arguing that WMF engineering practices are so disorganized that not only don't they qualify as agile, they don't even qualify as waterfall (which predates agile by several decades). In particular, I am part of the community of Wikipedia editors. Nobody asked us what Flow or the Knowledge Engine should look like. That's part of Waterfall AND Agile. Yes, a dedicated person can go to the developers' separate wiki and mailing lists, but on Wikipedia itself there is zero evidence that the principles I see at [ http://agilemanifesto.org/principles.html ] are in play.
Again, the developers and their managers know this. I am convinced that they have been told in no uncertain terms that they will be fired if they interact with the Wikipedia community.
It's hilarious. It has only ever existed as a thing to criticize, and the term itself actually originates in a paper describing why it's broken. No one has ever advocated for the "waterfall" approach.
> and the term itself actually originates in a paper describing why it's broken.
So does the term "capitalism". Like capitalism, though, the waterfall method was a thing actually in wide use both before (the first paper describing its use came about 20 years earlier than the critical one in which the term seems to have been coined) and after (it's been mandated by many institutions, particularly in government, even after that critical paper) being named in criticism.
> No one has ever advocated for the "waterfall" approach.
Actually, a number of large organizations, particularly governments, to this day mandate processes for software development projects, particularly large projects, that embody essentially the key features of the waterfall method, most critically that of doing full analysis across the whole scope before beginning development (often, in government, before getting approval for funding to open up contracting for the actual development work). A lot of the contractors involved advertise that they use agile methods, but it often ends up being a kind of Scrum-within-waterfall monstrosity that manages to preserve the worst features of both.
This op-ed is nonsensical. According to the author, every successful startup in history is "cancer". Wikipedia's costs have grown because its usage has grown exponentially (comparing costs to economy-wide inflation is particularly baffling).
If anything, I got from this article that Wikipedia has kept costs well below revenue growth, which is normally the sign of a healthy organization.
If hosting cost per hit has increased, something is wrong. Computer costs have gone down since 2005.
That's worth looking into. Wikipedia hasn't gone down the Web 2.0 Javascript/CSS rathole, where every page loads megabytes of vaguely useful junk. What's the problem?
> If hosting cost per hit has increased, something is wrong. Computer costs have gone down since 2005.
Images are dramatically larger, both in the raw size out of cameras and the resolutions people are willing to put on pages (including higher-DPI screens).
Video's likely a lot more prevalent now, too.
I dunno what they're paying for bandwidth, but AWS S3 has barely dropped per-GB bandwidth costs since its release in 2006.
In 2005, Wikipedia co-founder and Wikimedia Foundation founder Jimmy Wales told a TED audience:
> So, we're doing around 1.4 billion page views monthly. So, it's really gotten to be a huge thing. And everything is managed by the volunteers and the total monthly cost for our bandwidth is about US$5,000, and that's essentially our main cost. We could actually do without the employee … We actually hired Brion because he was working part-time for two years and full-time at Wikipedia so we actually hired him so he could get a life and go to the movies sometimes.
According to the WMF, Wikipedia (in all language editions) now receives 16 billion page views per month. The WMF spends roughly US$2 million a year on Internet hosting and employs some 300 staff. The modern Wikipedia serves 11–12 times as many page views as it did in 2005, but the WMF is spending 33 times as much on hosting, has about 300 times as many employees, and is spending 1,250 times as much overall. WMF's spending has gone up by 85% over the past three years.
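A quick back-of-the-envelope check of the multiples quoted above, using only the figures cited in this thread (the 2005 numbers are Wales's approximate ones, so take the results as rough):

```python
# Rough sanity check of the growth multiples quoted above.
# Sources: Wales's 2005 TED talk and the WMF figures cited in this thread.
views_2005 = 1.4e9           # page views per month, 2005
views_2016 = 16e9            # page views per month, 2016

bandwidth_2005 = 5_000 * 12  # ~US$5,000/month bandwidth -> per-year cost
hosting_2016 = 2_000_000     # ~US$2M/year on Internet hosting

print(f"page views: {views_2016 / views_2005:.1f}x")        # 11.4x
print(f"hosting:    {hosting_2016 / bandwidth_2005:.1f}x")  # 33.3x
```

So the "11–12x the traffic, 33x the hosting spend" comparison does follow from the quoted numbers; the 300x staff and 1,250x overall-spending figures measure from a baseline of one employee and a ~$75k budget, which is why they look so extreme.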
> Maybe you need 300 employees to maintain the site at that scale?
Should you? I mean, my day job is making it so you expressly don't. If your costs are scaling even linearly I would say you're doing something wrong. The point of scaling is to reduce costs--economies of scale are why you scale. And a user-editable encyclopedia and PHP application are not really good arguments for diseconomies.
But you can't reach perfect efficiency. Scaling will of course reduce costs in general, but it's not a given. If I had an app that had 5 users, my costs would increase if my user number grew to 5,000,000,000.
Without knowing more about the financials, or how resources are allocated, this is all conjecture. But a website that serves ~17 billion page views/month is going to cost a lot of money to run. They could be spending their money very poorly, idk, but I also don't know whether what they are spending is an appropriate amount.
We aren't talking about "perfect efficiency". We're talking about not blowing up your costs. A website that serves seventeen billion pageviews per month of mostly cacheable and edge-serviced data is, while certainly a technical challenge, a very surmountable one. And a lot of the harder parts are blunted through Wikipedia's situation. Search, for example, is a difficult problem in those situations--but I'd bet money that most of those searches are coming from Google, which mitigates a large chunk of the demands on in-house search that a different kind of website might see. (I've used Wikipedia's internal search once this year, according to my browser history.)
Point to the diseconomies of scale and we can talk about them, but everybody else has figured out how to leverage economies of scale when building out a large technical system.
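To make the caching point concrete, here's a toy cost model (all numbers are hypothetical and purely illustrative, not Wikipedia's actual costs): if most requests are served from an edge cache that's much cheaper than hitting the origin, the blended per-request cost collapses as the hit rate rises, which is what "mostly cacheable and edge-serviced" buys you.

```python
def blended_cost(hit_rate, edge=0.05, origin=1.00):
    """Blended cost per request (arbitrary units), assuming an
    edge-cached response costs ~1/20th of an origin hit.
    All numbers are hypothetical, for illustration only."""
    return hit_rate * edge + (1 - hit_rate) * origin

# With no caching, every request pays full origin cost;
# at a 95% hit rate the blended cost drops roughly 10x.
print(blended_cost(0.0))   # 1.0
print(blended_cost(0.95))  # ~0.1
```

The exact numbers don't matter; the point is that for read-heavy, cacheable content, cost per request should fall with scale, not rise.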
> Scaling will of course reduce costs in general, but it's not a given.
Not only does it seem costs haven't been reduced, their rate of increase has exceeded the growth in pages served, quite substantially. That's not the whole story I'm sure, but as a rough estimate that doesn't seem sustainable or healthy... or necessary simply in terms of general hosting cost declines over that period.
To just maintain the site - I mean, making sure every HTTP request (or a reasonable fraction of them) is answered and Apache does not crash and logs are rotated and backups are performed - no, you don't need 300 people. If you froze Wikipedia in 2005 and never wanted to improve anything there, never opened a new project, never made a local chapter or ran an editathon, never supported a new language, etc. etc. - you could probably do with 25-30 people.
But that wouldn't be a live project. That would be a fossil that would slowly wither and die, as it becomes less and less relevant and more and more inadequate to the needs of the current user. In 2005, the iPhone didn't exist; now everybody has a smartphone. Should we somehow account for that, or just ignore it? How about knowledge graphs and linked data and all the AI developments - should all Wikipedia knowledge still be text-only, ignoring the whole Linked Data universe? How about supporting thousands of existing languages - should we just dump them on their own domains, or should we help them with automatic translations, article templating, language-sensitive searches and so on? How about creating richer media like maps, diagrams, graphs, video and audio content - should we help this, or should we be content with just inserting links to outside content? And that's only a minuscule part of the questions we can ask about things that have changed and developed in the last decade.
The point here is that the Wiki universe is a big, complex, live, active project (or set of projects), with very many facets, and reducing it to technical maintenance of one webserver site - even one that gets tons of traffic - is not a good idea. The goal of the movement, as I understand it, is not "make sure en.wikipedia.org does not crash"; it is "make the sum of all human knowledge available to everyone". It's a big mission, and it requires people to achieve it.
To be honest, most successful startups are indeed quite cancerous.
Twitter doesn't need 3,500 employees, Facebook doesn't need 17,000 and Google doesn't need 72,000. They could all fire 50%-90% of their workforce and the product wouldn't be materially affected for the vast majority of users. The reason they have this many is because they can have this many. Their success has fed the cancerous growth.
You get this situation whenever a company isn't operated by a shareholder who gets dividends.
When the person who runs the company knows that every dollar they don't spend is money in their pocket, people start to actually care about expenses and focus only on what is important.
1) One of the biggest moats for tech companies is talent, and it's necessary to have GOOD employees available to test new fields and grow quickly (when Android came along, Google suddenly needed a lot more engineers; having them on hand saved the time/challenge/cost of hiring good-quality engineers). Also, the numbers you posted are overall employment numbers, not engineering ones (you need bigger HR, sales, support etc. as you grow). I totally agree that there is a point of diminishing marginal return and they could be bloated, I just don't know where that point is and I don't think you do either, so don't discount the number just because it's large.
2) Dividends are given if the company doesn't think that it can reinvest those in a more profitable way. Berkshire Hathaway notoriously doesn't give dividends - it doesn't mean Buffett & Munger are personally pocketing all the wealth. Also, dividend returns are taxed so that's why many times shareholders are okay with having the company reinvest it/bring it down as retained earnings because it's smarter that way.
In the USA, wasting money on bloat is effectively incentivised by the tax structure because you can spend money that is already inside the company with an effective discount vs distributing it so that it can be invested elsewhere.
Owners should be withdrawing the profits of their companies. The current situation leads to bloat and stagnation.
I agree that a lot of tech companies have a much larger staff than they need (especially the mid sized ones) and I agree that Google in particular seems to suffer from severe mismanagement, but this is kind of the best case when you see a company with high enough barriers to entry that they collect rents: they spend on R&D and other "wasteful" stuff. Part of the problem with the thought experiment of the perfect free market (infinite producers/consumers) is that margins are so razor thin that no one can do this kind of stuff, which can be enormously beneficial when well coordinated (see: postwar Japan) so you do want some of it to happen.
There isn't any basis to your statement that Facebook could make as much short-term and long-term profit if they fired half their staff. Instagram, for instance, is much bigger than it was a couple of years ago, and those employees have built things that have driven growth for the product.
You don't even have to go as far as the big tech companies. You see it even in smaller ones. Something that's stuck with me is when, in the span of about a week or so, I had multiple people at different startups answer a question "hey, how's it going?" with "oh, really great, we [note: not founders] just closed a round and we're hiring X more people [note: at companies where the product is stable and well-understood]".
But he's right: every successful startup in history is cancer. Deliberately so: they grow huge very, very quickly. It's steroids.
The trick with any great start up is how to get out of the "aggressive growth without revenue" stage. Some (very few) keep the growth but aggressively grow revenue and become self-sustaining. Others dial down the growth at a certain stage, and revenue increases to match, and they're self-sustaining.
The point is you have to become sustainable. Most start-ups flare out badly. You cannot change over-night from "growth like cancer" to "stable and flat". You need some way to switch and it takes time.
Yes, and that revenue comes from begging banners that more often than not create the - entirely false - impression that if people don't donate, Wikipedia will blink out of existence.
In the early days, when almost everything was volunteer-run, hosting costs were indeed Wikipedia's main expense (as explained in that old quote from Jimmy Wales). These days, hosting amounts to about 2% of expenses, and most of the rest goes to staff costs (incl. about two dozen people in fundraising alone). Meanwhile, most of the value - the actual content - is contributed by unpaid volunteers. The paid staff have absolutely nothing to do with it.
Nobody minded working for free when there was just enough money to cover hosting costs. Now, however, there is an influx of $100 million in donations a year, and none of that benefits the average contributor, the people actually writing Wikipedia. That grates a little, much like in the monkey fairness experiment (Google it if you're unfamiliar with it).
I think the article was questioning if this is suitable for a Charity. A business should be able to estimate its growth and sales, and can re-invent itself if a path isn't going to be profitable. The growth in a business should also be growing revenue (or it should be creating the foundation for later revenue).
A charity is based on how much people are willing to donate, which can change VERY quickly. Imagine what would happen if the public perception of the people behind wikipedia was to dramatically change and next fundraising campaign only brought in 20% of last years?
Additionally, if people start to believe that the finances are squandered or not spent as expected, there could be a movement to NOT donate. This happened with the Red Cross after 9/11, where people felt they were duped by the fact that their donations went to a consolidated fund (often to fund overseas activities) rather than exclusively to 9/11 victims.
Charities also tend to need to keep a few years of funding in the bank to deal with a change in markets, as a charity typically can't use debt to get through rough years (economic downturns / recessions). Exponential growth makes this nearly impossible unless you're running extremely lean.
I think the article is a little sensationalist (and maybe it needed to be to reach certain people), but it seems to have some sound concerns.
You're right that the costs of running wikipedia are still quite low considering the incredible scope, popularity and importance of the site. "Cancer" seems over the top and needlessly combative. But, it definitely doesn't hurt having someone bring up the fact that costs grew by 6x in 6 years. As he says, more years of this could put wikipedia in a precarious position.
It seems the OP cares about wikipedia and is genuinely worried. These conversations need to be had (also at fast growing startups) and having them in the open is part of the wikipedia way. Maybe he's wrong but it doesn't seem nonsensical or disingenuous to me. It seems genuine and rational.
What you say is correct. He just says something different, not the opposite. His point is that a money-hungry, fast-growing business is not the model an open-content wiki should have as its foundation. The value to the user would probably be the same if it were a 3-person inc with $150k/year revenue.
If you read the article, you'll notice it says that the expenses are growing faster than the traffic, and faster than revenue. This is the exact opposite of what you would normally expect (due to economies of scale).
Succinct and correct. I have been a regular donor to Wikipedia for years now, because I use the service heavily, support its mission, and like the access I have to contribute to its vast body of knowledge. When the fundraising solicitations come around, I've typically been too busy to look into the financials and have always assumed they operated at near costs and had declining revenue. Now, perhaps Wikipedia has some unfortunate but real problems around editorializing, and they have to pay a growing army to keep the worst of the forces of troll-dom at bay with conscientious hired curatorial help. Otherwise, I see no other reason why they should have 300X the employees.
And this is exactly the issue. Fundraising got to be "too" successful, which is to say at some point WMF did a fundraising round and had more money than they needed, and rather than give that money back they spent it. And the next year, whatever it was they spent it on (could have been salaries or perks or whatever) seemed like "of course we need that thing, we did it last year", so they targeted raising the higher amount, but they overshot again, and then spent more.
This has been a trap for charitable organizations ever since they existed. Churches, clubs, museums, Etc. The second trap is embezzlement which happens all the time because, as a donation funded organization for some reason people often neglect strong financial controls.
The article author is correct that unless corrected, this situation will kill the Wikimedia Foundation.
> I see no other reason why they should have 300X the employees.
Well, that's a little unfair. When you measure from 1, virtually any number will look like an absurd multiple. Calling it 300x makes it sound ridiculous but for a company with $80MM in revenue, 300 employees doesn't seem to be unreasonable.
Wikipedia hires PHP programmers as well. Last year I saw a posting on Stack Overflow. Think of the devops work required, project management, handling media, ever-increasing storage needs, the managers, etc.
300 feels like too much but 80 could be a logical number.
Supporting many more languages and new projects like news and structured data should require more individual contributors and a corresponding management structure. Then of course you need HR to manage them, a fundraising organization that gets money from all over the world, and yet more people to handle the finances that have now gotten complicated.
To me, it's a no-brainer that the WMF of today needs more than one employee. Whether it needs 300, I don't know, but that doesn't sound far enough off for me to quibble with them over it.
They may need more than one, and I'm sure there are fixed or near-fixed labor costs pertaining to HR, legal, accounting, etc., but what business function besides content curation - which another commenter above claims is entirely volunteer - does WMF need that approaches even a linear growth rate, to remotely justify the jump in orders of magnitude, let alone 300X?
The table of spending vs. revenue suggests revenue is growing just as fast as expenses. There's nothing here about the trends in the amount of traffic over time. And if there were, the cost of hosting and developing a system vs. the amount of traffic it can take is not a linear relationship.
> There's nothing here about the trends of the amount of traffic over time.
Yes there is.
2005: > So, we're doing around 1.4 billion page views monthly.
2016: > According to the WMF, Wikipedia (in all language editions) now receives 16 billion page views per month.
The version of the article on the signpost lacks the table of figures, if you go to the version on his user page, it paints a clearer picture [0].
idorosen has posted a table on the root thread which adds a column for the expense ratio (1 minus the margin) and it's getting much higher much quicker [1].
Well, try to draw a trend line through the operating margin and tell me it's not that scary. I think that it's pretty scary if your permanent, ongoing financial commitments are growing to match your revenues. This way, you have no cash to react to intermittent expenses or R&D. And let's remember that Wikipedia is already doing more donation drives than ever.
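For anyone skimming: the "expense ratio" mentioned above is just expenses divided by revenue, i.e. 1 minus the operating margin. A minimal sketch with made-up illustrative figures (see idorosen's linked table for the real numbers):

```python
# Expense ratio = expenses / revenue = 1 - operating margin.
# The year/revenue/expense figures below are hypothetical, purely
# to show the calculation - they are NOT WMF's actual financials.
rows = [
    (2013, 50.0, 35.0),  # (year, revenue $M, expenses $M) - made up
    (2016, 80.0, 64.0),
]
for year, revenue, expenses in rows:
    expense_ratio = expenses / revenue
    margin = 1 - expense_ratio
    print(year, round(expense_ratio, 2), round(margin, 2))
```

The worry in the parent comments is exactly this trend line: if the expense ratio keeps creeping towards 1.0 while revenue grows, the margin (and hence any buffer for shocks) shrinks even as total income rises.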
Wikipedia is an explicitly not-for-profit enterprise, and thus something completely different from the standard startup.
For most startups, success means getting bought out by a larger company. Wikipedia by contrast has always put the highest priority on maintaining its own independence, which means that for them a buyout would be a profound failure.
I'm very conflicted about the guilt-tripping donation appeals. On one hand these organizations need cash to operate and they do a world of good, but on the other, reading their financial reports, I'm just not convinced they are efficient with the money.
I am not sure I understand what problem I am supposed to see when I look at the table. It looks to me like Wikipedia has income in excess of expenses and a reserve to cover unforeseen events. Remembering what Wikipedia was like in 2005, when Wales thought it didn't need employees, makes me think that Wales could not imagine the scale at which Wikipedia is an important asset of humanity today.
It's myopic in the same way as the criticism of Wikipedia's software engineering methodology further down the essay. There's a scale of project at which waterfall greatly increases the odds of success: without a plan, a pyramid doesn't get nice sharp corners or come to a point, the dam does not turn the turbine, and Armstrong does not leave boot prints in moon dust (never mind landing back in the vicinity of ships and helicopters and medical staff).
A big Wikipedia changes slowly. That's good. One person's rant doesn't cause it to pivot on a dime. One person's rant doesn't suddenly turn it to ad funded. One person's rant doesn't suddenly remove a category of articles.
The problem isn't that there isn't enough income to cover expenses.
The problem is that expenses have grown along with rapidly increasing income.
The two best explanations for this are:
a) Wikipedia has spent that money in ways that add commensurate value
b) Wikipedia is adding projects that sound reasonable, but don't add tons of value, in order to soak up the increased revenue (possibly unintentionally; intentions are irrelevant in bureaucracies).
It's not binary, of course. People who have a problem with that chart aren't saying Wikipedia has added NO value from the spend.
They're just noting that (b) is a very very common failure mode of large organizations that have easy access to money. In fact, I'd argue that it's common enough as the failure mode that the burden of proof should always be on the party who asserts that it's NOT happening, rather than the other way around.
Nah, just companies that profit from Wikipedia (hello Google!) and non-profits should pony up. Why shouldn't a person doing great work to make sure Wikipedia runs smoothly get paid when there's so much money going around?
Google should "adopt" 150 of their employees, MSFT 50, Facebook 50 and so on. It's tax deductible too...
Wikipedia isn't spending more because of high costs of hosting, bandwidth and development. They are spending on executives, administration, grants, travel/conferences, and other mostly unrelated expenditures.
I don't donate to them from an effective altruism standpoint, every year I've read their spending plan and decided there are far better places to give. If they only spent on the core Wikipedia platform and support - I'd donate every year.
> Wikipedia isn't spending more because of high costs of hosting, bandwidth and development. They are spending on executives, administration, grants, travel/conferences, and other mostly unrelated expenditures
It's not a struggling organization. People are giving them barrels of cash so they expand their scope to spend it, then ask for even more the next year.
> The modern Wikipedia hosts 11–12 times as many pages as it did in 2005, but the WMF is spending 33 times as much on hosting, has about 300 times as many employees, and is spending 1,250 times as much overall. WMF's spending has gone up by 85% over the past three years.
Can someone analyze this? It sounds a lot like he has a negative feeling about WMF, and threw in numbers to validate his opinion. I'd expect non-linear spending (in terms of pages hosted) at some point (because other things related to pages like links probably grow non-linearly).
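One way to analyze it: normalize the quoted multiples by the growth in pages. A rough sketch (the 11.5x figure is my midpoint of the quoted 11-12x, and pages are admittedly a poor proxy for load, since page views grew faster):

```python
# Growth multiples quoted in the parent, 2005 -> present.
pages = 11.5      # midpoint of "11-12 times as many pages" (my assumption)
hosting = 33.0    # hosting spend
staff = 300.0     # employee count
spend = 1250.0    # overall spend

print(f"hosting per page: {hosting / pages:.1f}x")
print(f"staff per page:   {staff / pages:.1f}x")
print(f"spend per page:   {spend / pages:.0f}x")
```

Hosting per page going up ~3x in a decade is unremarkable; overall spend per page going up ~109x is the number doing the work in the author's argument.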
As a Wikipedia administrator (mostly inactive), this sentiment makes complete sense to me. The WMF seemingly spends the majority of its money on non-critical functions such as community outreach, local chapters, yearly conferences and other non-critical costs, including a parade of highly paid, not very effective executives. One thing to keep in mind is that the WMF != the Wikipedia community; it is very possible to truly support the Wikipedia mission without also supporting how the WMF is run.
> on non-critical functions such as community outreach, local chapters, yearly conferences
These are critical functions if you want to have a living, developing community. If you just want a site that answers HTTP requests, sure, they're not critical. There are billions of those. Making sure Wiki projects work as communities, and not just as IP addresses answering HTTP requests, is what makes them critical.
> One thing to keep in mind is the WMF != the Wikipedia community, it is very possible to truly support the Wikipedia mission without also supporting how the WMF is run.
Absolutely. But in doing that one must not forget what the point is. If you declare chapters and community development unnecessary, what is necessary? Just server maintenance? Nope. Google has tons of expertise in maintaining servers, and still can't make communities. Their Freebase project is no more; Wikidata is alive and well. Something to learn from this?
There are regular surveys; there's talking to people at events and conferences, or people providing feedback on one of many channels. And of course there are statistical measures such as traffic, editor activity, etc. (for which the WMF has a team that does the relevant data collection and research - yet another non-obvious place where people work that is not directly "site maintenance"). You can also check out https://www.mediawiki.org/wiki/Collaboration and https://meta.wikimedia.org/wiki/Community_Resources for teams that do it and know better than I do :)
In February 2003, a Wikipedia backup was about 4 GB in size. Today it's over 30 terabytes (4-6 TB for text, history, etc., and ~27 TB for images and other media).
It is an inflammatory tactic to make those who already agree with you clap - but it won't convince those who disagree. It turns rational argument into attention seeking.
What makes cancer special isn't just that it grows. It's that it grows and thereby kills its host: it consumes the resources the host needs to live, or blocks tubes that need to stay open, or squashes things that are damaged by squashing, or whatever.
None of that seems applicable here. Wikipedia is getting more and more donation money, and spending it. That's only cancerous if it's spending the money on things that harm it, or if somehow this process is diverting resources away from things Wikipedia actually needs. There's no sign of any of that, so far as I can see, and to tell a scary story the article makes predictions of hypothetical future doom.
If those predictions come true then OK, we can say that Wikipedia has cancer (or maybe some other ailment that's a better analogy). For now, so far as I can see the analogy just isn't there unless you take "cancer" to mean absolutely anything that grows. I don't think that's a helpful way to use the term.
I don't know much about wikipedia or WMF but the spending growth in that table is not exponential. And surely the correct way to measure the scale of the service provided is page views rather than number of pages.
IMO-- as someone who was directly involved at the time-- the big mistake was relocating to SF. That one decision was the beginning of a cascade of ever-increasing spending which marked the end of an era of fiscally conservative operations.
There were many positive outcomes too: the increase in spending has resulted in many benefits, but not at all proportional to the increase in costs.
This is one of the reasons I hated working there. They waste money and every year beg for more. Why do they even need an office in a prime San Francisco location? On top of that, they work on stupid projects, and internally the organization is run by imbeciles.
As I understand it, Wikipedia's software and content is completely open source. You could make a foundation called BetterWiki, and run it on $1M (if you get the SEO and volunteers to your side). Is that right?
Now if the people donating (cards against humanity, mom+pops, etc) feel good supporting Wikipedia, and there are more people that want to feel good supporting Wikipedia than Wikipedia needs, maybe Wikipedia should invest money in more missions that go along with its general values?
The cancer metaphor seems very artificial - it is only about the exponential growth with no underlying model. Without the model exponential growth means nothing - because you don't know how to extrapolate the current trend.
Why were there so many sites that failed before Wikipedia took off? It looks like part of any search engine should be the service Wikipedia provides; hence, the Google love affair. Can the same model be applied to other things, i.e. user-created content + user-curated content + an indefinite feedback loop: the more users use it, the more content is created/curated (a percentage of new users go on to creating/curating content)?
That's pretty indefensible. Safe assets like US Treasuries could give them about 1% in one year maturity yield. And since they're non-profit, would probably have no tax to pay on it.
It probably isn't cash in a bank account. Balance sheets typically list "cash and cash equivalents", which can include short-term investments of various sorts.
The WMF balance sheet for 2015-16[1] shows that they held $46.7 million in cash and cash equivalents. Long term investments are listed as $11 million.
Fun trivia from their Statement of Activities[2]: they spent more on processing donations ($3.6m) and conferences/travel ($2.3m) than on hosting costs ($2m).
They also have a short-term investments line on the balance sheet, separate from cash, which leads me to believe it's cash. Although you are correct that many companies lump them together.
> They also have a short term investments line in balance sheet too, separate from cash. Leads me to believe it's cash.
I can see how you'd come to that view.
But there's no need to guess, here. These terms have legal meanings derived, in the USA, from the GAAP standards. "Cash and cash equivalents" doesn't mean "actually it's all cash because we also listed 'short term investments'".
Actually, locking cash in a 1-year US Treasury would be indefensible. If they ever needed the funds they would have to sell it, possibly for a loss. They should be investing in commercial paper with maturities between 1-3 months and would receive a higher return than treasuries with much less risk of loss of principal. (This is what I do for a living)
My guess is, they probably plan their budget annually like any other non-profit, and are probably well insured against loss. If you do this for a living, can you explain why a non-profit would need almost 50% that liquid? I often read that other cultural institutions like museums are into much riskier assets under professional management. Am I mistaken about what is commonly done, or are you expressing your opinion that widespread practice is wrong?
Almost all growth you see in the real world is exponential in its early stages. The numbers the article points to show revenues exceeding costs and a growing surplus. There is no story here other than Wikipedia is a rapidly growing organization and is doing it while running in the black.
> If we want to avoid disaster, we need to start shrinking the cancer now, before it is too late. We should make spending transparent, publish a detailed account of what the money is being spent on and answer any reasonable questions asking for more details.
I never searched for it, but I'm really surprised the foundation has not yet made its spending transparent. Aren't they supposed to be a non-profit?
Oh, I see it all the time. Random extraneous projects and really niche projects that don't fit well in the Wikimedia ecosystem, built on top of the tech-debt-ridden remains of MediaWiki, while the core devs are tasked with making helper tools when they should be paid to continually improve the core functionality of MediaWiki so anyone other than Wikipedia can use it.
> It could be the WMF taking a political position that offends many donors.
We have a winner. It's the same thing killing ESPN. It's not wise for companies to take political positions on the left or on the right that will alienate half your users. Just don't do it.
Sorry, but sometimes companies do indeed need to take political positions if their existence is threatened. SOPA was a huge example of this, and net neutrality is an upcoming partisan issue. If you'd rather they sit back and get shackled in the name of neutrality, you're not really for the project; you'd rather see it at the mercy of other politicians without having any say or doing anything to counteract them. (It's also completely naive about the massive lobbying already going on. It sounds like you just want your side to have a say, and any opposition offends you.)
Is this a case where the HN story originally appeared with a title matching the essay and was subsequently changed to a more descriptive title? Quite the opposite of the typical pattern :)
There are articles claiming Wikipedia is actually backed by investors and corporations who want to control it. WikiScanner results give a hint about how.
"The modern Wikipedia hosts 11–12 times as many pages as it did in 2005, but the WMF is spending 33 times as much on hosting, has about 300 times as many employees, and is spending 1,250 times as much overall."
Are there any comparable data on the costs of serving this kind of load, i.e. 16 billion page views/month? (I'm thinking Google/Amazon here)
"their poor handling of software development has been well known for many years."
So is the problem inefficiency in the code/HW setup? That is solvable. Any pointers to the hosting solution used?
"I have never seen any evidence that the WMF has been following standard software engineering principles [...]. If they had, we would be seeing things like requirements documents and schedules with measurable milestones. This failure is almost certainly a systemic problem directly caused by top management, not by the developers doing the actual work."
I have zero problems with Wikipedia's resorting to nagging the everloving fuck out of people who don't donate.
I just wish I could get them to not nag the everloving fuck out of me. Every year when I see the donation banner, I donate perhaps $20. I would maybe double that donation if it could stop fucking nagging me.
If they can't work out how to stop nagging me, I'll find a technological solution to the problem. Ideally, I'd turn it on after I donate and it would turn off after the donation drive is done, so I can get reminded again next year.
But of course that's never going to happen. So I'll never get reminded and my donations are going to stop. Oh well.
>Nothing can grow forever. Sooner or later, something is going to happen that causes the donations to decline instead of increase. It could be a scandal (real or perceived). It could be the WMF taking a political position that offends many donors. Or it could be a recession, leaving people with less money to give.
Does the Archive Team backup Wikipedia? Probably they do but if not I guess they should.
> How true are his statements? I seriously don't know enough about the WMF or the fiscal policies in place to make even a guess.
It's true in the sense any forecast is true.
The WMF operated in line with its projections, however, so it's a question of differing strategic goals between the author and the WMF rather than a true "wrong" or "right".
If you search the page for "Planned spending" you'll find a useful table showing that the numbers are largely in line with the planned budgets.
From an engineering point of view, would changing any part of their stack reduce the hosting burden? I did some tinkering with MediaWiki in late 2006, it was a bit convoluted at the time. I imagine it is a gigantic project now.
Good on Guy for bringing this up at last, but others have been questioning Wikipedia's fundraising for some time now [1][2][3]. Note: I wrote the last one. In the same way that many charities exist to line the pockets of their staff while giving little to their cause, Wikipedia has become a bloated fundraising bureaucracy that happens to have an online encyclopedia. The emphasis on fundraising leads to corrupt actions, for example fundraising banners that say the site is in imminent danger of collapse despite it having over $100 million in the bank.
I'm happy they don't go down that route, and I wish they wouldn't waste energy on discussing ads.
If we keep going with the cancer analogy: When I read about metastasis on Wikipedia, what kind of ads do you think I would be shown if they'd show ads? Benign ones?
I would, since the figures show that it's not necessary. If it becomes necessary it will be because WMF failed to control spending growth, something the cynic in me says is likely.
Not surprising. Its quality has deteriorated and it's not dependable. Its high SEO ranking has led to it being exclusively used for pushing an opinion or agenda. Articles on popular topics eventually end up biased towards a viewpoint rather than being factual and chronological.
It's funny how you're using this post for soapboxing as the op-ed has nothing at all to do with what you said. This is why vitriolic titles like that shouldn't be allowed.
That's a good point; it's like my local public library.
The problem boils down to this: my local public library spends most of its money on the physical building and physical books. You could inaccurately think that is the only expense of running a library, but there are numerous "in the noise" expenses people don't talk about as much.
You could replace my local public library with a completely non-physical website of ebooks. No physical building or books means that primary expense disappears. That means a ebook library, is completely free, right?
Well it turns out that the "lost in the noise" expenses of running a public library still exist and would then be the primary expenses. At perhaps 5% of previous "brick and mortar" cost. And the city and county were happy to fund a brick and mortar public library at 100% of budget, but you're not getting a penny or at most 1% if you're a "free virtual library" and you need 5% to keep open... Maybe a corporate benefactor could be found? I mean that's how we got our brick and mortar library, thanks, steel monopoly...
Note that my public library costs $3.5M/yr, is one of the best in the state, if not the best, and serves 71K people in my city; that's about $50/yr per resident, and I'm quite happy with it. It sounds like a public ebook library should round down to free. Amazon Kindle Unlimited is currently $120/year plus tax. Isn't that fascinating? The actual cost per user, after all the financial hand waving is done, implies the nicest, newest, largest, most luxurious library in the state is less than half the cost of an inferior virtual replacement. It does make one wonder if "free" Wikipedia is actually costing our civilization in total more than paper encyclopedias did in the 80s. Merely changing the billing model to trick people into thinking something is zero cost, when it's actually more expensive to the overall civilization, isn't very innovative.
For anyone who hesitates like I did, WMF is Wikimedia Foundation, not World Monetary Fund-- there is no World Monetary Fund; it's the IMF -- International Monetary Fund.
> I have never seen any evidence that the WMF has been following standard software engineering principles that were well-known when Mythical Man-Month was first published in 1975. If they had, we would be seeing things like requirements documents and schedules with measurable milestones.
This person appears to be completely ignorant of the changes in software engineering since, let's say, the mid 90s. (Which kind of discredits everything else he writes.)
However, he's introduced as "He runs a consulting business, rescuing engineering projects that have gone seriously wrong."
So basically this is just a consultant's sales pitch.
> So basically this is just a consultant's sales pitch.
I think it may be more a particular mindset, where being in possession of a hammer (probably a very good hammer, nothing wrong with it) makes everything look like a nail. Even if in fact it is a complicated assembly line producing microprocessor chips, and bashing it with a hammer probably won't do any good.
They are handling around 2.9k req/s. In April they had 7.6 billion page views. English articles: 5,400,448.
English wikipedia download has size of 13 GB compressed (expands to over 58 GB when uncompressed).
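The 2.9k req/s figure is consistent with the page-view number, treating one page view as one request (which understates raw HTTP requests, since each page view triggers many):

```python
# ~7.6 billion page views in April (30 days), per the figures above.
page_views = 7.6e9
seconds_in_april = 30 * 24 * 3600

avg_rps = page_views / seconds_in_april
print(f"~{avg_rps:,.0f} page views/second on average")
```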
I remember when, ten years ago, WP being multiple gigabytes seemed huge. Now I could download that in one day and keep the uncompressed version on my phone or on a ten-buck SD card.
ISPs use Wikipedia as a killer app (free access from mobile providers), so make them pay for it. Charge them for access, or demand they host an official mirror. In exchange you don't show begware to the ISP's customers. ISPs will end up competing to provide access, eliminating the bulk of hosting requirements.
But then the WMF couldn't spend all the money on projects and conferences.
Just running the site is such a small fraction of the foundation's expenses that it could easily be financed from donations without aggressive begging banners. The banners are just there to finance what the author of the article calls "cancer".
Frankly, the problem with this op-ed is that there is a very simple solution available to anyone who is concerned about a "doomsday" scenario for Wikipedia:
Mirror it and make it available read-only until that day comes, as a separate "disaster recovery" organization, and raise the minimal funding required for that function (1 employee + 1 dedicated server would be sufficient to the task). More if you wanted to make it usable, but at that point you aren't really a dumb mirror.
> pages-articles.xml.bz2 – Current revisions only, no talk or user pages; this is probably what you want, and is approximately 13 GB compressed (expands to over 58 GB when uncompressed).
You just need to mirror this regularly and maintain a reasonable depth of the backup. (Say, once a month for the past 12 months)
Then, whenever this terrible event destroys Wikipedia, you are in the clear, since you operate under the past licenses for the data, and people will need a new place to go if Wikipedia is genuinely destroyed.
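For the once-a-month, keep-twelve scheme, the bookkeeping is simple enough to sketch. This is a hypothetical helper, not anything the WMF provides; the dump URL is the real one from dumps.wikimedia.org, but the naming scheme is my own invention:

```python
from datetime import date

# Current-revisions dump of English Wikipedia (real URL as of this writing).
DUMP_URL = ("https://dumps.wikimedia.org/enwiki/latest/"
            "enwiki-latest-pages-articles.xml.bz2")

def snapshot_name(today: date) -> str:
    """Local filename for this month's snapshot, e.g. enwiki-2017-05.xml.bz2."""
    return f"enwiki-{today:%Y-%m}.xml.bz2"

def rotate(existing: list, keep: int = 12) -> list:
    """Return the snapshots to delete, keeping only the `keep` newest.
    Relies on the YYYY-MM names sorting chronologically."""
    return sorted(existing)[:-keep] if len(existing) > keep else []
```

Fetch `DUMP_URL` into `snapshot_name(date.today())` from a monthly cron job, then delete whatever `rotate()` returns. At ~13 GB per compressed snapshot, a single server with ~200 GB of disk holds a year of history with room to spare.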
I understand how this is valuable in offsetting load immediately but I think the graver threat here is the disbanding of the contributor community and resultant loss of network-effect value.
I imagine the cost of the disorganization or loss of volunteer contribution effort in the splintering of WP spinoffs/mirrors is orders of magnitude larger than load offset from mirroring.
> I imagine the cost of the disorganization or loss of volunteer contribution effort in the splintering of WP spinoffs/mirrors is orders of magnitude larger than load offset from mirroring.
I'm sure it would be, but at the end of the day I'm not sure that would ultimately be bad, if a monolithic entity failed because of mismanagement.
Nothing stops them from sharing information/pages or linking to each other.
No one outside of WMF is downloading and storing the dumps they provide to the public?
In the past, I have used these for parsing and database-setup practice, making CSVs and local databases that require no network access. I am mainly interested in the reference URLs found in Wikipedia pages.
> No one outside of WMF is downloading and storing the dumps they provide to the public?
No one who is actively marketing them and encouraging contributions at a scale where people are aware of them.
If they existed, you'd have told me who they were. ;)
Simply storing the data isn't enough; you'd need to maintain it, make it publicly available as a mirror of the site, and encourage people to interact, so you have a community to work with in the event of Wikipedia's failure.
I guess my point is the OP should be doing that instead of bitching.
> After we burn through our reserves, it seems likely that the next step for the WMF will be going into debt to support continued runaway spending, followed by bankruptcy. At that point there are several large corporations (Google and Facebook come to mind) that will be more than happy to pay off the debts, take over the encyclopedia, fire the WMF staff, and start running Wikipedia as a profit-making platform. There are a lot of ways to monetize Wikipedia, all undesirable. The new owners could sell banner advertising, allow uneditable "sponsored articles" for those willing to pay for the privilege, or even sell information about editors and users.
I honestly wouldn't complain if Wikipedia was owned and monetized by Google. I think they would recognize the importance of Wikipedia and handle it very carefully. I also think there is a small and very vocal minority on Hacker News who would be outraged, but most of the world wouldn't think twice.
I think majority here would disagree, including me :)
And for Google it's fine as it is, they make billions using the free data people provide through Wikipedia. They just spill out the facts (often wrong) and people never leave Google.
[1] http://www.berkshirehathaway.com/letters/1989.html