Remembering Justapedia
Oh wait, it's still here!

Wikify readers are no doubt familiar with the Mandela effect—the phenomenon in which people collectively misremember a specific event or detail, e.g. that Nelson Mandela died in prison in the 1980s or that the Berenstain Bears children’s books were spelled “Berenstein Bears.” Our favorite example is Fruit of the Loom’s, as some people swear the company’s logo once included a cornucopia basket. We love this one because Fruit of the Loom’s comms team seems so exasperated by the theory that they put up a confrontational FAQ page on their website about it.

Our second-favorite Mandela effect is the belief that a disgruntled Wikipedia editor launched a competitor site called Justapedia back in 2023. Or at least this was our second-favorite Mandela effect until we found out that it really happened and the site is still around. Whoops! The first section of this newsletter digs into the rival encyclopedia’s odd past, present hype, and untenable future. Then we’ve got a brief summary of which news sources you should avoid when seeking a Wikipedia update. It turns out most media outlets that begin with ‘Daily’ are straight garbage! Finally, we’ve got a news recap. Spoiler alert: A dude wearing an “anti-contact non-offending pedophile” shirt pulled out a piece at WikiConference North America! We’ve got the details below.

Seriously. Keep reading.

More than just a pedia?

[Image: The spirit of ’90s graphic design is still alive at Justapedia]

We were all set to write about Grokipedia this week, but then Elon Musk announced the site’s launch was being delayed because “we need to do more work to purge out the propaganda.” So while Elon’s team continues their editorial struggle sessions, we instead turn our attention to a different Wikipedia competitor: Justapedia.

Justapedia launched on August 9, 2023, as a fork of Wikipedia, meaning that all of the English-language Wikipedia content was copied to a new site. Wikipedia operates under a Creative Commons license, which means all its content is free to use, repurpose, or (in this case) copy wholesale into an entirely new digital encyclopedia. Is that legal, you’re asking yourself? It basically is! Wikipedia actually has a long history of being “forked” like this. Co-founder and cantankerous contrarian Larry Sanger is responsible for two such projects: Citizendium, a 2006 fork that attempted to elevate expertise over anonymity by requiring contributors to use their real names, and Everipedia, a 2015 fork run on blockchain technology that rewarded contributors with “IQ tokens.”

As you might have guessed, neither fork succeeded.

Back to Justapedia, though: The site was the brainchild of Betty Wills, a disgruntled former Wikipedia contributor. Her personal website also describes her as a “writer, TV producer, photographer, and equestrian,” among other pastimes.

Upon launch, the Justapedia home page featured a notice that the site’s content “originated from Wikipedia and its volunteers, many of whom are also Justapedians dedicated to restoring and maintaining the spirit of objectivity and neutrality that Wikipedia has long since lost.” Above this was a quote from Albert Einstein: “We can’t solve problems by using the same kind of thinking we used when we created them.” (Albert Einstein, of course, never actually said this, so the quote didn’t exactly inspire confidence in the site’s commitment to accuracy.)

Justapedia soon caught the attention of Larry Sanger (naturally), and in December 2023 he endorsed it on Twitter and then tagged Elon Musk, suggesting he do the same. And, to be fair, there was some reason for optimism, as several prominent Wikipedia editors also hopped on board.

But beyond a promise to be different, it was initially unclear how the would-be Wikipedia competitor would distinguish itself from the site it had copied more than 6 million articles from.

A 2023 Quillette piece profiling the site noted that “Justapedia’s content policies, as regards principles such as sourcing and neutrality, are also almost entirely the same as those at Wikipedia.” The only difference, then, was in how those guidelines would be enforced and how editorial disputes would ultimately be resolved:

Wikipedia uses a consensus-based model of editing, in which the outcome of disputes is ostensibly based on which side can better support its arguments. However, in practice, outcomes under this system are often determined either by majority opinion or by which side of a dispute receives support from users with the greater social clout. This model of editing can allow Wikipedia’s content policies to be subverted if the dominant group are not adequately informed about a topic, if they allow their ideology to impede their commitment to the site’s policies, or if the side that has policy on their side gives up out of frustration or exhaustion. Because Wikipedia’s Arbitration Committee cannot make decisions about article content, they are typically unable to address such situations. (Jimmy Wales cannot address them either, because he is mostly a figurehead with no real control over the site’s contents.)

Therein lies the rub. Wikipedia, for all of its faults (and there are many), remains an experiment in collective decision-making. You can say that decision-making often resembles Calvinball, and you would be right. But there’s a magic in the madness. Justapedia, conversely, was doomed from the start to be a hierarchical failure, with some animals more equal than others. When people get mad at Wikipedia, they blame the system. When people get mad at Justapedia, they blame Betty Wills—because she alone has the power to overrule them at any time.

This was evident in an early dispute over Justapedia’s Epoch Times article, documented on the r/Wikipedia subreddit:

This [Epoch Times] article has been compared to its Wikipedia version. Notably, the article was edited and protected by a single editor, Factsfirst, and only one other editor, JustMe [the account belonging to Betty Wills], was allowed to make changes. This restriction led to controversy when two editors raised concerns. They argued that the article incorrectly labeled ‘Epoch Times’ as center-right instead of far-right and also highlighted some apparent direct copying from the newspaper’s website. Following this, JustMe made minor edits to the article.

The subreddit post also included a screenshot of the Epoch Times Talk page discussion featuring pointed questions from Scope creep, an editor who should be familiar to anyone active on Wikipedia:

[Screenshot: the Epoch Times Talk page discussion on Justapedia]

The discussion above occurred two years ago, but there’s no evidence that the site’s Betty Wills-centered editorial model has changed since then. That said, there evidently is some renewed interest of late: Wills recently told UnHerd that the site’s traffic has surged in recent months, with page views “increasing by 19% last month alone,” possibly due to increased prominence in Google search results. We’ve never seen it rank anywhere near the first page or get cited by ChatGPT, Perplexity, etc., but maybe we’re not searching for the right terms. When Justapedia has received attention, it’s been for highlighting how its ostensibly enlightened editors have treated topics like Antifa and fascism compared to Wikipedia’s “woke” versions.

We can expect a similar side-by-side comparison of these topics when (if?) Grokipedia finally debuts.

What sources should be avoided on Wikipedia?

You can’t succeed on Wikipedia without strong sources. This applies whether you’re creating a brand-new article or improving an existing one. We’ve previously discussed what editors look for when judging whether a news outlet is reliable. But what about the opposite—how can you spot sources that should be avoided?

For starters, some websites are so unreliable they’re outright blacklisted from Wikipedia. These outlets are often blocked by the site’s spam blacklist, which prevents promotional or spammy links from being added. If you try to include one of these links in an article, your edit won’t save and you’ll see a message saying that a link you used is blacklisted.

Then there are outlets like the Daily Mail, Daily Star, and Daily Caller, which aren’t blocked as spam but have been “deprecated” because editors deemed their content too unreliable to cite. The “perennial sources” list is the go-to resource for checking which outlets are approved or deprecated (essentially, banned) on Wikipedia.

It may surprise PR and communications professionals that press releases generally can’t be cited on Wikipedia. While there are narrow exceptions, in practice it’s best to assume that press releases aren’t acceptable sources.

Because Wikipedia editors rely on sources with strong fact-checking and editorial oversight—they can’t verify information themselves—many types of content are ruled out. That includes contributor or sponsored posts (such as Forbes contributor articles or anything on Medium), as well as podcasts, YouTube videos, and anything from social media platforms. Even LinkedIn, which Google’s AI increasingly surfaces in search results, isn’t allowed: posts there are considered user-generated content and therefore not reliable enough for citation.

In the podcast episode embedded above, Wikify contributor and Lumino co-founder Rhiannon Ruff talks with PR & Lattes host Matisse Hamel-Nelis about the role of Wikipedia in PR and comms and the types of media coverage that are most impactful on the site.

Wikipedia in the news

“Ohio man charged after brandishing gun at New York City Wikipedia conference” | The Guardian (October 19, 2025)

Connor Weston, 27, was reportedly tackled by organizers of WikiConference North America 2025, thwarting tragedy, before police said officers booked him on counts of criminal possession of a weapon and reckless endangerment.
The Dayton resident had evidently paid to attend the four-day conference when he disrupted its opening ceremony at 9am Friday in Manhattan’s Civic Hall. He jumped on to a stage at the venue, pointed a gun at his head and the ceiling, and expressed a desire to take his life while a sign draped around his neck declared him to be an “anti-contact non-offending pedophile”, police and multiple media reports said.
According to the New York Times, conference safety team member Richard Knipel rushed the stage and clutched Weston from behind amid the chaos.
[...]
The Times reported that conference attenders thanked Knipel by lavishing him with Barnstars, which are Wikipedia’s official tokens of appreciation.

Our Take: A story like this is more about mental health and gun access than Wikipedia. That said, props to Richard Knipel and The Wikipedia Revolution author Andrew Lih for acting quickly to restrain the would-be gunman.

“Wikipedia says traffic is falling due to AI search summaries and social video” | TechCrunch (October 18, 2025)

Miller says the foundation welcomes “new ways for people to gain knowledge” and argues this doesn’t make Wikipedia any less important, since knowledge sourced from the encyclopedia is still reaching people even if they don’t visit the website. Wikipedia even experimented with AI summaries of its own, though it paused the effort after editors complained.
But this shift does present risks, particularly if people are becoming less aware of where their information actually comes from. As Miller puts it, “With fewer visits to Wikipedia, fewer volunteers may grow and enrich the content, and fewer individual donors may support this work.” (Some of those volunteers are truly remarkable, reportedly disarming a gunman at a Wikipedia editors’ conference on Friday.)
For that reason, he argues that AI, search, and social companies using content from Wikipedia “must encourage more visitors” to the website itself.
And he says Wikipedia is taking steps of its own — for example, by developing a new framework for attributing content from the encyclopedia. The organization also has two teams tasked with helping Wikipedia reach new readers, and it’s looking for volunteers to help.

Our Take: This is the first we’ve heard of a new attribution framework. Is Miller just talking about an MLA / APA-style standardized citation? Or… a far more intriguing AI agent of some kind that works behind the scenes to interface with bots and AI queries? We’ll keep an eye on this potentially interesting development.

“Grokipedia: Elon Musk is right that Wikipedia is biased, but his AI alternative will be the same at best” | The Conversation (October 15, 2025)

Bias on collaborative platforms often emerges from who participates rather than top-down policies. Voluntary participation introduces what social scientists call self-selection bias: people who choose to contribute tend to share similar motivations, values and often political leanings.

Our Take: We’ll have more to say about Grokipedia soon, but in the meantime we thought this piece did a nice job of laying out the editorial challenges any would-be Wikipedia successor will face.

Ready to learn more about Wikipedia? Check out our book!

Thanks for reading Wikify!


