Dozens of PlayStation 3s sit in a refrigerated shipping container on the University of Massachusetts Dartmouth’s campus, sucking up energy and investigating astrophysics. It’s a popular stop for tours trying to sell the school to prospective first-year students and their parents, and it’s one of the few living legacies of a weird science chapter in PlayStation’s history.
Those squat boxes, hulking on entertainment systems or dust-covered in the back of a closet, were once coveted by researchers who used the consoles to build supercomputers. With the racks of machines, the scientists were suddenly capable of contemplating the physics of black holes, processing drone footage, or winning cryptography contests. It only lasted a few years before tech moved on, becoming smaller and more efficient. But for that short moment, some of the most powerful computers in the world could be hacked together with code, wire, and gaming consoles.
Researchers had been messing with the idea of using graphics processors to boost their computing power for years. The idea is that the same power that made it possible to render Shadow of the Colossus’ grim storytelling was also capable of doing massive calculations — if researchers could configure the machines the right way. If they could link them together, suddenly, those consoles or computers started to be far more than the sum of their parts. This was cluster computing, and it wasn’t unique to PlayStations; plenty of researchers were trying to harness computers to work as a team, trying to get them to solve increasingly complicated problems.
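The principle behind cluster computing can be sketched in a few lines: split one big calculation into chunks, hand each chunk to a separate worker (in a real cluster, a separate machine on the network), then combine the partial results. This toy Python sketch uses threads in place of networked consoles, and a sum of squares as a stand-in for a real simulation; actual PS2/PS3 clusters distributed work over Ethernet with message-passing libraries such as MPI.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(chunk):
    # Stand-in for one node's share of a larger calculation:
    # here, just the sum of squares of its assigned values.
    return sum(x * x for x in chunk)

# Split the problem across four "nodes" (threads here; machines in a real cluster)
data = list(range(1000))
chunks = [data[i::4] for i in range(4)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(simulate_chunk, chunks))

# Combining the partial results gives the same answer a single machine would
total = sum(partials)
print(total)
```

The payoff is the last line: the combined result is identical to what one machine would compute alone, only arrived at faster because the work ran in parallel.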
The game consoles entered the supercomputing scene in 2002 when Sony released a kit called Linux for the PlayStation 2. “It made it accessible,” Craig Steffen said. “They built the bridges, so that you could write the code, and it would work.” Steffen is now a senior research scientist at the National Center for Supercomputing Applications (NCSA). In 2002, he had just joined the group and started working on a project with the goal of buying a bunch of PS2s and using the Linux kits to hook them (and their Emotion Engine central processing units) together into something resembling a supercomputer.
They hooked up between 60 and 70 PlayStation 2s, wrote some code, and built out a library. “It worked okay, it didn’t work superbly well,” Steffen said. There were technical issues with the memory — two specific bugs that his team had no control over.
“Every time you ran this thing, it would cause the kernel on whatever machine you ran it on to kind of go into this weird unstable state and it would have to be rebooted, which was a bummer,” Steffen said.
They shut the project down relatively quickly and moved on to other questions at the NCSA. Steffen still keeps one of the old PS2s on his desk as a memento of the program.
But that’s not where PlayStation’s supercomputing adventures met their end. The PS3 entered the scene in late 2006 with powerful hardware and an easier way to load Linux onto the devices. Researchers would still need to link the systems together, but now they could imagine building something that was a game-changer instead of just a proof-of-concept prototype.
That’s certainly what black hole researcher Gaurav Khanna was imagining over at UMass Dartmouth. “Doing pure period simulation work on black holes doesn’t really typically attract a lot of funding, it’s just because it doesn’t have too much relevance to society,” Khanna said.
Money was tight, and it was getting tighter. So Khanna and his colleagues were brainstorming, trying to think of solutions. One of the people in his department was an avid gamer and mentioned the PS3’s Cell processor, which was made by IBM. A similar kind of chip was being used to build advanced supercomputers. “So we got kind of interested in it, you know, is this something interesting that we could misuse to do science?” Khanna says.
Inspired by the specs of Sony’s new machine, the astrophysicist started buying up PS3s and building his own supercomputer. It took Khanna several months to get the code into shape and months more to get the program into working order. He started with eight consoles; by the time he was done, he had his own supercomputer, pieced together out of 176 of them and ready to run his experiments, with no jockeying for space or paying other researchers to run his simulations of black holes. Suddenly, he could run complex computer models or win cryptography competitions at a fraction of the cost of a more typical supercomputer.
Around the same time, other researchers were having similar ideas. A group in North Carolina also built a PS3 supercomputer in 2007, and a few years later, at the Air Force Research Laboratory in New York, computer scientist Mark Barnell started working on a similar project called the Condor Cluster.
The timing wasn’t great. Barnell’s team proposed the project in 2009, just as Sony was shifting toward the pared-back PS3 Slim, which, unlike the original PS3, couldn’t run Linux. After a hack, Sony even issued a firmware update that pulled OtherOS, the feature that allowed people to run Linux, from existing PS3 systems. That made finding useful consoles even harder. The Air Force had to convince Sony to sell it the un-updated PS3s that the company was pulling from shelves, which, at the time, were sitting in a warehouse outside Chicago. It took many meetings, but eventually, the Air Force got what it was looking for, and in 2010, the project had its big debut.
Running on more than 1,700 PS3s connected by five miles of wire, the Condor Cluster was huge, dwarfing Khanna’s project, and it was used to process images from surveillance drones. During its heyday, it was the 35th-fastest supercomputer in the world.
But none of this lasted long. Even as these projects were being built, supercomputers were advancing and becoming more powerful. At the same time, gaming consoles were simplifying, making them less useful to science. The PlayStation 4 outsold both the original PlayStation and the Wii, nearing the best-selling status currently held by the PS2. But for researchers, it was nearly useless. Like the slimmer version of the PlayStation 3 released before it, the PS4 can’t easily be turned into a cog for a supercomputing machine. “There’s nothing novel about the PlayStation 4, it’s just a regular old PC,” Khanna says. “We weren’t really motivated to do anything with the PlayStation 4.”
The era of the PlayStation supercomputer was over.
The one at UMass Dartmouth is still working, humming with life in that refrigerated shipping container on campus. The machine is smaller than it was at its peak, when it ran about 400 PlayStation 3s. Parts of it have been cut out and repurposed. Some are still working together in smaller supercomputers at other schools; others have broken down or been lost to time. Khanna has since moved on to linking smaller, more efficient devices together into his next-generation supercomputer. He says the Nvidia Shield devices he’s working with now are about 50 times more efficient than the already-efficient PS3.
It’s the Air Force’s supercluster of super consoles that had the most star-studded afterlife. When the program ended about four years ago, some consoles were donated to other programs, including Khanna’s. But many of the old consoles were sold off as old inventory, and a few hundred were snapped up by people working with the TV show Person of Interest. In a ripped-from-the-headlines move, the consoles made their small-screen debut in the show’s season 5 premiere, playing — wait for it — a supercomputer made of PlayStation 3s.
“It’s all Hollywood,” Barnell said of the script, “but the hardware is actually our equipment.”
Correction, 7:05 PM ET: Supercomputer projects needed the original PS3, not the PS3 Slim, because Sony had removed Linux support from the console in response to hacks — which later led to a class-action settlement. This article originally stated that it was because the PS3 Slim was less powerful. We regret the error.
Calling BS. Reference? More likely they were just missing the gigabit Ethernet, which represents zero loss in processing power.
Yeah, this seems off. The PS3 and PS3 slim had the exact same processor, albeit at different sizes for better cooling and lower power consumption. The slim lost the original PS3’s emotion engine (which itself was from the PS2), but that wouldn’t have been a factor. What actually happened is probably like you say, slower ethernet or the loss of some kind of I/O.
I think they couldn’t use the PS3 slim because the Linux compatibility was removed. But of course, that had nothing to do with computing power.
As always the real answer is in the comments.
Mary Beth confirmed that this is the reason, and updated the post later in the day. New for today: I managed to stuff my launch 60GB PS3 into a backpack and brought it to work for photos, so we could put pictures of the proper PS3 into the story as well! Sorry for all the scuffs around the HDMI port, I did a LOT of fumbling with that cable in my entertainment center over the years.
Gotta admit I rarely used the 60GB PS3’s backwards compatibility, though… was much more convenient to use HDLoader on my OG PS2 to load games from a hard drive! Really cool use of the optional network adapter/hard drive.
Yeah, that is some pure BS. There was no power gap between the original PS3 and the Slim. The Slim lost the PS2 Emotion Engine, which wasn’t being used in supercomputing anyway. The graphics and performance of the original PS3 and the Slim were in the same ballpark for the most part, and they had to keep it that way so as not to alienate early adopters.
I also found this remark to be odd.
I assumed this was going to be about the folding@home option, glad I was wrong.
I have fond memories of using that thing! Or, to be more precise, of not using my PS3 so it could do its thing.
This is honestly a good example of how unfocused Sony was with the PS3, and of why they pivoted so hard with the PS4 to being a straight-up gaming machine. Projects like these were cool, but Sony spreading its attention across things like Linux on the PS3 is part of why the PS3 was a failure until late in its lifespan.
Actually, the Linux feature was included so the PS3 could be classified as a computer instead of a gaming console and thereby be exempt from import taxes in the EU.
Uh, I’m thinking the $600 price tag had more to do with its early failure.
Don’t remind me. I remember waiting in line overnight (with 50 other people) for the Xbox 360 on its launch day. I didn’t plan on getting the PS3 on its launch day but I happened to go into Best Buy on that day thinking they would be sold out. Nope, there was a huge circular stack of both PS3 configurations ready for people to buy. Nobody waited in line overnight, they were just sitting there. I picked one up but still, that $600 pill was a hard one to take. Still better than the $1500 Samsung Blu-ray player sitting in the home theater section.
The price tag was a result of all those things. Putting a million I/O ports on the machine, using a super-powerful CPU that was expensive and not a great fit for a game machine, and trying to make it as much a multimedia machine as a game console jacked the price way up.
If they picked the wrong CPU, that was just their stupidity.
Gaming is basically the most silicon-intensive task any average person will ever encounter.
There is no same-generation multimedia application that couldn’t be covered with the CPU in a gaming console starting with the original Playstation.
The only hardware in the box that legitimately increased the price of the machine was the bluray drive, and that’s a large part of why many people eventually bought a PS3.
Other OS was a trifling side project. The failures came from the cost: it was too high for many consumers even though the console was sold at a large loss early on, with the Blu-ray drive being particularly expensive (they "won" the format war, but I wonder if it was worth it, with streaming crushing everything).
The other thing was banking on Cell so heavily early on and the Toshiba partnership for a GPU falling through (the dual-Cell design was an early drawing-board idea that was quickly abandoned, despite popular talking points). RSX was rushed and underwhelming even launching a year later than Xenos, which, with unified shaders and unified memory, pulled well ahead of it. Late in the generation, when devs learned to use Cell properly, a lot of what it ended up doing was making up for how much RSX underwhelmed, doing things like pre-culling textures and vertex processing to make life easier for RSX.
Sounds like a very plausible explanation. Millions of consumers researched whether Sony was focusing on Linux for PS3, and made their purchasing decisions based on that, rather than the price point.
Sony focusing on things like Linux, making the PS3 the ultimate multimedia machine, and using a CPU that was expensive and not the best fit for gaming is what put the PS3 at that price point.
I’m sorry, you’re gonna have to give more details on this.
@Mary Beth Griggs, time to do more research about the articles.
I knew the Person of Interest plotline was ripped from reality, but I had no idea that the hardware shown in those episodes WAS reality — meaning they used parts of the actual supercomputing cluster. As a fan of the show, this was an amazing detail to learn (and emblematic of why I was a fan in the first place).
There’s some theorizing that the PS5 may borrow from SSG technology, where the GPU can directly interface with NAND rather than going through the rest of the system, which benefits data analysis on huge datasets and some large renders:
https://www.amd.com/en/products/professional-graphics/radeon-pro-ssg
This was dropped from $7,000 to "just" $4,600, so the PS5 would be by far the cheapest variant of this available, if true.
That’s where I wish Other OS was coming back…To get your hands on it for that cost would bring us back to the era of the PS3 having compute applications well above its price range, if what you’re after is the pure performance rather than the pro support that comes with true SSG cards. If you’re, say, a data science team who wants to do some playing on such technology but aren’t sure if they’re going full bore on it, $500 could be a much easier justification than several grand.
The trouble for Sony is that they’re subsidizing the hardware, so how does the "razors and razor blades" model work for general computing? MS gets a license fee for Windows; Sony has to price the console to compete.
I’d love to see them figure it out and release a custom AMD design optimised for graphics and AI.
They didn’t subsidize much at all in the 8th gen, selling for a small profit or near at cost at launch, afaik. After the financial disaster of the early 7th gen, I’m not sure they’ll do it again, at least not as much.
They have no incentive to add other OS though, I agree, and it just opens up hacking attack vectors. If it does have SSG though, it would have been really cool to play with it at a fraction of the cost.
The biggest thing that killed this is that while the Cell chip (PS3) is a single-precision floating point monster, detailed simulations want double precision. Single precision (32-bit) only gives you 6-9 significant decimal digits, and detailed simulations quickly blow past that without some serious extra design work and headaches. Since these simulations are often exquisitely delicate and sensitive to minor offsets (a butterfly flapping its wings, etc.), your particle velocity in one cell going from 3.123456 m/s to 3.123455 m/s due to lack of precision is a really big deal, and it gets really bad when you add numbers of dissimilar magnitudes.
A double (64-bit) gives you 15-17 significant decimal digits and will obviously give you much more accurate simulations with much less effort (there’s even an 80-bit extended double format). Of course, doubles are a much more significant burden on the hardware. But once GPUs became able to crush both single- and double-precision floating point numbers, messing with networking an obsolete console that Sony didn’t want you using that way anymore was obviously doomed.
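The precision gap the commenter describes is easy to demonstrate. This small Python sketch (using only the standard library's struct module) round-trips a value through 32-bit storage, showing that single precision can't even hold the seven significant digits of the commenter's example exactly, while Python's native 64-bit doubles preserve them with room to spare:

```python
import struct

def as_float32(x):
    # Round-trip a Python float (a 64-bit double) through 32-bit storage,
    # discarding the precision a single-precision register would lose.
    return struct.unpack('f', struct.pack('f', x))[0]

v = 3.123456                  # ~7 significant digits, as in the example above
v32 = as_float32(v)

print(f"double:  {v:.12f}")   # keeps all the digits we wrote
print(f"float32: {v32:.12f}") # drifts after ~7 significant digits
print(f"error:   {abs(v32 - v):.2e}")
```

A per-step error around 1e-7 sounds tiny, but in a long-running simulation those errors accumulate with every operation, which is why double precision saves so much design effort.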
Regarding the PS4 being "a regular old PC", does this mean that, even if it could have run the Linux kit, it would not have been useful? What was wrong with it?
Well, the main thing was that you could build a PC with a better CPU and the same GPU performance for slightly less money. Sure, you don’t get the Blu-ray drive or the controller, and it wouldn’t be as good for games because consoles get extensive optimization, but for simulations none of this matters.
On the other hand, the PS3’s CPU was quite interesting. It was slower but more general than a GPU, and faster but less flexible than desktop CPUs. Additionally, the Cell in the PS3 had a big brother sold to HPC customers, so once you optimized code for the PS3, switching to the big one was reasonably easy – much easier than first optimizing for the desktop CPU you were developing on. Plus, back then, simulations on GPUs barely existed; these days most would opt for more powerful GPUs even if a Cell-like CPU were released.
I’m still super bitter that Sony got rid of Folding@Home. It was my favorite news app and it helped the world. I really wish they’d bring it back.
Oh boy, do I have news for you: there’s always BOINC for PCs (I recommend joining World Community Grid). PCs nowadays are plenty fast and efficient (looking at you, Ryzen 3700X), so if you’re interested in helping the science world, do it!
Great project, which showed how unfocused Sony’s games console strategy was back then.
Trying to sell Cell chips by sticking them in your games console was insane.
Minor correction – it was called OtherOS, not "OpenOS".
Is PS3 Slim less powerful than PS3 fat?
No. Poor research by the journalist.
My original comment was deleted…because I recommended ArsTechnica if you want well researched and informed tech articles…unlike this one.