Google uncloaks once-secret server
Google for the first time showed off its server design. (Click to enlarge)
(Credit: Stephen Shankland/CNET)
Updated at 4:08 p.m. PDT April 1 with further details about Google's data center efficiency and shipping container modules and 6:30 a.m. April 2 to correct the time frame of efficiency statistics.
MOUNTAIN VIEW, Calif.--Google is tight-lipped about its computing operations, but the company for the first time on Wednesday revealed the hardware at the core of its Internet might at a conference here about the increasingly prominent issue of data center efficiency.
Most companies buy servers from the likes of Dell, Hewlett-Packard, IBM, or Sun Microsystems. But Google, which has hundreds of thousands of servers and considers running them part of its core expertise, designs and builds its own. Ben Jai, who designed many of Google's servers, unveiled a modern Google server before the hungry eyes of a technically sophisticated audience.
Google server designer Ben Jai
(Credit: Stephen Shankland/CNET)
Google's big surprise: each server has its own 12-volt battery to supply power if there's a problem with the main source of electricity. The company also revealed for the first time that since 2005, its data centers have been composed of standard shipping containers--each with 1,160 servers and a power consumption that can reach 250 kilowatts.
It may sound geeky, but a number of attendees--the kind of folks who run data centers packed with thousands of servers for a living--were surprised not only by Google's built-in battery approach, but by the fact that the company has kept it secret for years. Jai said in an interview that Google has been using the design since 2005 and now is in its sixth or seventh generation of design.
"It was our Manhattan Project," Jai said of the design.
Google has an obsessive focus on energy efficiency and now is sharing more of its experience with the world. With the recession pressuring operations budgets, environmental concerns waxing, and energy prices and constraints increasing, the time is ripe for Google to do more efficiency evangelism, said Urs Hoelzle, Google's vice president of operations.
"There wasn't much benefit in trying to preach if people weren't interested in it," said Hoelzle, but now attitudes have changed.
The company also focuses on data center issues such as power distribution, cooling, and ensuring hot and cool air don't intermingle, said Chris Malone, who's involved in data center design and efficiency measurement. Google's data centers have now reached efficiency levels that the Environmental Protection Agency hopes will be attainable in 2011 using advanced technology.
"We've achieved this now by application of best practices and some innovations--nothing really inaccessible to the rest of the market," Malone said.
The rear side of Google's server.
(Credit: Stephen Shankland/CNET)
Why built-in batteries?
Why is the battery approach significant? Money.
Typical data centers rely on large, centralized machines called uninterruptible power supplies (UPS)--essentially giant batteries that kick in when the main supply fails, bridging the gap until the generators have time to start. Building the power supply into the server is cheaper and means costs are matched directly to the number of servers, Jai said.
"This is much cheaper than huge centralized UPS," he said. "Therefore no wasted capacity."
Efficiency is another financial factor. Large UPSs can reach 92 to 95 percent efficiency, meaning that 5 to 8 percent of the power passing through them is squandered. The server-mounted batteries do better, Jai said: "We were able to measure our actual usage to greater than 99.9 percent efficiency."
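The gap between those figures matters more than it might sound at data center scale. Here is a minimal back-of-the-envelope sketch of what it implies, assuming a hypothetical 10-megawatt facility (the load figure is an illustrative assumption, not a number Google disclosed):

```python
# Back-of-the-envelope comparison of power lost to conversion overhead.
# The 10 MW facility load is a hypothetical figure, not one from the article.

def power_lost_kw(input_kw: float, efficiency: float) -> float:
    """Power fed in from the grid that never reaches the servers."""
    return input_kw * (1.0 - efficiency)

facility_input_kw = 10_000  # assumed 10 MW of grid power

for label, eff in [("centralized UPS at 92%", 0.92),
                   ("centralized UPS at 95%", 0.95),
                   ("per-server battery at 99.9%", 0.999)]:
    print(f"{label}: {power_lost_kw(facility_input_kw, eff):,.0f} kW lost")

# centralized UPS at 92%: 800 kW lost
# centralized UPS at 95%: 500 kW lost
# per-server battery at 99.9%: 10 kW lost
```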
Urs Hoelzle, Google's vice president of operations
(Credit: Stephen Shankland/CNET)
The Google server was 3.5 inches thick--2U, or 2 rack units, in data center parlance. It had two processors, two hard drives, and eight memory slots mounted on a motherboard built by Gigabyte. Google uses x86 processors from both AMD and Intel, Jai said, and Google uses the battery design on its network equipment, too.
Efficiency is important not just because improving it cuts power consumption costs, but also because inefficiencies typically produce waste heat that requires yet more expense in cooling.
Costs add up
Google operates servers at a tremendous scale, and these costs add up quickly.
Jai has borne a lot of the burden himself. He was the only electrical engineer on the server design job from 2003 to 2005, he said. "I worked 14-hour days for two and a half years," he said, before more employees were hired to share the work.
Google has patents on the built-in battery design, "but I think we'd be willing to license them to vendors," Hoelzle said.
Another illustration of Google's obsession with efficiency comes through power supply design. Power supplies convert conventional AC (alternating current--what you get from a wall socket) electricity into the DC (direct current--what you get from a battery) electricity, and typical power supplies provide computers with both 5-volt and 12-volt DC power. Google's designs supply only 12-volt power, with the necessary conversions taking place on the motherboard.
Google's data center efficiency has been improving gradually.
(Credit: Stephen Shankland/CNET)
That adds $1 or $2 to the cost of the motherboard, but it's worth it not just because the power supply is cheaper, but because the supply can then run closer to its peak capacity, where it is much more efficient. Google even pays attention to the greater efficiency of transmitting power over copper wires at 12 volts compared to 5 volts.
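The 5-volt-versus-12-volt point is straightforward Ohm's-law arithmetic: delivering the same wattage at 5 volts requires more than twice the current of a 12-volt rail, and resistive loss in the copper grows with the square of that current. A rough sketch, where the per-server load and wire resistance are illustrative assumptions rather than figures from the article:

```python
# Resistive loss in distribution wiring scales as I^2 * R, so higher-voltage,
# lower-current delivery wastes less power for the same load.
# The 200 W load and 5-milliohm wire resistance are illustrative assumptions.

def wire_loss_w(load_w: float, volts: float, wire_ohms: float) -> float:
    current_a = load_w / volts         # I = P / V
    return current_a ** 2 * wire_ohms  # loss = I^2 * R

load_w = 200.0     # hypothetical per-server draw
wire_ohms = 0.005  # assumed resistance of the copper run

for volts in (5.0, 12.0):
    amps = load_w / volts
    loss = wire_loss_w(load_w, volts, wire_ohms)
    print(f"{volts:>4.0f} V rail: {amps:5.1f} A, {loss:.2f} W lost in the wiring")

#    5 V rail:  40.0 A, 8.00 W lost in the wiring
#   12 V rail:  16.7 A, 1.39 W lost in the wiring
```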
Google also revealed new performance results for data center energy efficiency measured by a standard called power usage effectiveness. PUE, developed by a consortium called the Green Grid, measures how much power goes directly to computing compared to ancillary services such as lighting and cooling. A perfect score of 1 means no power goes to the extra costs; 1.5 means that ancillary services consume half the power devoted to computing.
Google's PUE scores are enviably low, but the company is working to lower them further. In the third quarter of 2008, Google's PUE was 1.21, but it dropped to 1.20 for the fourth quarter and to 1.19 for the first quarter of 2009 through March 15, Malone said.
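Put differently, PUE is total facility power divided by the power that actually reaches the IT equipment, so a given score translates directly into an overhead percentage. A small sketch of that arithmetic using the quarterly figures quoted above (the 1,190 kW / 1,000 kW example facility is an assumption for illustration):

```python
# PUE = total facility power / power delivered to the IT equipment.
# The 1,190 kW / 1,000 kW example facility is hypothetical.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: 1.0 is perfect, higher means more overhead."""
    return total_facility_kw / it_equipment_kw

def overhead_share(pue_value: float) -> float:
    """Fraction of total facility power consumed by non-IT overhead."""
    return 1.0 - 1.0 / pue_value

print(pue(1_190, 1_000))  # 1.19

for quarter, value in [("Q3 2008", 1.21), ("Q4 2008", 1.20), ("Q1 2009", 1.19)]:
    print(f"{quarter}: PUE {value:.2f} -> "
          f"{overhead_share(value):.1%} of facility power is overhead")

# Q3 2008: PUE 1.21 -> 17.4% of facility power is overhead
# Q4 2008: PUE 1.20 -> 16.7% of facility power is overhead
# Q1 2009: PUE 1.19 -> 16.0% of facility power is overhead
```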
Older Google facilities generally have higher PUEs, he said; the best has a score of 1.12. When the weather gets warmer, Google notices that it's harder to keep servers cool.
An excerpt from a video tour Google presented of its data center containers. Like conventional data centers, Google's shipping containers have raised floors.
(Credit: Stephen Shankland/CNET)
Shipping containers
Most people buy computers one at a time, but Google thinks on a very different scale. Jimmy Clidaras revealed that the core of the company's data centers is composed of standard 1AAA shipping containers packed with 1,160 servers each, with many containers in each data center.
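Taken together with the 250-kilowatt figure mentioned earlier, those numbers imply a power budget of a little over 200 watts per server. A trivial sketch of that arithmetic (both inputs come from the talk; nothing else is assumed):

```python
# Per-server power budget implied by Google's container figures.
servers_per_container = 1_160
container_peak_kw = 250  # stated maximum draw per container

watts_per_server = container_peak_kw * 1_000 / servers_per_container
print(f"~{watts_per_server:.0f} W per server at the container's peak draw")
# ~216 W per server at the container's peak draw
```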
Modular data centers are not unique to Google; Sun Microsystems and Rackable Systems both sell them. But Google started using them in 2005.
Google's first experiments had some rough patches, though, Clidaras said--for example, the first crane they used wasn't big enough to actually lift one.
Overall, Google's choices have been driven by a broad analysis of cost that encompasses software, hardware, and facilities.
"Early on, there was an emphasis on the dollar per (search) query," Hoelzle said. "We were forced to focus. Revenue per query is very low."
Mainstream servers with x86 processors were the only option, he added. "Ten years ago...it was clear the only way to make (search) work as a free product was to run on relatively cheap hardware. You can't run it on a mainframe. The margins just don't work out," he said.
Operating at Google's scale has its challenges, but it also has its silver linings. For example, a given investment on research can be applied to a larger amount of infrastructure, yielding return faster, Hoelzle said.
A diagram of a Google modular data center
(Credit: Stephen Shankland/CNET)
[CNET editors' note: Prohibited content deleted.]
@ gyvancy: So you don't find it bizarre that a giant company like Google uses thousands of single batteries instead of UPSs?
Honestly, I do think it's kind of bizarre that Google would make modular servers like this and then have the ecological nightmare of battery recycling.
Since when did a technically sophisticated audience become fools?
I've been a big advocate of 12V power systems and direct-to-battery power for UPSs. It doesn't make sense to get a big UPS that brings the 12V up to 110/220V only to have the computer's PS bring it back down to 12V/5V/3.3V.
Actually, most of my own computers are 12V capable, so I can always patch them into the car's battery if power goes out.
The Onion is a little more 'subtly' blatant about its satire. Fool on you for not believing in simple solutions
The need to conserve power would be key to staying profitable - the less power you use, the more money you make - and Google is on 24/7.
/s
Google is the most hypocritical company on earth
Gigabyte GA-9IVDP
I cannot find that part number on Gigabyte's page, but here is a link to DRAM upgrades.
http://www.memory-up.com/Memory/GigabyteGA-921395.html
It is probably a custom board manufactured under contract. Particularly the 12V only part of it.
If it's "secret," you don't file a patent on it. You protect it as a trade secret. I bet Google knows that their patent is about to be granted, so a) they have no need to keep it secret and b) it would be public information soon anyway, since that's what patents are for - you get exclusivity for some time in exchange for publicly disclosing your inventions.
And if the main power fails, it won't matter if the server is working if all the switches are down. That means Google has all the switches on batteries too.
But I think the real advantage of the onboard battery is that they can get rid of redundant power supplies, which are expensive and use additional power. I am sure if the PS goes down, the battery kicks in and the data center engineers get alerted, and they can replace the PS within a couple of hours without any interruption to the operation.
Frankly, if the main power feeds fail, I wouldn't be too worried about switches. The main concern is keeping the cooling systems running to prevent a meltdown -- especially at the higher temps Google runs its data centers at. My guess is that Google still uses traditional UPS or flywheel systems for its cooling infrastructure.
During the CA rolling blackouts in 2000, my laptop kept operating when the power went out, as it had both wall power and a battery. The battery stayed charged until the blackout, then it ran for 3 hours on battery - AUTOMAGICALLY.
The article is referring to servers. How many laptops do you know of with dual CPUs (not dual core, but two physical CPUs), what looks to be at least 8 GB of RAM, and 2 hard drives!
You APRIL FOOLS!
http://ieeexplore.ieee.org/xpls/abs_all.jsp?tp=&arnumber=1022321&isnumber=21994
There are commercial products available - PC Power supply with built-in DC UPS
http://www.tri-m.com/products/engineering/hesc104.html
http://atlantis.com.ua/rpstr/catalog/Micronix_pv-5127_uk.pdf
http://www.amtrade.com/pc_power/small_uninterruptible_power.htm
It's a smart implementation by Google, but not something new.
Google still hasn't done anything outside of search.. that's it.. a one-trick pony.. hasn't succeeded at anything else it's in against the incumbents...
Hummers have been around long before the EV1 was even thought of!
You do know the H2 and H3 aren't the only Hummers around right?
I'd love to replace my racks full of (name brand) servers with these. They probably cost less than a third as much, as well. Google really should sell these, although I suppose they aren't flashy/pretty enough for retail.
Look at the power leads from the battery; they run into the power supply. The battery just supplies the 12V bus in the power supply in the event of a power failure. If the power supply fails, the battery isn't going to do any good.
----------
If you squint at the -expanded- photo, you'll see that the power supply is putting out 13.45 or 13.65 volts (at 20 amps), perfect for charging a lead-acid cell. The battery will be maintained quite happily.
I do love the Velcro...
Also, I wonder how much it costs them to deal with thousands of worn-out batteries? Batteries like that lose their potency after a number of years.