Google Webmaster Central Blog - Official news on crawling and indexing sites for the Google index

Using site speed in web search ranking

Friday, April 09, 2010 at 11:00 AM

Webmaster Level: All

You may have heard that here at Google we're obsessed with speed, in our products and on the web. As part of that effort, today we're including a new signal in our search ranking algorithms: site speed. Site speed reflects how quickly a website responds to web requests.

Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don't just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites.

If you are a site owner, webmaster or a web author, here are some free tools that you can use to evaluate the speed of your site:
  • Page Speed, an open source Firefox/Firebug add-on that evaluates the performance of web pages and gives suggestions for improvement.
  • YSlow, a free tool from Yahoo! that suggests ways to improve website speed.
  • WebPagetest shows a waterfall view of your pages' load performance plus an optimization checklist.
  • In Webmaster Tools, Labs > Site Performance shows the speed of your website as experienced by users around the world as in the chart below. We've also blogged about site performance.
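For a do-it-yourself check in the same spirit as the tools above, you can sample load times directly from your own pages. This is only a sketch: the logging endpoint path is a placeholder, and the median helper mirrors the kind of aggregation a dashboard like Site Performance might apply to many samples.

```javascript
// Aggregate sampled load times: the median is more robust than the
// mean against a single very slow outlier.
function medianMs(samples) {
  var sorted = samples.slice().sort(function (a, b) { return a - b; });
  var mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 1
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;
}

// In a browser, record a timestamp as early as possible in the page,
// then report the elapsed time once everything has finished loading.
if (typeof window !== 'undefined') {
  var pageStart = new Date().getTime();
  window.onload = function () {
    var elapsed = new Date().getTime() - pageStart;
    // Beacon the sample to your own logging endpoint (placeholder path).
    new Image().src = '/log-load-time?ms=' + elapsed;
  };
}
```

Collect enough of these samples and you get roughly the picture the Site Performance chart paints, but under your own control.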
While site speed is a new signal, it doesn't carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point. We launched this change a few weeks back after rigorous testing. If you haven't seen much change to your site rankings, then this site speed change possibly did not impact your site.

We encourage you to start looking at your site's speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone's experience on the Internet.

The comments you read here belong only to the person who posted them. We do, however, reserve the right to remove off-topic comments.

109 comments:

Billy said...

Zoompf also provides a free performance scan similar to Google's PageSpeed or YSlow, to find over 300 front-end performance issues.

http://zoompf.com/free

Team Estrogen said...

How much of a speed difference affects the rankings? Should I be concerned about running GWO experiments?

Amanda said...

Exciting stuff. Glad speed is becoming a target for search ranking. Strangeloop Networks conducted research on performance impact (how web speed affects online business KPIs); check out the webinar here: http://www.strangeloopnetworks.com/news/events/past_webinars.aspx

dianosq said...

I do not think that this is a solid idea. What about sites that post lots of photos on their pages or use complex services that take longer to load? What about all the sites that use advertisement? They obviously load slower than a plain HTML site.

It would be nice if Google would add more transparency to the new signal, including whether a website's rankings are affected by its loading time (in Webmaster Tools, for instance).

You guys hopefully look at the connection speeds and origins of visitors as well. A website with lots of Indian users for instance will likely have slower speeds reported than a website with Japanese or Swedish users. Are those factors included in the calculation?

How can a webmaster check to see if a recent ranking drop (say on April 1) is related to that new factor?

Ryan said...

Might be interested in checking out Compuware and their Gomez web performance product. Targeted for larger sites and pretty powerful. http://www.compuware.com/solutions/web-performance-management.asp

John said...

So if Google Analytics' code snippet is slow, would that lower a website's rankings? That would be the ultimate irony.

morkmork said...

How is page speed measured by Google?

If the Google Toolbar measurements are used to determine page speed, this can give a misleading performance figure. If, say, the home page of a site has the same URL for logged-in and logged-out users, a complex web application can bring the average page load time up quite a lot. A site with reports and charts (e.g. Google Analytics) loads very slowly when logged in; this shouldn't affect the ranking, IMO.

Any comments on this?

ronakorn said...

Great stuff. So a site with only a little content loads faster, but will that also push its rankings up?

Navi Arora said...

Now I have to move my websites from shared hosting to a VPS :(

Andy Beard said...

Looks like I need to advise all my readers to remove Google Friend Connect due to speed problems.

I wrote about that problem before when site speed was first introduced to webmaster tools.

I haven't seen any data to suggest it runs faster.

T.B.H. Ames said...

I do wonder how this will affect websites that are extremely educational -- particularly in the science and math fields -- that have applets and scripts to explain and show difficult concepts and allow the user to manipulate data. By using page load speed, we could see these extremely informational and useful websites drop in page rank while more superficial, simple websites take their place. I'm not wholly convinced this was a great idea.

john bishop images said...

I don't often comment here, but I spend a lot of time and effort with Google trying to improve my sites' ranking, and I just spent weeks gzipping and adding last-modified tags to everything. I am very computer literate and this is not an easy task.

I have some major problems with this. Your current tool in Webmaster Tools cites many pages with >1 DNS lookup, and guess what - that other lookup is for Google Analytics! And as I move to more integrated services I become more dependent on other sites and their ability to perform (e.g. embedding a video from Vimeo or YouTube, content from Creative Commons, and many others)!

My site (johnbishopimages.com) also shares resources on my web host (bluehost.com) with other sites because this is what I can afford. Are you going to penalize sites because they can't afford dedicated hardware resources? That kinda flies in the face of many of Google's other initiatives!

How big a role will this signal play in page ranking? If it plays a major one, you are penalizing the little guy who is trying to get a start and rewarding the larger corporate sites because they can afford lots of iron and content servers spread across the countryside - not exactly a level playing field, and one that, on the surface, this signal seems to perpetuate.

Please either rethink this strategy or give us more insight into how it will work.

John
;-j

scriptster said...

I'm removing Google Analytics code from all my sites - it's very slow, and WMT always shows it as one area of improvement. Google AdSense code often renders excruciatingly slowly. I guess it's gotta go, too.

Additionally, I don't see any practical value in "site speed". "Page speed", maybe, but what is "site speed"? Depending on the feature of the site you are using, you can get a very fast page (static HTML), a very slow page (text search in a database), or any variation of speed in between. There is no average! It's like the average temperature of all the people in a room. Makes no sense.

Also, I'm not holding my breath for Google to explain how they measure the speed, but it would be fair to at least acknowledge whether they are talking about the time the page(s) take to render in the user's browser - which depends greatly on the browser and the PC - or the time it takes the pages' components to load. They keep using both interchangeably, yet these are two very different things.

Talk about confusion: how 'bout pages that load from different URLs as in iframe or frames?

How about slowing down competitors' sites via botnets? Has anyone thought about implications?

This is the worst move Google has taken, hope they reverse their decision soon.

Kroc Camen said...

Google, I’m getting messages that some pages are missing a title tag, but they’re not—I think GoogleBot may be borking if the HEAD element is missing (remember, optional in HTML5!) I couldn’t verify via meta tag because of this too. Hope you can fix this, thanks.

Nikolay Matsievsky said...

It seems it's the finest hour for WEBO Site SpeedUp - http://www.webogroup.com/ - also an open source solution.

dianosq said...

The least they can do in my opinion is to tell a webmaster if a page ranking or website ranking is affected by its page loading times.

haseebahmed4all said...

So now creatives, developers, hosting services, information architecture (technical architecture), and information design all matter to SEO. In short: build your website properly and help users and the growth of the internet!

Adam said...

I understand that speed matters. No one likes to wait for a page to load. But I have a couple questions / concerns:

How much is this a factor for PR?

With the recent FCC vs. Comcast court ruling, speed might be tiered or throttled in the future. Is that a concern?

MagicYoyo said...

Shame.
Officially, Google set a ranking factor not to increase relevancy but to reduce crawling cost.

dkubb said...

This is something I've been hoping would happen for a long time.

It's in Google's best interest to rank sites higher that make the user's experience a priority. A site with a lot of images and advertisements does not enhance the user experience. All things being equal, I would prefer to visit a fast-loading site versus a slow one when I search for something.

Ehrich said...

....so all the spam bloggers are going to buy even MORE of the ISPs and then use their botnets to DDOS the legit sites.

BRILLIANT!!! not.

This will end badly.

lwiki said...

thanks.

Lenny Rachitsky said...

Glad to see this finally happen, though I wish there was more transparency into what "slow" means. I did a quick analysis of the announcement, including unanswered questions that I hope we can get answered:

http://www.transparentuptime.com/2010/04/your-sites-performance-now-affects-your.html

Unfortunate Futurist said...

How will this be affected by preferred access issues? (i.e., following the recent court case setting back net neutrality causes) Will we be able to track what pay-for-play preferential treatment looks like using this tool?

dianosq said...

"All things being equal I would prefer to visit a fast loading site versus a slow one when I search for something."

Well, I would prefer to find a site that has the information I'm looking for. I would not mind waiting a bit longer to retrieve that information. What good is a fast loading time if the site is offering useless information?

This will only lead to attacks on popular sites to slow them to a crawl and to spammers using fast static sites to serve their contents.

chrisw said...

Up above, T.B.H. Ames said...
"I do wonder how this will affect websites that are extremely educational..."

I run just such a site, with large, highly illustrated pages (a bit like Wikipedia page length and with as many photos) and obviously this is a huge concern, not least because I've seen a huge traffic and income drop-off recently. Is this why, I wonder?

I think it's most unlikely, but unfortunately, I have no way of knowing because the information given in the post is a little bit too vague for me to tell.

I've been following the Google Page Speed initiative since December, when it was first mooted, and I've spent a huge amount of time improving my page speed since then, to the extent that the Webmaster tools graph suggests I have improved the average speed (of 400 or so pages) by about 50%. Superb! Thanks guys! I appreciate the steer and, all told, I applaud Google for trying this bold initiative. In the long run I think it's broadly a good thing unless it encourages people to go for speed over quality, rich content, which is not at all a good thing.

Like other posters here, I would appreciate more (and more precise) information so I know where I am. At the moment, like other posters, I am now erring on the side of caution: a) I no longer dare use any kind of urchin-type analytics because of potential speed impact; b) I will now, most likely, be removing some of the other widgets and stuff I use; c) I am scared about running experiments, too, and about using anything remotely server-intensive beyond pure HTML (e.g. a wiki running on PHP) that could slow pages down. These things are all negative impacts of a speed drive, as far as I can see, and a backward step. No?

So I do fear a negative impact on page quality if we relentlessly pursue page speed, unless you can reassure people about what factors will and won't make a difference and how. I appreciate you can't always reveal what you do, and I fully understand that, but at the moment I feel we have too little information.

In sum: thanks for pushing us into action. Broadly supportive, but concerned about negative impacts, including Analytics/ads/widgets/PHP etc, and very concerned about gradual, creeping impacts on page quality.

Matt N said...

I guess once again I will have to be the black sheep of the bunch.

This is not good at all. While I am all for improving speed on websites, sometimes it is beyond our control. Sometimes website hosts are at fault. So if this does in fact affect ranking, then that is a bit problematic.

And what about sites with Flash? Does this mean once again we have to revamp our sites just to fit the needs of a company? (And yes, that is a bash on Apple too.)

Speed should NOT be a factor in rankings. I don't get why people can't think this through before praising Google.

Matt N said...

And I love how Google now "praises" speed when their own Adwords has a loading screen on it now.

BRAVO google.

ed.robinson said...

If you consider the broader impact: This is an important step for the future of the internet - putting a standard in place for a critical part of the user experience. Together with other best practice guidelines, this ultimately will improve the experience for everyone using the internet.

Ed Robinson
http://www.aptimize.com

lyon said...

I like how Google is doing this. The interweb is about communication and sharing of information. It makes sense that the highly relevant site I want to find might rank lower than one that's just a blank page with nothing on it. After all, the blank page may not have anything useful, but it LOADS FAST and that's what's really important.

I cannot be spending time out of my busy day waiting for relevant web sites to load.

It also really makes a ton of sense, because if your site gets indexed at a time of day when it gets a lot of traffic, then it will have lower rankings all the time. I think this is great, because your site should be FAST FOR EVERYONE ALL THE TIME or it's a crap site with shitty information and the webmaster should be castrated and have their eyes plucked out by vultures from hell.

Another great innovation from Google.

Martin Missfeldt said...

As an artist I must say: bad idea. Creative webmasters and bloggers use images to illustrate their texts, or to show their pictures.
Will the SERPs in future just show results without images?
My WMT tells me that all of my sites are slow, but I will not remove images; they are a benefit for every site.

TriNi said...

I am a bit concerned about this. My site provides relevant and useful information to my readers; however, because I regularly provide proof of payments in the form of images in my blog posts, I'm concerned that the images slow down the speed my site takes to load, and hence affect my rankings.

What can I do about this?

arvana said...

On several of my sites, by far the slowest-loading elements are the JavaScript for Google Adwords and Analytics!

MikeHopley said...

Why so many knee-jerk reactions? Did you actually read the post?

Loading speed is becoming a factor in the rankings. It is not the only factor.

Relevant pages will still top the rankings. If your site has good content, it will continue to rank well.

Users value both speed and quality. By including speed in the ranking algorithms, Google is just trying to give users what they want.

By making this factor public knowledge, Google are hoping to spur website owners to make their sites faster. This can only be a good thing.

bob said...

Wow, there are a lot of crybabies on here. I support site speed and proper coding. Remember, content is still king; stop whining.

Alex said...

If you'd like to see your site's PageSpeed and YSlow results together and track them over time, you can do this for free at

http://gtmetrix.com/

We'll be adding some more to it over the next few weeks to get a handle on how long it takes your site to load from a user's perspective.

Azizuan Aziz said...

thanks

Dave Artz said...

Thank you Google!

Robert Sinton said...

Superficially, encouraging site speed is a worthy goal, but that strikes me as just one pro outweighed by many cons.

There are a number of drawbacks, most listed in earlier comments, but I'll add one more: geography. If I'm building an NZ-hosted site targeted at NZ users, competing with US-hosted sites, how will load speed influence its ranking?

Unless site speed is measured from many points around the globe and averaged (unlikely), I would have to assume that it will be measured from the US. Not only would this penalise an NZ-hosted site, it would actually work directly against the feature's original intention, because all else being equal NZ users should expect to get faster performance from the NZ-hosted site in the first place.

The first thing that struck me on reading this, though, was that it is not a feature I want as a Google user. If I'm looking for something, I want answers ranked by relevance. If a site has the best answer to my question, I don't want it second in the list because it loads a little slower than a less relevant site.

I use Google as a search engine, not a site reviewer. This feature strikes me as a step backward: it makes Google less useful to me.

valentin alecse said...

How can we find out the webpage loading speed?

Ebrahim Elaidy said...

Ooh no!
That opens the way for exploitation by hosting companies.

WadeM said...

I suggest removing the GoDaddy site seal, as it is very slow to load.

PS-Mye said...

Hi, my site was reported as spam!

Now when I open my website, a security warning is shown!

Google sent me a message about the post that was reported.

I deleted that post!

How can I get my website back to the way it was before, without the security warning showing?

MarshallsBlog said...

With this, Google is costing webmasters a lot of money.

Also, I can't find any solutions for how to increase your web speed, only tools to check it.

:(

David said...

@MarshallsBlog a content delivery network will speed up your site's content, and CDNs are not that expensive anymore. MaxCDN has a great introductory offer: $10 for 1 terabyte.

MikeHopley said...

"At the moment, like other posters, I am now erring on the side of caution: a) I no longer dare use any kind of urchin-type analytics because of potential speed impact; b) I will now, most likely, be removing some of the other widgets and stuff I use; c) I am scared about running experiments too and using anything remotely server-intensive, except pure HTML (e.g. a wiki running on PHP) that could slow pages down."

I think you're over-reacting here. It's good that you care about speed, but you don't need to be quite so strict.

When optimising for speed, you need to weigh the potential speed gains against any loss of features. Otherwise, you'd just remove everything and be left with a blank page. ;) You also need to consider the difficulty of implementing and maintaining particular optimisations.

For example, minifying your entire HTML document could be difficult, depending on your setup. Sure, it will give a performance gain -- but a small one.

Google Analytics has a new asynchronous loading method. Loaded in this way, it has almost no effect on the page load time. It still gets reported by WMT Page Speed, but this is an anomaly.
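For reference, the asynchronous method boils down to queueing tracking commands and injecting ga.js with the `async` attribute instead of a blocking `<script>` tag. A sketch follows; the `UA-XXXXXX-1` account ID is a placeholder, and the URL helper is only factored out here for clarity:

```javascript
// Build the protocol-aware ga.js URL (the ssl host serves https pages).
function gaScriptUrl(protocol) {
  return (protocol === 'https:' ? 'https://ssl' : 'http://www') +
         '.google-analytics.com/ga.js';
}

if (typeof document !== 'undefined') {
  // Queue tracking commands; they run whenever ga.js finishes loading.
  var _gaq = window._gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-1']); // placeholder account ID
  _gaq.push(['_trackPageview']);
  window._gaq = _gaq;

  // Inject ga.js asynchronously so it never blocks page rendering.
  var ga = document.createElement('script');
  ga.type = 'text/javascript';
  ga.async = true;
  ga.src = gaScriptUrl(document.location.protocol);
  var s = document.getElementsByTagName('script')[0];
  s.parentNode.insertBefore(ga, s);
}
```

Because the commands are just pushed onto an array, the page never waits on the analytics server; the script catches up whenever it arrives.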

Widgets from external sites (e.g. social sharing widgets) do often slow sites down badly. Use them with discretion. Do they really improve the user experience, or are they just cruft? To my mind, most of them are designer-centric cruft.

Server-side processing is the last thing you should be worrying about. 80 -- 90% of end-user response time is spent on the front-end, so your server performance only accounts for 10 -- 20% of your loading time. But if you're concerned, then take some measurements! There's nothing like real numbers to put things in perspective.

James said...

As a user, I'm delighted with this. If I'm searching for something, I'll often have to open a couple of the Google search results to find the answer; I'm much less inconvenienced by a handful of seconds to identify a page as not having the information I need than waiting 30 seconds for one to crawl down the pipe. Using page speed as a tie-breaker will certainly improve my browsing experience, just like blocking Flash does.

As a web developer, I'm pleased by this as well: I've advocated efficient site design for years, but others crank out sites which only perform well if your URL starts with "localhost". Yesterday, I saw a big commercial site (ebuyer) which loaded extremely slowly - it was immediately obvious why. The front page was loading three separate CSS files - all from their own server - the first of which consisted of six @import directives. NINE - perhaps more - separate HTTP requests just to get the CSS data? Insane. The other aspects were as bad, but the CSS seemed a nice obvious example.

Users have been complaining about slow page loads for years now, but largely been ignored by a certain subset of developers who don't see the problem with ramming megabytes of extra cruft into their pages. Maybe Google adding weight to that will finally end that bad habit, or at least reduce it.

Alex Paterson said...

I have implemented some of the advice google has given about site speed.
Most of my sites were already relatively fast, as they sit on dedicated servers.
Contrary to what some of the whingers on here are saying, implementing most of the suggestions is a no-cost exercise and fairly easy; therefore it should be done.

The only significant thing I haven't done yet is to gzip my style sheets, which I am working on.

The performance improvement is quite noticeable, and the Webmaster Tools site speed page confirms this.

here is one of my sites afors

I can only say, I am very pleased with the results.

So thank you Google for bringing site speed to the fore and to my attention in particular.

With regard to some of the posts saying they can't, won't, or are removing analytics etc.: you are cutting off your nose to spite your face. So look out; your sites will definitely suffer longer term.

webalytics said...

Very interesting article and helpful, too.

What's your approach regarding the use of Urchin Software? Since it's strongly recommended to place the tracking code in the header section of your pages, this might be a slowdown as well. Or would you recommend hosting both (urchin.js and __utm.gif) on an external source?

Thanks,
Holger

gavtaylor said...

I'm all for rewarding sites for being able to serve pages quickly, but before this was rolled out to the public Google really should have sorted out their own issues.

Analytics, AdSense... both are the most common issues reported by Site Performance in Webmaster Tools.

Asian Capital said...

WE implemented in DECEMBER ONLY

Nothingsuspicoushere said...

All you people that are whining about slow speed have slow sites.

baeritukaez said...

I will no longer be putting Google Analytics on my page. I will also be removing all AdSense advertisements. My website will no longer be embedding YouTube videos.

All three make websites slow.

shubhkarman said...

I hope this addition would help to make the web a better (and faster) place. :)

Farhan said...

It's a great tool and worked exceptionally well on my sites http://dxnlanka.blogs.lk and http://dxnlanka.webs.com

Jack London said...

Anyone here using Host1free? Are they providing a good hosting service?
How good is their free plan compared to other hosting providers?
Any help would be greatly appreciated.
[url=http://www.host1free.com]free hosting[/url]

contact said...

In general this sounds good. However, one thing that is still a problem is that you haven't addressed the algorithm clearly enough. Are you just comparing the exact values from the "Labs/Site Performance" from GWT? Or something else? For instance I have a dating site that has 30% of our users in the 1st world and 70% of the users in the 3rd world. Does the slow load time from the slow connections from the 3rd world negatively affect my "page speed" score for my SERPS in the 1st world?

Shahab khan said...

Will try my best to speed up my site!

arun said...

I knew this would happen, and I have kept the blog to a minimum; I hadn't upgraded my Blogger template till last week. The only extra scripts are Google AdSense and Analytics.
I think Google not only checks speed but also usability factors like font size, etc.

BEARENOK said...

Thank you, Google. Now everyone will think about improving the speed of their sites.

JIYkp said...

This could be good or bad. But how are we supposed to know, if Google isn't transparent about the process and how significant this factor is? I'd appreciate it if Google would elaborate.

James said...

@webalytics: they recommend not using urchin.js at all - it's obsolete, we're all supposed to have moved to loading ga.js asynchronously, because of exactly this speed issue. It's quite possible to do things like this AFTER the page is fully loaded from the user's perspective, and if you do this it does NOT show up as a problem in Google's site performance tool - and in the Analytics case, they literally provide the code ready to paste into your own HTML!

I don't know about AdWords, but you can certainly do something similar with YouTube videos as well; I've seen sites already using a simple placeholder for each video. It isn't until you click the placeholder that the Flash or HTML5 video object gets rendered.

Use the tools Google already provides us with and none of these things would be problems, for developers or for users!
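The placeholder technique described above can be sketched like this; the embed URL follows YouTube's iframe format, and the element ID and `data-video-id` attribute are assumptions for illustration:

```javascript
// Build the embed URL for a video ID (YouTube iframe embed format).
function embedUrl(videoId) {
  return 'http://www.youtube.com/embed/' + videoId;
}

if (typeof document !== 'undefined') {
  // Swap a lightweight placeholder image for the real player on click,
  // so the heavy Flash/HTML5 object never delays the initial page load.
  var placeholder = document.getElementById('video-placeholder');
  if (placeholder) {
    placeholder.onclick = function () {
      var frame = document.createElement('iframe');
      frame.src = embedUrl(this.getAttribute('data-video-id'));
      frame.width = 640;
      frame.height = 360;
      this.parentNode.replaceChild(frame, this);
    };
  }
}
```

The page initially pays only for one small image; the expensive player object is created on demand.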

Techie said...

My website speed in Webmaster Tools hasn't been updated since March 14, but Googlebot crawls my site every 30 minutes. How can I get it updated?

Emil said...

This is great news, it's about time that Google takes site speed into consideration. Don't worry too much about your hosting, optimize your site first and you can start by validating the code you wrote. There are also many shared hosting providers out there who provide decent speed, so you should be just fine.

Vangelis said...

Slow websites are already penalized by users.
I can't figure out why speed should affect the ranking.

Probably JS code will be out of the equation, but what about the network round trip?

Sites hosted in the US (close to Googlebot) will have better rankings due to less network delay? This is wrong.

chrisw said...

The thing is to try to see this with an open mind. I was initially hostile back in December, but decided to run the tools and tests, and immediately saw how much benefit I could gain for free. Anyone who's doubtful, suspicious, or hostile, do try the tools and surprise yourselves.

Thanks to MikeHopley for v.helpful answers to my earlier questions. :)

Richard said...

I have a product that could be used in many countries so I want people around the world to find my website.

So I am concerned about how this will affect one of the key benefits of the internet over other traditional marketing channels: that is, the possibility to reach out to new international markets.

If you want to make search results more relevant for people in other countries besides the country where my website is hosted, then please don't penalise them for having a slow internet connection to my server.

Having a copy of my website in each country would not be economical or manageable for me, and having multiple copies would not be good for Google.

Also, won't the user's computer's specifications and Local Area Network affect download speed? If so, and you are using data from Google toolbar in the browser, how will you take this into account?

It would be better if you used a metric that is not affected by user variables such as the number of hops from the server to the user or by factors that are beyond the control of the website owner such as the vagaries of intermittent hosting services or ISPs.

Instead of trying to measure "speed" (whose speed?), I propose a combination of page size and conformance with W3C standards.

utushino said...

Is it possible, during the calculation of the "site speed signal", to consider only the SPEED OF THE SITE ITSELF, excluding external scripts (like Google Analytics) and graphical visitor counters (like StatCounter, ClustrMaps, etc.)?

I'm asking because Webmaster Tools shows these as the only possible reasons for site slowness.

Or maybe you can create a list of most popular services, whose graphical logos will be excluded from the consideration?

luijar said...

So will App Engine-hosted sites improve their speed when the server is idle (or spinning up a new server to scale)? It seems that if Google is obsessed with speed, this is one place they need to make some improvements.


Italie said...

Mistake, if you ask me. "War and Peace was taking too long to read, so have a go at these cliff notes instead."

onehundredandtwo said...

I think it's a great idea, especially when sites are bogged down with popups, ads and site analytics.

I have a fairly slow connection, and sites that require 8 HTTP requests from 5 different domains don't work well with my connection.

Shuatas said...

I think this is ridiculous. You have a good sense of humor. Have you tried the results of Page Speed on your own websites? Do you follow your own recommendations?

What about Blogger? You make Blogger slow by adding unnecessary code. You don't offer a decent file hosting service for Blogger users in order to let them optimize their blog's load times. Look at the Blogger gadgets embedded with iframes, the comment form (it's slow and doesn't work well), and so on.

And what about the Google API? Have you tested the load times?

Please look at your own services first and start working to improve them. Do you need any employees at Blogger?

Rick said...

I think there are good intentions behind this, but it'll be very detrimental in some circumstances.

My business and clients are in New Zealand. I tried a US-based host for a start, but load times to NZ were pathetic. So now that they're all hosted in NZ so they load nice and fast for their intended audience, WMT tells me that my sites take about 16 seconds to load. Is that going to make my rankings plummet?

Talkrabb said...

What is the benefit if a fast but content- and image-less page is preferred?

In order to speed up my start page, FriendConnect, Analytics & AdManager have to go.
The content will be split into four pages.

-> Usability worse, speed & position better.

To see the same content the user has to open four pages.

Sorry for my FriendConnect community with 800+ members too.

@Google: Is that what you want?

MikeHopley said...

@Rick:

WMT SiteSpeed gets its load-time data from a subset of your actual visitors (those who have the Google Toolbar with Pagerank checking activated).

If you're seeing average readings of 16 seconds, then that means 16 seconds for your actual users, not for some arbitrary Google server pinging you from the US.

Just how local is your host? If you've chosen a host in your own city, then you may be seeing extremely fast loading times that are not reflected over the rest of New Zealand.

Pay attention also to the accuracy level of these data. Is GWT saying, "These estimates are of low accuracy (less than 100 data points)"? If so, then you could find that the data are not reliable (not enough samples to be statistically significant).

With regard to server location: have you considered using a CDN for your static assets (images, javascript, css, flash...)? You can now get a zero-commitment CDN such as Amazon Cloudfront, where you just pay for the bandwidth used. The cost is trivial.

At the other end of the scale, if you're running a huge business with a high-traffic website, you can get better performance and volume pricing from the big boys such as Akamai. The CDN marketplace now caters for every scale of business.

The location of the HTML document itself is much less important, because this item accounts for only about 10--20% of loading time.
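
The split described above is easy to sanity-check yourself. A minimal, illustrative Python sketch (the function name and placeholder URL are mine, not from any Google tool) that times just the HTML document fetch, ignoring images, CSS, and scripts:

```python
# Rough sketch: time only the HTML document download, not the full page.
# Averaging a few samples smooths out network jitter.
import time
import urllib.request

def html_fetch_time(url, samples=3):
    """Average time in seconds to download just the HTML document."""
    total = 0.0
    for _ in range(samples):
        start = time.time()
        urllib.request.urlopen(url).read()
        total += time.time() - start
    return total / samples
```

Comparing this figure against the full-page load time (e.g. from the WMT chart) gives a feel for how much of the total is the document itself versus the assets it references.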

webalytics said...

@James: Thanks for the information. We are aware of the existence of the asynchronous GATC. But we also know that there are compatibility issues when using the GATC via the "setlocalremote" parameter and then parsing log files into Urchin Software.
What's your approach to this?

Cheers,
Holger

website optimization said...

This is great to see. We provide a free page analysis tool at:

WebPageAnalyzer.com

to test website speed and offer recommendations. As well as lowering operating costs, faster site speed has been shown to improve conversion rates and lower bailout rates.

drz said...

You forgot Chrome's Profile tab in its developer tools for speed measurement. It provides a full breakdown. I was able to use it to work out why ajax search was adding fifteen seconds to page load times.

On another note, speed is good, but less content = faster yet less relevant content.
Web rendering speed is more about the hardware you are running to spider the sites and the speed and quality of the rendering engine in Googlebot. I think the idea has its merits, in that a lot of web developers build sites that pull heaps of adverts and write badly bloated code. But whether it's a useful idea depends on the context in which "speed" is measured, what is considered "slow" for X bytes in total, and whether the average result of several crawls is used instead of just one.

Seeing how the relevance of Google's search results has, in my observation, been degrading over the years, I think methods to improve result relevance would be more welcome. But standards-compliant pages are always the fastest to load in web browsers, and I'm all for that.

Rick said...

@MikeHopley:

Cool, I didn't know the speed data was taken from actual visitors, but now I'm quite worried. Though this does mean it only records people using Firefox (and I guess IE) who love stacking themselves up with toolbars.

My VPS is hosted in Auckland while I'm at the other end of the country in Invercargill. I get load times around 5 secs uncached and 1-3 secs cached.

You're right, it says it has less than 100 data points.

In regards to using a CDN for assets, that's quite tricky with the CMS I use (scripts and CSS are auto combined, minified, gzipped and cached and images are auto cropped, scaled and cached) but I'm definitely going to look into it if it'll make a difference.

Thanks for your feedback.

Rui S. Sousa said...

And once again Google promotes MFAs.

Can someone explain the exact relation between speed and quality?

The Author said...

Not sure this is valid logic. Sounds too simplistic. I agree with the commentators who ask why not better filter spam sites and bogus forums. Quality sometimes takes a bit longer to load.

Dave Artz said...

@Rui - See Andy Kings's books on website performance optimization.

http://www.websiteoptimization.com/speed/1/1-3.html

adam.jn said...

Seems like I will have to invest in a VPS or dedicated server now.


Ginox said...

How much page speed is enough to rank a website?

Vishal Agrahari said...

Hi,

I think this is a good move for the future. Big corporate sites never compromise on their rich content, so they go for dedicated servers, while small sites will follow the search guidance.

Google should also show some transparency: when ranking pages, they should display each page's download speed, so users can decide whether it is worth clicking given their own internet speed. Site webmasters would take it more seriously, too.

My site is not so rich with images: http://www.seo-speaking.com


M. D. Vaden of Oregon said...

This sounds like a foolish move on Google's part if quality content is the goal of searches. If I'm wrong, let Google explain how this will not undermine good content.

Some of the best information pages I've seen have a lot of photos and a lot of text. They are bound to load slower.

It seems that this effort by Google could encourage folks with websites to reduce useful content to keep higher rankings.

Now ... if Google is able to make allowance for images, and gauge speed more by data transfer speed rather than just page size, that might be fine.

René said...

@Ginox

Having your page speed perform better than average is a good start. Compare it in Webmaster Tools -> Google Labs -> Site performance and check for yourself.

Nancy said...

How important a role will this signal play in page ranking? If it plays a major role, you are penalizing the little guy who is trying to get a start and rewarding the larger corporate sites, because they can afford lots of iron and content servers spread across the countryside - not exactly a level playing field, and one that, on the surface, this signal seems to perpetuate.

MikeHopley said...

@Rick:

"My VPS is hosted in Auckland while I'm at the other end of the country in Invercargill. I get load times around 5 secs uncached and 1-3 secs cached. You're right, it says it has less than 100 data points."

I'd probably not take it too seriously then -- unless you know that you're on a faster-than-typical connection for NZ.

"In regards to using a CDN for assets, that's quite tricky with the CMS I use (scripts and CSS are auto combined, minified, gzipped and cached and images are auto cropped, scaled and cached) but I'm definitely going to look into it if it'll make a difference."

I think using origin pull could help here.

Something I didn't think of before: you would need to be quite selective choosing a CDN. Most budget/mid-level CDNs have networks concentrated in Europe and the US.

If a CDN has no servers in New Zealand, then I'd guess it would actually make performance worse for your customers.

You'd be okay with Akamai, of course, but they require you to buy at least $150 US per month of bandwidth (a very rough estimate). That would be from a reseller/partner, not dealing with them directly.
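
As a rough illustration of the origin-pull idea (the hostname, extension list, and helper function are hypothetical, not any particular CDN's API): the HTML keeps being served from the origin, and only static-asset URLs are rewritten to point at the CDN hostname:

```python
# Hypothetical origin-pull setup: the HTML stays on the origin server,
# while static-asset URLs are rewritten to a CDN hostname. The CDN pulls
# each file from the origin the first time it is requested, then caches it.
ASSET_EXTENSIONS = (".css", ".js", ".png", ".jpg", ".gif", ".swf")
CDN_HOST = "cdn.example.com"  # placeholder hostname

def cdn_url(path, cdn_host=CDN_HOST):
    """Return a CDN URL for static assets; leave other paths untouched."""
    if path.lower().endswith(ASSET_EXTENSIONS):
        return "http://%s%s" % (cdn_host, path)
    return path
```

Because the CDN fetches from your origin on demand, there is no separate upload step, which is what makes this workable even with a CMS that generates and caches its assets automatically.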


Spacefish said...

Site speed is a very important thing for the end-user experience.

The slowest-loading things on my page are Google Ads and Analytics...

Which time is measured in your Webmaster Tools site performance chart? The time until the browser has loaded all local resources, or ALL resources, even ones like Google Ads or AdSense?

Highspeedys said...

I designed my site http://www.ispreview.co.uk to be lean and fast on a shared, VPS, or dedicated server, so I don't expect to suffer any problems as a result of this. Still, it would be nice to have more details on how this might affect PR; the blog post is not clear.

Mikael said...

Hi, does it affect all the Google search engines, like Google.fr?

Joe said...

I saw this coming late last year and got prepared. Site speed should be a factor, because it forces you to optimize your data for everyone's benefit. I hate slow sites, and there are still people out there surfing on dial-up rather than broadband.

Even sites heavy with pictures and files can benefit from some optimization. My experience has been that even those sites can be optimized and cleaned up to improve speed without sacrificing design/layout. Big clunky sites are easy to optimize, too. Follow the advice and techniques YSlow and Page Speed recommend and it helps. They are not asking you to remove these items, just to clean them up so they perform optimally.

It seems like the people whining are the ones with the clunky sites who do not want to adapt. It's not easy work, but to stay on top of your internet rankings, you must stay on top of developing ranking standards and implement accordingly.
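
The "clean it up, don't strip it out" point is easy to demonstrate with compression; a small illustrative Python sketch (not from the comment, and the sample CSS is made up):

```python
# Text assets like CSS and JS are repetitive, so gzip shrinks them
# dramatically without removing any content at all.
import gzip

css = ("body { margin: 0; padding: 0; font-family: sans-serif; }\n" * 100).encode()
compressed = gzip.compress(css)
print("original: %d bytes, gzipped: %d bytes" % (len(css), len(compressed)))
```

Serving assets pre-compressed, or enabling the web server's compression module, delivers this saving transparently to every browser that advertises gzip support.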

Ernst de Haan said...

@Google: Note that some sites are very much focused on a specific geographic region, e.g. www.pensioenpage.com and www.bol.com are both focused on The Netherlands, primarily. The content is all in Dutch.

So when you measure the site performance from the US, the figures may not be relevant to the real target audience.

Do you have this potential issue covered?

Sean said...

It's a good thing that Google is highlighting the importance of site speed and its role in web-search ranking. Not only will assessing site speed help improve rankings, it also helps the overall Internet user experience. I've been looking for more ways to deliver more traffic to our sites, originally by utilizing resources specifically for SEO Wales.

Robert Moses said...

This addition will be a big benefit. Sure, taking the time and spending resources to make your site faster isn't always easy, but the bottom line is improving the experience for your readers/visitors/customers. Anything you do/add/change on your site should have improving the visitor/customer experience as a goal.

Markus Popp said...

I would appreciate it if W3C validation were a factor in search rankings, so that sites with hundreds or thousands of errors are ranked significantly lower. (Or does that happen already?)

That would be one more motivation factor for web masters to create their sites correctly.

Francesco said...

Funny thing... this page was very slow to load for me :)

Michael said...

The slowest things about my sites are the includes from Google Analytics.

Brian said...

"today we're including a new signal in our search ranking algorithms"..."We launched this change a few weeks back." So, which is it? A few weeks back or "today?"

Peter said...

I think it is amusing how worried people are about this new feature. Regardless of what Google does or doesn't measure, the fact is that if your site is slow, visitors will not stay - end of story. So... it makes sense for Google to give slow sites a low priority in the rankings, because users want fast content.

Maybe this will be the final nail in the coffin for all those self-proclaimed internet marketing "gurus" whose pages feature everywhere and take ages to load.

Jon said...

It's nice that the news is finally official! :D

It makes complete sense. How many times have you left a site because it's loading that fraction too slowly? The fact of life is that people are more and more impatient and webmasters need to make sure they pay careful attention to the speed of their site!

This does mean that it's more important than ever for companies to have a good hosting provider and a dedicated server!

Cheers for the article guys!

Dotcom-Monitor said...

www.Dotcom-Monitor.com has focused on a proactive approach to Google's use of the site speed factor for SEO since we attended the Pubcon SEO conference in November 2009. We set up 9 free online tools to measure website speed at http://www.dotcom-monitor.com/task_instant_test.aspx