Since early 2009 Google's Matt Cutts has recorded a superhuman number of videos to help struggling site owners understand how their sites perform in search. While the videos are great, sometimes the guy just needs to get to the point. With that in mind we've done the hard work and watched every Matt Cutts video to pull out simple, concise versions of his answers: The Short Cutts!
523 Short Cutts (Running time 19:39:28)
Search Quality Raters manually assess side-by-side results and judge which is best, alongside various other tests.
For manual actions, check Webmaster Tools. For algorithmic changes, check the dates of known updates and tweak your content.
404 the page. Consider displaying related products and using the "unavailable_after" META tag.
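For reference, that tag looks roughly like this (the date shown is a placeholder, and the format follows Google's original announcement, so double-check current documentation):

    <!-- In the <head> of the expiring product page -->
    <meta name="googlebot" content="unavailable_after: 25-Aug-2025 15:00:00 EST">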
Either use a responsive site, or, if you have a separate mobile site, use rel="alternate" on the desktop pages pointing to the mobile versions and rel="canonical" on the mobile pages pointing back.
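As a sketch, the annotations on the two versions pair up like this (example.com and m.example.com are placeholders):

    <!-- On the desktop page, e.g. https://www.example.com/page -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

    <!-- On the mobile page, e.g. https://m.example.com/page -->
    <link rel="canonical" href="https://www.example.com/page">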
Percolator = Incremental indexing. Dremel = Like MySQL but for huge databases. Pregel = Solution to graph problems
Anything that is deceptive or manipulative, or where material compensation of high value is traded.
Google decides which is the most important and ranks that one highly whilst lowering the others.
Do not use the same sites over and over, and always write unique content, not spun.
The more serious the action taken by the webmaster, the more seriously Google treats it.
No - Keep them unique and apply them to the most important pages.
Not directly, although sites with more pages have a better chance of ranking due to wider variety of keywords.
Ordinarily no, as internal links should not normally use rel="nofollow".
Keep it small and have plenty of unique content on the same page.
Make sure you have high quality content on your website that people would refer back to.
Users should send spam reports and Google will take action.
No more than non-mobile, but users want quick mobile experiences.
Using rel="nofollow" on links in widgets and infographic embeds is recommended.
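In practice that just means the embed code you hand out carries a nofollow, something like this (URL and text are placeholders):

    <!-- Embed snippet supplied with the widget or infographic -->
    <a href="https://www.example.com/infographic" rel="nofollow">Infographic by Example.com</a>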
No; contact them and ask them not to link to you, then disavow them using the disavow tool.
They can't show them all, but in time they plan to show around 3 links in messages to help diagnose the problem
You can as long as Google treats them as generic, otherwise they're tied to a specific country. List here: https://support.google.com/webmasters/answer/1347922?hl=en
If you're not trying to stuff hidden text in there, don't worry about it; it's a normal thing on today's web.
As long as it's not spammy and keyword stuffed then you shouldn't stress about it. Google understands lots of sites need boilerplate content.
Not usually, it can look spammy. If they are country TLDs, perhaps have a single country locator page and link from there.
They're working on providing examples as the messages go out, or a place in Webmaster Tools to see examples. Meanwhile, ask a Googler in the Webmaster Forums for examples
No, if it's just for a short period then you shouldn't worry about that. If it's for, say, 2 weeks, then yes that could be a problem
People use more natural language. Google want to master conversational search
Google prefers a source link to be near the top, but anywhere in the article is ok
Yes, released relatively soon. For now there are PHP and Python libraries to allow you access to this data
Google don't hate Albania. Google.al launched earlier in 2013 and there was a Google doodle for Albania independence day recently, but proper support takes time
No, but maybe it's a signal for Google to look at in the future
Google will include some examples of URLs they have flagged as spam, so that webmasters know where to look to clean up links
Uploading files that aren't plain text (Excel, Word etc.), trying to disavow individual links rather than whole domains, using incorrect domain: syntax, putting the reconsideration plea in the disavow text, and leaving lines uncommented that should be commented.
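For contrast, a well-formed disavow file is a plain .txt along these lines (domains are placeholders; any explanation goes in # comments, not in the reconsideration request):

    # Spammy directory we could not get removed (comment lines start with #)
    domain:spammy-directory.example
    # A single URL can be disavowed too, but whole domains are usually safer
    http://blog-network.example/paid-links-page.html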
Yes, they're always looking to use that signal in useful ways. E.g. It may mean pages on non-important sites are surfaced if content is written by an important author.
Data refreshes are the same as algorithm updates (they're not). Updates are intended to sell more ads (they're not). Link building can replace good UX and design (it can't).
Paid content must be disclosed to users and the links nofollowed. Google will take action if not.
All search engines are subjective to some degree, but Google take outside influence where it helps to improve search results
If the site has multiple pages that are a really good match for the query.
Penguin 2.0, advertorials that pass PageRank, attacks on specific spammy areas like 'payday loans' results, more sophisticated link analysis, better detection and reporting of hacked sites, detection and better ranking for people who are industry authorities, refining the Panda algorithm, and better communication to webmasters generally.
Yes, it's still used by Google though maybe not other services
That patents Google have been granted are in use. They're not necessarily.
Not having a domain at all, not making it crawlable, not using words people search for on your page, not having compelling content, not having good titles and descriptions, not making use of webmaster resources.
Do a spam report with examples and they'll prune them out of results
Should be fine in one go, but it's probably better to add them in stages.
It's possible but can be difficult. Best to get a fresh domain
Google can parse most JavaScript, but do some tests. For the most part it should work for Google as well as users.
Look in Google Webmaster Tools in Traffic > Links to Your Site
Try to keep the same story on a single URL as it evolves
Explain what you've done to change your ways, giving assurance it won't happen again
A 301 is a direct instruction; rel=canonical is just a strong hint.
Maybe you're cloaking. Use Fetch as Googlebot in Webmaster Tools to confirm
Matt sees the positive side of the web instead of just the spam
Google usually detects they're irrelevant, but 301, canonical tag or Webmaster Tools can help
Not for normal search, but potentially for vertical-specific search, e.g. recipes.
Not really, unless you're selective about what you repost, and have lots of unique content
Make sure you write the article yourself so it's not the exact same article as others. Take another angle.
We crawl the web, crawling the best pages more often. Then we sift documents based upon queries and rank those results
Google will return the best results to the user, regardless of the TLD they're on.
There are no guidelines on having tracking pixels, so this shouldn't affect rankings
When a freehost has become infested with too much spam.
It's not just optimisation. It's content, links, user experience etc.
Point a rel="author" link to your profile, then name your posts on your profile
Add rel="author" on the end of link URLs and make the anchor text "+your name"
One or two is fine; any more than five and Googlebot will probably not follow. Avoid mixing 301s and 302s as well.
Look at backlinks, internal linking and canonicalization; the most common cause of a drop is violating quality guidelines, e.g. selling links.
Looks for certain words and gives them certain weight. If there are enough words with enough weight then it says "this looks like it needs filtering"
We support all people on the web using forums, webmaster videos, blogs, chats etc.
No, but using Google Translate to auto-generate this can be spammy. Get a human to do a real translation
Use subfolders and set up in Google Webmaster Tools
We look at canonicals, rel="author" tags, where we found it first and if it's scraped content
There are about 40 domains you could use that aren't country specific
It's an umbrella term for PageRank, site history, respectability, content quality etc.
Start with site:mysite.com, check for malware, check Webmaster Tools. Try the Webmaster Forum. Is it just your site, or lots of sites?
Ranking reports aren't important. Concentrate on how your site converts
Optimise site speed, control of CMS, education program, internal linking, social media
The percentage of nofollow is very small, not a big problem
Not so bad on a couple of domains if they offer something different
Do what is right for the site, but a little more content is better for Googlebot
It's how Google reduces latency between finding a document and users being able to search it
It doesn't always, but it's usually the best or only place for the product
Use the Safe Browsing diagnostic service. Use Webmaster Tools. Try unmaskparasite.com
Build authority on your main site or small set of sites
When Google crawls, it's a snapshot in time. It might miss important links in a rotation
If they're really bad, it can have a negative impact on crawling or indexation
Test first with a subpage and 301 it. Then start with the smallest parts of your site and move it gradually
Lower your DNS TTL to about 5 minutes; duplicate the content to the new site. Once traffic picks up you can remove the old site
Have as many as you like, but don't chain more than five
Tweet it and use PubSubHubbub to get it crawled. File a DMCA request if you're ripped off.
Read the papers and literature. Check out Jeff Dean, Urs Holzle, Luiz Barroso
Google don't want to play with the robots.txt spec; they want it in Webmaster Tools.
They try not to make changes during the holidays, but it can happen
Translate the content and put the translations on separate domains.
Keyword stuffing, duplication, link quality, hacked sites, communication with users
Not a guideline anymore. Do what is right for the page and users
Not using Twitter in ranking (they can) and returning non-crawled pages as family safe
Use related questions links, highlight good ones, add voting buttons etc.
Treat customers badly to get complaint links, Spam team are slackers, only links matter
Tripit, 3rd party batteries, podcasts, MySixSense, Wifi Analyser, Mapviewer etc.
There's no API for this, but more data will be provided in time as people do request that
Buy one domain name, as much hosting as possible and go from there
They are separate systems. A user may also have JavaScript disabled which is only used by Analytics to track visits.
Tell them Google agrees. For results that will stand the test of time, build a great site
Let Google figure out the duplicates on their own before taking this step
Googlebot can interact with forms, or maybe there are just links you don't know about
Personalised Search, Country/City Dependent, Different Data Centre
Yes, if you want your Twitter profile to rank. No for the outbound links
Keep it up to date and restrict the /wp-admin/ folder to trusted IP addresses.
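One common way to do the IP restriction, assuming an Apache server (the IP is a placeholder; other servers need their own equivalent):

    # .htaccess placed inside /wp-admin/ (Apache 2.2 syntax)
    Order deny,allow
    Deny from all
    Allow from 203.0.113.10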
mail.google.com, google.com/calendar, www.techmeme.com, Google News, Tech Crunch, Google Reader, Twitter, Friend Feed
Bad markup; it just needs more time; or Google don't have enough trust in your domain.
Some countries use DMOZ because it's easier to click than type, but we rely on it less.
Controversy, participating in the community, original research, newsletters, social media, lists, blogs, how-tos and tutorials, running a useful service, making a few videos.
Yes, but it helps if you use the canonical tag
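The canonical tag itself is one line in the <head> of the duplicate copy, pointing at the preferred URL (placeholder shown):

    <link rel="canonical" href="https://www.example.com/original-article/">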
No guarantees to crawl anything in the sitemap. To get more pages crawled get more authority/reputation
Hopefully over time Google will get better at executing JS and AJAX and representing that appropriately in search results.
In general, Google figure out what your post is about, so don't worry too much about it
The top 9 screen resolutions in Google Analytics were all bigger than 800x600
Don't worry about it. There's no special benefit to it being the first link
Return information real-time, UI changes, hacking websites will still be a trend
Have a set of delimited links on your site that accurately reflect the site's hierarchy.
If you can make your site work on mobile, then do that instead
Filtering and sorting by recent results and type of results (video, forum, reviews).
That people gain weight when they first start working at Google due to free food (the Google 15); people want to make a Space Elevator
To only show the q= parameter in the URL and not all other parameters
Use HTTPS; use POP3; labels; turn off web clips; change your theme
Make sure site isn't hacked. Go to Google Webmaster Help Forums, file reconsideration request
If you only rely on JavaScript to hide your email address, it's possible for Google to execute the JS and for your email to be crawled
If you manually change the geo-location from what Google thinks, you'll notice you may not appear in the old location as often
He'd create a useful service and operate completely transparently
They choose the most relevant/useful snippets and titles to the user's query
301 redirect each old page to its new equivalent. Contact the sites behind any good links you have, asking them to update their links.
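On an Apache server the per-page redirects can be as simple as this (domains and paths are placeholders; other servers have equivalent directives):

    # Map each old URL to its new home with a permanent (301) redirect
    Redirect 301 /old-page.html https://www.new-site.example/new-page.html
    Redirect 301 /old-section/old-page-2.html https://www.new-site.example/new-section/new-page-2.html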
UP, UP, DOWN, DOWN, LEFT, RIGHT, LEFT, RIGHT, B, A - Secret ninja mode
Redirect by IP, but don't do anything special for Googlebot
We make a lot of small updates instead of large updates. When we have big changes, we are happy to confirm that they have taken place
Most of the time you don't need to worry about it. You might get a link to you from it, so worst case it won't hurt
Google's getting better at finding links in JavaScript. You should nofollow them or block the JavaScript in robots.txt.
A .com may be the best result for UK
You can search by your location on Google on a smart phone
Yes, it can affect what countries you rank in
Don't worry about it, but keyword in the URL helps a little bit
It's a subset of total backlinks. Use Webmaster Tools for more. And yes, they do
Lots of valid reasons to do this, but Google might scrutinise it
Indirectly. It's all about trust, authority, PageRank, reputation and quality
They can, but fix the structure first. Be careful with it
New types of data to search; semantic search; mobile; people will store more data in the cloud