Mar 31 2012

Should You Block Google From Crawling Your Slower Pages?

Google’s head of web spam, Matt Cutts, put out a new video discussing site speed’s impact on rankings. This is not the first time Cutts has addressed the issue, but it’s a somewhat different take than we’ve seen before, as it’s in direct response to the following user-submitted question:

“You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user’s request, giving a slow page time. Should we not allow Googlebot to index these pages to improve our overall site speed?”

“I would say, in general, I would let Googlebot crawl the same pages that users see,” says Cutts. “The rule of thumb is this. Only something like 1 out of 100 searches are affected by our page speed mechanism that says things that are too slow rank lower. And if it’s 1 out of 100 searches, that’s 1 out of roughly 1,000 websites. So if you really think that you might be in the 1 out of 1,000, that you’re the slowest, then maybe that’s something to consider.”

“But in general, most of the time, as long as your browser isn’t timing out, as long as it’s not starting to be flaky, you should be in relatively good shape,” he continues. “You might, however, think about the user experience. If users have to wait 8, 9, 10, 20 seconds in order to get a page back, a lot of people don’t stick around that long. So there’s a lot of people that will do things like cache results and then compute them on the fly later. And you can fold in the new results.”

“But if it’s at all possible to pre-compute the results, or cache them, or do some sort of way to speed things up, that’s great for users,” Cutts says. “Typically, as long as there is just a small number of pages that are very slow, or if the site overall is fast, it’s not the kind of thing that you need to worry about. So you might want to pay attention to making it faster just for the user experience. But it sounds like I wouldn’t necessarily block those slower pages from Googlebot unless you’re worried that you’re in one of those 1 out of 1,000, where you’re really, really the outlier in terms of not being the fastest possible site.”

In November, we referenced another video Cutts did talking about page speed, where he also dropped the “1 out of 100 searches” stat. He said basically not to overly stress about speed as a ranking factor. Both the new video and that video were actually uploaded to YouTube in August, so this advice is older than it appears. Today’s video, however, was just made public by Google, so it stands to reason that the company’s advice remains the same.
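Cutts’ suggestion to pre-compute or cache slow query results can be sketched in a few lines of Python. This is a minimal illustration only: the run_complex_query function is a hypothetical stand-in for whatever expensive database work a real site does, and a production setup would more likely use a shared cache (memcached, Redis) with an expiry rather than an in-process dict.

```python
import time
from functools import lru_cache

def run_complex_query(term):
    """Hypothetical stand-in for a slow, complex database query."""
    time.sleep(0.1)  # simulate the expensive work
    return f"results for {term}"

@lru_cache(maxsize=1024)
def cached_query(term):
    # The first call for a term pays the full query cost;
    # repeat calls are served instantly from memory.
    return run_complex_query(term)

start = time.time()
first_result = cached_query("widgets")   # slow path: runs the query
first_duration = time.time() - start

start = time.time()
second_result = cached_query("widgets")  # fast path: served from cache
second_duration = time.time() - start
```

The effect is exactly the trade Cutts describes: users get a fast page, and you can refresh or “fold in” new results on your own schedule (here, by calling `cached_query.cache_clear()`).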

Mar 29 2012

Matt Cutts: 1 In 5 People In U.S. Have Heard Of SEO

As you may know, Google launched a new product today called Google Consumer Surveys. Googlers are certainly hyped up about it. Google’s head of web spam, Matt Cutts, used the product to put out his own survey about SEO, in which he determined that 1 in 5 people in the U.S. have heard of SEO.

“In my world, everyone I talk to has heard of search engine optimization (SEO),” he says on Google+. “But I’ve always wondered: do regular people in the U.S. know what SEO is? With Google’s new Consumer Surveys product, I can actually find out. I asked 1,576 people, ‘Have you heard of search engine optimization?’”

“It turns out only 1 in 5 people (20.4%) in the U.S. have heard of SEO!” he says.

“The survey also turned up an interesting gender difference: almost 25% of men have heard of SEO, but only about 16% of women have,” Cutts notes. “Doing this sort of market research in the past would have been slow, hard, and expensive. Asking 1,500 people a simple question only costs about $150.”

The sample may be small compared to the actual population of the country, but my guess is the result isn’t far off. In my experience, outside of work, most people have no idea what SEO is. That’s probably one reason Google wants to level the playing field in search rankings when it comes to “over-optimized” content. But that’s a whole other discussion.

Mar 22 2012

Google Will Need Time To Learn How To Rank New TLDs

We recently talked about a post Google’s Matt Cutts made to Google+ discussing how Google will handle the new TLDs. He referenced a blog post claiming the new TLDs will be “automatically favoured by Google over a .com equivalent,” which Cutts said is “just not true.” He has now put out a new video talking about how Google will treat the TLDs, in response to the user-submitted question:

“How will Google treat the new TLDs, where any top-level domain is possible (e.g., for corporations, www.portfolio.mycompanyname), regarding influence on ranking and PageRank?”

“Well, we’ve had hundreds of different TLDs, and we do a pretty good job of ranking those,” says Cutts. “We want to return the best result, and if the best result is on one particular TLD, then it’s reasonable to expect that we’ll do the work in terms of writing the code and finding out how to crawl different domains, where we are able to return what we think is the best result according to our system.”

“So if you are making Transformers 9, and you want to buy the domain or something like that, it’s reasonable to expect that Google will try to find those results, try to be able to crawl them well, and then try to return them to users.”

“Now there’s going to be a lot of migration, and so different search engines will have different answers, and I’m sure there will be a transition period where we have to learn or find out different ways of what the valid top-level domains are, and then if there’s any way where we can find out what the domains on that top-level domain are,” he says. “So we’ll have to explore that space a little bit, but it’s definitely the case that we’ve always wanted to return the best result we can to users, and so we try to figure that out, whether it’s on a .com, or a .de, or a dot whatever, and we’ll try to return that to users.”

Cutts also put out another new webmaster video talking about why you shouldn’t be obsessing over your link numbers.

Mar 22 2012

Google On How A Lot Of Your Links Don’t Count

Google has over 200 signals it uses to rank results. Google’s legendary PageRank algorithm, which is based on links, has led a lot of people to worry about links way too much. That’s not to say quality links aren’t still important, but just because you have a whole bunch of links, it doesn’t mean your site is going to rank well.

Google’s Matt Cutts posted an interesting webmaster help video under the title “Will Google Provide More Link Data For All Sites?” It’s Cutts’ response to the user-submitted question:

“In the wake of the demise of Yahoo Site Explorer, does Google Webmaster Tools plan to take up the reins this product once provided to SEOs everywhere?”

Cutts responds, “What I think you’re asking is actually code for ‘will you give me a lot of links?’ and let me give you some context about Google’s policies on that. I know that Yahoo Site Explorer gave a lot of links, but Yahoo Site Explorer is going away. Microsoft used to give a lot of links. And they saw so much abuse and so many people hitting it really, really hard that I think they turned that off so that people wouldn’t be tempted to just keep pounding them and pounding their servers.”

“So our policy has been to give a subsample of links to anybody for any given page or any given site (and you can do that with a link colon command) and to give a much more exhaustive, much more full list of links to the actual site owner,” says Cutts. “And let me tell you why I think that’s a little bit more of a balanced plan. Yahoo Site Explorer, they were giving a lot of links, but they weren’t giving links that Google knew about. And certainly, they don’t know which links Google really trusts. And so I think a lot of people sometimes focus on the low-quality links that a competitor has, and they don’t realize that the vast majority of times, those links aren’t counting.”

“So, for example, the New York Times sent us a sample of literally thousands of links that they were wondering how many of these count, because they’d gotten it from some third party or other source of links,” he adds. “And the answer was that basically none of those links had counted. And so it’s a little easy for people to get obsessed by looking at the backlinks of their competitors and saying, ‘oh, they’re doing this bad thing or that bad thing.’ And they might not know the good links. And they might not know that a lot of those links aren’t counted at all.”

“So I also think that it’s a relatively good policy because you deserve to know your own links,” he continues. “I think that’s perfectly defensible. But it doesn’t provide that much help to give all the links to a competitor site unless you’re maybe an SEO, or you’re a competitor, or something along those lines. So for somebody like a librarian or a power searcher or something like that, using link colon and getting a nice sample, a fair fraction of links to a particular page or to a particular website, is a very good policy.”

“I think that’s defensible, but I don’t expect us to show all the links that we know of for all the different sites that we know of, just because people tend to focus on the wrong thing,” he concludes. “They don’t know which links really count. So they tend to obsess about all the bad links their competitors have and only look at the good links that they have. And it’s probably the case that surfacing this data makes it so that you’re helping the people who really, really, really want to try to get all their competitors’ backlinks or whatever. And I just think it’s a little bit more equitable to say, OK, you’re allowed to see as many of the backlinks as we can give you for your own site, but maybe not for every other site. You can get a sampling, so you can get an idea of what they’re like, but I wouldn’t expect us to try to provide a full snapshot for every single site.”

Links obviously aren’t everything, and if you follow Google’s changes, it’s easy to see that other signals have been given a lot more significance in recent memory. This includes things like content quality, social signals, and freshness. If you’re that worried about the number of links you have, you’re living in the wrong era of search. Granted, links have value beyond search ranking. They still provide more potential referrals to your site, but in terms of Google, the search engine is moving more and more away from the traditional 10 organic links anyway, with more personalized results, fresher results, blended (universal search) results, and more direct answers.
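For reference, the “link colon command” Cutts keeps mentioning is Google’s link: search operator, which anyone can type into the regular search box to see a sample of links pointing at a page or site:

```
link:example.com
```

As Cutts describes, this returns only a subsample of the links Google knows about; the fuller list is reserved for the verified site owner in Google Webmaster Tools.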

Mar 19 2012

Google: Actually, Meta Tags Do Matter

Google posted a new Webmaster Help video from Matt Cutts today. The question at hand this time is: How much time should I spend on meta tags, and which ones matter?

This one is also significant because Cutts submitted the question himself. That means he felt this was an important enough issue that, even though it wasn’t submitted by a user, it needed to be addressed.

“So the conventional wisdom a few years ago was that meta tags mattered a whole lot,” says Cutts. “You really had to tweak them and spend a lot of time to get your keywords right, and did you have a space or a comma between your keywords, and all that kind of stuff. And we’ve mostly evolved past that, but the pendulum might have gone a little bit too far in the other direction, because a lot of people sometimes say, don’t think at all about meta tags. Don’t spend any time whatsoever on them. And so let me give you a more nuanced view.”

“You shouldn’t spend any time on the meta keywords tag,” he says. “We don’t use it. I’m not aware of any major search engine that uses it these days. It’s a place that people don’t really see when they load the browser, and so a lot of webmasters just keyword stuff there, and so it’s really not all that helpful. So we don’t use meta keywords at all.”

This is actually not the first time Cutts has posted a video about this topic. There was one from several years ago where he basically said the same thing about the keywords meta tag. At the time, Google talked about how it used the description meta tag, as well as the meta tags “google,” “robots,” “verify-1,” “content type,” and “refresh”. Google Webmaster Tools also provides a chart breaking down how Google understands different meta tags.

“But we do use the meta description tag,” Cutts continues in the new video. “The meta description is really handy, because if we don’t know what would make a good snippet, and you have something in the meta description tag that would basically give a pretty good answer (maybe it matches what the user typed in or something along those lines), then we do reserve the right to show that meta description tag as the snippet. So we can either show the snippet that might be the keyword in context on the page or the meta description.”

“Now, if the meta description is really well written and really compelling, then a person who sees it might click through more often,” he says. “So if you’re a good SEO, someone who is paying attention to conversion and not just rankings on trophy phrases, then you might want to pay some attention to testing different meta descriptions that might result in more clickthrough and possibly more conversions. So don’t do anything deceptive, like you say you’re about apples when you’re really about red widgets that are completely unrelated to apples. But if you have a good and a compelling meta description, that can be handy.”

“There are a lot of other meta tags,” he says. “I think in the metadata for this video, we can link to a really good page of documentation that we had, that sort of talks about which stuff we pay attention to and which stuff we don’t pay attention to. But at a 50,000-foot level, don’t pay attention to the keywords meta tag. But the description meta tag is worth paying attention to.”

It sounds like SEO still matters.
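To ground the advice above, here is what the two tags in question look like in a page’s head. The description text is purely an illustrative placeholder (borrowing Cutts’ apples example); per his comments, the keywords tag is shown only to mark it as the one you can drop entirely.

```html
<head>
  <title>Apples Direct</title>
  <!-- Worth writing well: Google may use this as your search snippet,
       and a compelling one can improve clickthrough -->
  <meta name="description"
        content="Fresh apples delivered weekly. Browse varieties, pricing, and delivery areas.">
  <!-- Ignored by Google entirely: safe to omit, not worth any time -->
  <meta name="keywords" content="apples, fruit, delivery">
</head>
```

The asymmetry is the whole point of the video: one line is a user-facing pitch Google may surface verbatim, the other is invisible to users and to Google’s ranking alike.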

Mar 15 2012

Will Google Rank New TLDs Better Than .com Domains?

Google’s head of web spam, Matt Cutts, took to Google+ to bust yet another myth (there’s been a lot of Matt Cutts myth busting lately, it seems). He points to an article from Adrian Kinderis, CEO of ARI Registry Services (described as “a top-level domain specialist”), which claims that the new top-level domains will “trump .com in Google search results”. Kinderis writes:

“Will a new TLD web address automatically be favoured by Google over a .com equivalent? Quite simply, yes it will. I’ve been researching this topic since development of the new TLD program first began (around 6 years ago) and have closely followed the opinions of the many search industry experts who have taken a great deal of interest in the introduction of these new domains and the impact they will have. The more I research, the more I have no doubt that a new TLD address will trump its .com equivalent.”

Followers of Cutts may have some doubt. Here’s what he said about it on Google+:

“Sorry, but that’s just not true, and as an engineer in the search quality team at Google, I feel the need to debunk this misconception. Google has a lot of experience in returning relevant web pages, regardless of the top-level domain (TLD). Google will attempt to rank new TLDs appropriately, but I don’t expect a new TLD to get any kind of initial preference over .com, and I wouldn’t bet on that happening in the long-term either. If you want to register an entirely new TLD for other reasons, that’s your choice, but you shouldn’t register a TLD in the mistaken belief that you’ll get some sort of boost in search engine rankings.”

In the comments on Matt’s post, one reader suggested that Google doesn’t rank good content, but ranks popular content. Matt responded to that, pointing to a post we did on a video where he discussed porn sites and PageRank.