Oct 31 2012

Matt Cutts Talks Subdomains Vs. Subdirectories

Google’s Matt Cutts posted a new Webmaster Help video about subdomains vs. subdirectories. It’s a topic Google has talked about various times in the past, but as Cutts notes in the video, it’s been a while, so perhaps it’s worth revisiting. The user-submitted question he’s responding to is: I’m interested in finding out how Google currently views subdomains — whether there’s any difference between a site structured as a group of subdomains and one structured as a group of subdirectories. “They’re roughly equivalent,” says Cutts. “I would basically go with whichever one is easier for you in terms of configuration, your CMSs, [and] all that sort of stuff.” You can watch the video for a more complete answer.
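As a quick illustration of the two structures Cutts calls roughly equivalent (example.com is a placeholder domain), the only difference is whether the section name lives in the hostname or in the path:

```python
from urllib.parse import urlparse

# The two "roughly equivalent" structures Cutts describes, using a
# placeholder domain. The section name ("blog") sits either in the
# hostname (subdomain) or in the path (subdirectory).
subdomain_style = "https://blog.example.com/some-post"
subdirectory_style = "https://www.example.com/blog/some-post"

for url in (subdomain_style, subdirectory_style):
    parts = urlparse(url)
    print(parts.hostname, parts.path)
# blog.example.com /some-post
# www.example.com /blog/some-post
```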

Oct 31 2012

Matt Cutts As “Matt Romney” (And Years’ Worth Of Other Matt Cutts Halloween Costumes)

Google’s Matt Cutts shared his Halloween costume with readers of his personal blog: behold “Matt Romney.” “This was a fun, easy, comfortable costume,” he says (after outlining his five-point plan for a Mitt Romney costume). “I practiced a few of Mitt Romney’s catchphrases and I think people really enjoyed seeing ‘Matt Romney’ around the Googleplex.” He also notes that he will be handing out full-size candy bars to the first couple dozen kids who show up at his house.

Last year, Cutts was the blackhat stick man from the xkcd webcomic. In 2010, he was a ninja; in 2009, he went the see-through-hole route; in 2008 it was Rick Astley; in 2007, a LOLCat; and in 2006 he was “Zombie Jeeves” (he also tinkered with the idea of going as Silent Bob that year). In 2005, he was Inigo Montoya; in 2003, “Punk Rock Matt”; and in either 2000 or 2001 he went as Google chef Charlie Ayers.

Oct 30 2012

Matt Cutts Talks Quality Raters’ Impact On Algorithms, Says Guidelines May Be Made Public

While it has been known that Google’s “quality raters” (the people who judge sets of search results behind the scenes) don’t directly influence Google’s algorithms, the misconception to the contrary persists. Nobody at Google (as far as we know) is looking at these sets of search results and voting sites up and down as if they were browsing Reddit.

Google’s Matt Cutts talked about this in a new Webmaster Help video released today. He responds to the user-submitted question: If you have human ‘quality raters’ evaluating the SERPs and influencing which sites may be impacted by Panda, how do you confirm that consumers are more satisfied with the results?

“There’s a problem with this question…the word ‘influencing,’” says Cutts. “So, we have evaluation raters who look at the quality of pages, using their own judgment, as well as guidelines that we give them on when things are navigational, when things are vital, which things are off topic, which things are spam…all that sort of stuff. But those folks don’t influence our algorithm in any direct sense.”

“When an engineer has an idea for an algorithm – call it ‘Panda’ – he’ll come up with an algorithm, and it will rank the results 1-10, so you’ll have a side-by-side (left side and right side), so you’ll actually have the results right there,” he continues. “That goes out to the evaluation team and these human quality raters, and as a blind taste test, they say, ‘I prefer the left side of the search results’ or ‘the right side of the search results’…and then we’ll get that feedback back. But that evaluation, where the search quality evaluators say, ‘I prefer this side’ or ‘I prefer that side,’ does not directly affect the algorithm. It doesn’t affect Panda.”

Cutts also suggests that we might see the actual guidelines Google gives to the quality raters made public. They have leaked in the past, as he notes, but Google may soon post them for anyone to read.

“We might be able to make those human quality rater guidelines that we make available to people at Google available to the larger world, and I think that would be a good thing, because then people would be able to read through it,” says Cutts. “It leaked a few years ago, and what someone said was, ‘The biggest surprise is that there weren’t really that many surprises.’ All the guidelines that we provide are pretty much common sense, and would match what I think just about anybody would say: ‘Yeah, it does make sense that this is a navigational page, or that these pages are off topic.’”

For more on what Cutts has said about Google’s quality raters process in the past, read this.
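To picture the side-by-side evaluation Cutts describes, here is a minimal, purely hypothetical sketch. The result lists, the simulated rater, and the session count are all made up for illustration; the point is that rater preferences are tallied as feedback for engineers rather than wired into the ranking algorithm.

```python
import random

# Purely hypothetical sketch of the blind side-by-side test Cutts
# describes; the data and the "rater" are stand-ins, not Google's
# actual system.
current = ["page-a", "page-b", "page-c"]    # results from the live algorithm
candidate = ["page-c", "page-a", "page-d"]  # results from a proposed change

def rater_prefers(left, right):
    """Stand-in for a human rater picking the side they prefer."""
    return random.choice([left, right])

tally = {"current": 0, "candidate": 0}
for _ in range(100):  # 100 simulated rater sessions
    # Shuffle which side each list lands on, so the test stays blind.
    left, right = random.sample([current, candidate], 2)
    preferred = rater_prefers(left, right)
    tally["current" if preferred is current else "candidate"] += 1

# Engineers read the tally as feedback on the proposed change; the
# votes themselves never feed the algorithm, which is Cutts' point.
print(tally)
```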

Oct 29 2012

It’s Apparently Not Out Of The Question That Google’s Link Disavow Tool Could Be Used For Ranking Signals

Earlier this month, Google launched the Link Disavow tool, which webmasters can use to tell Google to ignore certain links they believe to be bad. While Google will only do so at its own discretion, some may be wondering whether Google will use the data it gets from the tool for other purposes, such as a ranking signal. If enough sites submit links from a specific site, for example, would Google use that data to determine that the site in question is really a bad site, and factor that into rankings? It seems like a logical question, and Google’s Matt Cutts didn’t exactly rule out the possibility, though he says this is not the case now.

Danny Sullivan at Search Engine Land posted a Q&A with Cutts, in which he asked: if someone decides to disavow links from good sites, perhaps as an attempt to signal to Google that those sites are bad, is Google mining this data to better understand what bad sites are?

“Right now, we’re using this data in the normal straightforward way, e.g. for reconsideration requests,” Cutts responded. “We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good.”

Google does, of course, have over 200 signals, but that doesn’t mean there isn’t room for the data to play some role in the algorithm, even if it’s not the weightiest signal.

“We may do spot checks, but we’re not planning anything more broadly with this data right now,” he adds. “If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that’s sort of like an IQ test and indicates that we wouldn’t want to give that webmaster’s disavowed links much weight anyway. It’s certainly not a scalable way to hurt another site, since you’d have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don’t even get to the ‘build a good site’ stage.”

It does sound like a pretty dumb strategy, and I doubt that many will go this route to try to hurt other sites, but that doesn’t mean sites that a lot of people include in their Link Disavow files shouldn’t worry at all, does it? Look at how webmasters have overreacted to link removal in the wake of the Penguin update. What makes anyone think a similar overreaction won’t take place with the Link Disavow tool?

Even if Google hasn’t decided whether it will use the data as a ranking signal later, one has to wonder whether we’ll ever know should they decide to implement it. I don’t see that one making Google’s monthly lists of changes.
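As a thought experiment only (Cutts says Google is not mining the data this way today), here is a minimal sketch of the kind of aggregation the question imagines. The submissions and domains are entirely hypothetical:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical disavow submissions from three different webmasters.
# Domains are placeholders; per Cutts, Google does NOT currently mine
# the data this way.
submissions = [
    ["http://spamsite.example/page1", "http://linkfarm.example/a"],
    ["http://spamsite.example/page2"],
    ["http://spamsite.example/page3", "http://linkfarm.example/b"],
]

# Count how many distinct submitters disavowed each domain.
domain_counts = Counter()
for links in submissions:
    for domain in {urlparse(url).hostname for url in links}:
        domain_counts[domain] += 1

print(domain_counts.most_common())
# [('spamsite.example', 3), ('linkfarm.example', 2)]
```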

Oct 29 2012

You Might Be Waiting Months To See Effects From Google’s Link Disavow Tool

If you were hit by the Penguin update, you should probably use Google’s new Link Disavow tool, particularly if you can’t get bad links removed. That’s the message we’re getting from Google’s Matt Cutts.

Cutts and Google really emphasized that most people should not use the tool, and basically painted it as a last resort, but Danny Sullivan has posted a Q&A session with Cutts in which he says: “The post [Google’s announcement post last week] says anyone with an unnatural link warning. It also mentions anyone hit by Penguin, but I keep getting asked about this. I’m going to reiterate that if you were hit by Penguin and know or think you have bad links, you should probably use this too.” Emphasis ours.

Here’s the exact text from the original post: “If your site was affected by the Penguin algorithm update and you believe it might be because you built spammy or low-quality links to your site, you may want to look at your site’s backlinks and disavow links that are the result of link schemes that violate Google’s guidelines.”

Sullivan asked Cutts how long it will take for sites to see improvement, asking if sites may have to take into account the time it takes for Google to push another Penguin update. Google had initially indicated that it could be a matter of weeks, but here, Cutts admits that it could be as long as months. “It can definitely take some time, and potentially months,” Cutts is quoted as saying. “There’s a time delay for data to be baked into the index. Then there can also be the time delay after that for data to be refreshed in various algorithms.” As Penguin victims know, webmasters can be waiting quite a while for Google to roll out a refresh.

Even though Cutts says you should probably use the tool if you were hit by Penguin, don’t take that to mean it will do any good if you use it without actually trying to get links removed first. He also told Sullivan that you shouldn’t count on it working if Google doesn’t see any links actually taken down off the web.

While the tool is something some webmasters and SEOs have wanted for a long time, it may not be the silver bullet they had really been looking for. Google even acknowledged in its original post that it might not ignore some links you tell it to. (Image: Batman: Arkham City)
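For reference, the disavow file itself is just plain text: one URL or domain: entry per line, with # for comments, per Google’s documentation for the tool. Here is a minimal sketch that writes one; the domains and dates are placeholders:

```python
# Minimal sketch of writing a disavow file for Google's tool. The
# format (one URL or "domain:" entry per line, "#" for comments) is
# the documented one; the domains below are placeholders.
disavow_lines = [
    "# Asked spamsite.example to remove links on 10/1/2012, no response",
    "domain:spamsite.example",          # disavow every link from this domain
    "http://linkfarm.example/page-a",   # disavow one specific URL
]

with open("disavow.txt", "w") as f:
    f.write("\n".join(disavow_lines) + "\n")
```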

Oct 24 2012

Google Is Experimenting With Ways To Make Reconsideration Requests Better

Google has been experimenting with how to make the reconsideration request process better for webmasters who have been dealt a manual action penalty by Google. Google’s head of webspam, Matt Cutts, put out a new Webmaster Help video discussing reconsideration requests and whether or not they’re actually read by humans. The video was a response to the following user-submitted question: Right now, when a webmaster sends a reconsideration request, how many chances does it have to be read by a real human? Do you plan to make it possible for webmasters to answer when they get a result back from Google?

“Whenever you do a reconsideration request, if you don’t have any manual action by the webspam team, so there’s no way we could do anything, in essence, because it’s algorithmically determining where you’re ranking, those are automatically closed out,” says Cutts. “Those aren’t looked at by a human being, but 100% of all the other reconsideration requests are looked at by a real person.”

“We don’t have the time to individually reply with a ton of detail, and so we do think about ways to be more scalable, and so I understand it might not be as satisfying to get, ‘Yeah, we think you’re okay,’ or ‘No, you still have issues,’ but that is a real human that is looking at that and generating the response that you read back,” he says.

He goes on to say that if Google still thinks you have issues with your site, you should take the time to investigate and figure out what you can do before submitting another request. If you just submit again without changing anything, Google will likely consider you “hard-headed” and find it “unproductive to continue that conversation.”

“We’ve actually been trying a very experimental program where, when we see someone who’s done a reconsideration request more than once, we’ll sample a small number of those and send those to other people to sort of say, ‘Okay, let’s do a deeper dig here.’ You know, maybe we need to send a little bit more info or investigate in a little bit more detail,” continues Cutts. “It’s just one of the ways we’ve been experimenting. We’ve actually been doing it for quite a while to try to figure out, ‘Okay, are there other ways that we can improve our process? Other ways that we can communicate more?’ So it’s the kind of thing where we don’t guarantee that if you appeal a couple times you’ll get any sort of more detailed answer, but there are people reading all those reconsideration requests.”
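The triage flow Cutts outlines reduces to a few branches. A minimal sketch of his description, with hypothetical field names and an assumed sampling rate (Google hasn’t published one):

```python
import random

# Illustrative sketch of the reconsideration-request flow Cutts
# describes; the field names and 10% sampling rate are assumptions.
def triage(request):
    if not request["has_manual_action"]:
        # No manual action means there is nothing a human can lift;
        # per Cutts, these are closed out automatically.
        return "auto-closed"
    if request["attempt"] > 1 and random.random() < 0.1:
        # Per Cutts, a small sample of repeat requests is routed to
        # additional reviewers for a deeper dig.
        return "deeper review"
    # Everything else is read by a real person.
    return "human review"

print(triage({"has_manual_action": False, "attempt": 1}))  # auto-closed
print(triage({"has_manual_action": True, "attempt": 3}))   # human or deeper review
```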

Oct 24 2012

Google Celebrates Movember Again, This Time With A Chrome Video

Remember last year when Google’s Matt Cutts and Bing’s Duane Forrester grew moustaches? That was for Movember, a decade-old campaign, started in Australia, to raise awareness of men’s health. This year, ahead of November, Google has released a new Chrome video celebrating the campaign.

“For ten years, Movember has been changing the face of men’s health,” says Gabriella Conlon from Google’s Chrome marketing team. “Movember has grown from humble beginnings with a couple of blokes having a beer and deciding that it was important to raise universal awareness for men’s health. The movement took off across the globe as photos and support were shared over the web.”

“At Google, we have been inspired that a group of motivated Aussie guys could spark global change — and have such fun doing it along the way,” says Conlon. “So we’ve chosen to celebrate this story in our new Chrome video.”

We haven’t heard yet whether Cutts and Forrester will be going the moustache route again this year.

Oct 23 2012

Hall Of Famer Matt Cutts On Why Google Doesn’t Provide An SEO Quality Calculator

Google’s Matt Cutts, who on Friday was inducted into the University of Kentucky’s Hall of Fame here in Lexington, has posted a new Webmaster Help video answering a question about why Google doesn’t provide some kind of “SEO quality calculator”. The exact question was: Why doesn’t Google release an SEO quality check-up calculator? This will help people to optimize their websites in the right direction. Is Google willing to do this, or does it just want people to keep guessing what’s wrong with their websites?

He talks about how much Google already does to help webmasters via Webmaster Tools and the notifications it sends out. Then, he says, “If we were to just give an exact score…so back in the early days, InfoSeek would actually let you submit a page, see immediately where it was ranking, and let you submit another page, and there are stories that have lasted since then about how people would just spend their entire day spamming InfoSeek, tweaking every single thing until they got exactly the right template that would work to rank number one. So we don’t want to encourage that kind of experimentation, and we know that if we give exact numbers and say, ‘Okay, this is how you’re ranking on this particular sort of algorithm, or how you rank along this axis,’ people will try to spam that.”

“But what we do want to provide is a lot of really useful tools for people that are doing it themselves – mom and pops – people who are really interested and just want to dig into it – agencies who want to have more information so that they can do productive optimization – all that sort of stuff,” he continues.

“We don’t want to make it easy for the spammers, but we do want to make it as easy as possible for everybody else,” he adds. “There’s inherently a tension there, and we’re always trying to find the features that will help regular people while not just making it easy to spam Google.”

Of course, it’s getting harder to land on the front page of Google’s results anyway, given all the other elements Google keeps adding to the SERPs and the shrinking number of organic results on a growing share of them.
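The InfoSeek anecdote is essentially hill-climbing against an exposed score. A toy sketch of why exact scores invite that kind of gaming; the scoring function and the page are entirely made up:

```python
# Toy illustration of why exposing an exact ranking score invites
# gaming: with instant feedback, a spammer can hill-climb a page
# toward whatever the scorer rewards. The scorer below is invented.
def opaque_score(page):
    """Stand-in for a search engine's exact ranking score."""
    return page.count("widget") - 0.1 * len(page.split())

page = "a plain page about widgets"
best = opaque_score(page)
for _ in range(50):
    tweak = page + " widget"          # mutate the page slightly
    if opaque_score(tweak) > best:    # instant feedback on each tweak
        page, best = tweak, opaque_score(tweak)

print(best, page[:60])  # the page degenerates into keyword stuffing
```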

Oct 22 2012

In Case You Were Wondering, Quoting Isn’t Duplicate Content [Matt Cutts]

Google’s Matt Cutts has put out his latest Webmaster Help video. This time he takes on a pretty classic topic: duplicate content. There’s not much here that industry veterans will find of particular interest, but he is answering a user-submitted question, so clearly there are people out there unsure of Google’s take on quoting other sources. The question is as follows: Correct quotations in Google. How can you quote correctly from different sources without getting penalized for duplicated content? Is it possible to quote and refer to the source?

“You’re a regular blogger, and you just want to quote an excerpt – you know, some author you like or some other blogger who has good insight – just put that in a blockquote, include a link to the original source, and you’re in pretty good shape,” says Cutts. “If that’s the sort of thing that you’re doing, I would never worry about getting dinged by duplicate content. We do have good ways of detecting that sort of thing without any sort of issue at all.”

“If, however, your idea of quoting is including an entire article from some other site, or maybe even multiple articles, and you’re not doing any original content yourself, then that can affect the reputation of how we view your site,” he adds.

Basically, as long as you are adding some kind of value and perspective to what you are quoting, you’re going to be fine as far as Google is concerned. “Those sorts of things are completely legitimate and absolutely fine,” Cutts says. “I wouldn’t worry about that.”

So, if you’re quoting (and linking) rather than scraping, you’re probably okay. You may not want to go overboard on how much text you actually quote from a source, however. Otherwise, you’re liable to run into trouble with the source itself.
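In practice, the markup Cutts recommends is just a blockquote plus a link back to the source. A minimal sketch that assembles it; the URL and excerpt are placeholders:

```python
# Minimal sketch of the markup Cutts describes: an excerpt in a
# blockquote with a link back to the original source. The URL and
# excerpt below are placeholders.
source_url = "https://example.com/original-post"
excerpt = "Some insightful passage worth quoting."

html = f"""
<blockquote cite="{source_url}">
  <p>{excerpt}</p>
</blockquote>
<p>Via <a href="{source_url}">the original post</a>.</p>
"""
print(html)
```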

Oct 19 2012

Matt Cutts Talks About When You Should Worry About Your Links

Google’s Matt Cutts is back to posting Webmaster Help videos rather frequently. In the latest, he talks about whether a site should worry about its links if it has not been participating in link schemes. Cutts speaks in response to the following user-submitted question: If I haven’t bought links, participated in any linkwheels or schemes, or spammed links, should I spend time analyzing my links and trying to remove ones I didn’t create that look spammy?

“My simple answer is no,” says Cutts. “If you haven’t been going way out there, playing toward the gray hat/black hat edge – if you haven’t been pushing the envelope, participating in paid links…all that sort of stuff, in general, you know, you get a mix of links from all over the web. Some of them are going to be higher quality (Chicago Tribune, Washington Post, whatever). Some of them are going to be lower quality, including some random people who happen to scrape other people who link to you.”

“If you haven’t been pushing the envelope, it’s not the kind of thing where I would worry about looking at your link profile, carefully pruning, and trying to figure out each individual link that you think should count,” he continues.

“Now, if for example, you have gotten an ‘unnatural links’ warning because maybe you were doing some paid links, or you paid someone to build links on your behalf and maybe they were pushing the envelope and you didn’t realize it, then you can download links, sorted by date,” Cutts says. “Hopefully we’ll give you some examples of the sorts of links to look at. Then it might make sense to look into that. But otherwise, your average mom & pop – your normal business (someone who’s not just trying to place number one for ‘poker’ or ‘online casinos’) – is not in the sort of situation where you need to worry about looking at your individual link profile, in my opinion.”

Google has actually penalized itself in the past for some “pushing of the envelope” done on its behalf without the company realizing it. You may recall that the company’s Chrome browser landing page was penalized after a paid link scandal. Of course, after the penalty wore off, the page was able to climb back up in the search results.

Google, of course, has launched a new Link Disavow tool, which lets webmasters tell Google which links they would like ignored, but Google has cautioned that this should really only be used as a last resort, if you have had actual warnings and have done all you can to get the questionable links removed. Most sites should not use it, according to Google, and the comments made here by Cutts back up that notion. Here, he’s basically saying that most sites probably don’t even need to worry about their link profiles (provided they’re not doing anything spammy), so these sites certainly wouldn’t want to mess with the Link Disavow tool, which, when used improperly, could come back to haunt webmasters.

If you’re unsure about what Google considers to be a link scheme, read this section from Google’s Quality Guidelines on the topic.
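For those who do get an “unnatural links” warning, the “download links, sorted by date” step Cutts mentions lends itself to simple triage. A minimal sketch, assuming a two-column export of link URL and date first seen (the layout and the keyword heuristic are illustrative assumptions, not the actual Webmaster Tools format):

```python
import csv
import io
from urllib.parse import urlparse

# Sketch of triaging a backlink export. The two-column layout (URL,
# date first seen) and the keyword heuristic are assumptions made for
# illustration; sample data stands in for a downloaded file.
sample_export = io.StringIO(
    "http://www.example-news.example/story,2012-09-14\n"
    "http://cheap-linkfarm.example/page7,2012-08-02\n"
    "http://seo-directory.example/listing,2012-08-05\n"
)

SUSPECT_HINTS = ("linkfarm", "seo-directory", "article-spinner")

suspicious = []
for url, first_seen in csv.reader(sample_export):
    host = urlparse(url).hostname or ""
    if any(hint in host for hint in SUSPECT_HINTS):
        suspicious.append((first_seen, url))

# Sorting by date, as Cutts suggests, helps spot when a link-building
# campaign you didn't sanction may have started.
for first_seen, url in sorted(suspicious):
    print(first_seen, url)
```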