Jul 31 2013

Cutts: We Will Give More Info In Link Messages Over Time

Google’s Matt Cutts says webmasters can expect Google to expand the amount of info Google provides in Webmaster Tools messages related to manual web spam actions. Cutts put out a new Webmaster Help video today discussing this topic, when asked: Will Webmaster Tools ever tell us what links caused a penalty? “First off, remember, algorithmic things are just ranking, so they don’t generate messages in Webmaster Console,” Cutts responds. “However, if you log in to the Webmaster Tools Console, and you see that there’s a message, that means that there has been some direct manual action by the web spam team that is somehow directly affecting the ranking of your website. So in those cases, right now, some of those messages have example links or example URLs that are causing issues for us.” He continues, “We wouldn’t necessarily say that those are the only things because if you have a million URLs that are offending things, we couldn’t send all million URLs in an email or even a message, because that’s just gonna take too much storage, but we are going to, over time, give more and more information in those messages, and so I wouldn’t be surprised if you see, you know, 1, 2, 3 – some number of example URLs or links that give you an idea of where to look in order to find the sorts of things that are causing that particular action. So, I think that is really useful. We’re going to keep looking at how we can expand the number of example URLs that we include in messages, and I think that will be a great thing for webmasters because then you’ll have a really good idea about where to go and look in order to help diagnose what the issue is.” This is actually the second time Cutts has discussed this topic in a Webmaster Help video this month. Back on the 15th, Google released a video in which he also said they’d try to get more examples of bad links in messages to webmasters. You can check that out here, if you want to see exactly what he said then.

Jul 29 2013

Google Talks Geotargeting And Generic ccTLDs

Google’s latest Webmaster Help video deals with ccTLDs and geotargeting – specifically Google’s view of a developer grabbing a ccTLD that is generally associated with a country they’re not actually in. Here’s the exact question: As memorable .COM domains become more expensive, more developers are choosing alternate new domains like .IO and .IM – which Google geotargets to small areas. Do you discourage this activity? “I want you to go in with your eyes open,” Google’s Matt Cutts responds. “Because you can pick any domain you want, but if you pick a domain like .ES or .IT because you think you can make a novelty domain like GOOGLE.IT (‘Google It’), you know, or something like that, be aware that most domains at a country level do pertain to that specific country, and so we think that that content is going to be intended mainly for that country.” He does note that there are some ccTLDs that are more generic like .IO, which stands for Indian Ocean, but there are “very few” domains that are actually relevant to that. A lot of startups were using it, and it was something that was more applicable to the entire world, he says. For reasons like this, Google periodically reviews the list of ccTLDs, looking for things that are in wider use around the world. This way, it can view sites with these domains as more generic. Here’s a list of the domains Google considers generic. Cutts talked about this topic in another video earlier this year, specifically responding to the question: We have a vanity domain (http://ran.ge) that unfortunately isn’t one of the generic TLDs, which means we can’t set our geographic target in Webmaster Tools. Is there any way to still target our proper location? You can see his response to that one here. On a semi-related note, last week, WordPress.com started letting users register .CO domains.

Jul 24 2013

Google’s OK With This Kind Of Hidden Text

Today’s Webmaster Help video from Google is interesting. It tackles hidden text, but not the kind that Google has always spoken out against (and talks about in its quality guidelines), but a more legitimate kind. In the video, Matt Cutts answers the following submitted question: How does Google treat hidden content which becomes visible when clicking a button? Does it look spammy if most of the text is in such a section? (e.g. simple page to buy something and “show details” button which reveals a lot of information about it). “I wouldn’t be overly concerned about this, but let’s talk through the different consequences,” begins Cutts. “It’s pretty common on the web for people to want to be able to say, ‘Click here,’ and then ‘show manufacturer details,’ ‘show specifications,’ ‘show reviews,’ and that’s a pretty normal idiom at this point. It’s not deceptive. Nobody’s trying to be manipulative. It’s easy to see that this is text that’s intended for users, and so as long as you’re doing that, I really wouldn’t be too stressed out.” He continues, “Now certainly if you were using a tiny little button that users can’t see, and there’s like six pages of text buried in there, and it’s not intended for users, and it’s keyword-stuffing, then that is something that we could possibly consider hidden text or probably would consider hidden text, but in general, if you just have something where you have a nice AJAXy sort of site, and things get revealed, and you’re trying to keep things clean, that’s not the sort of thing that’s going to be on the top of our list to worry about because there’s a lot of different sites that really do that.” “It’s pretty common on the web, and a lot of people expect that on the web,” he says.
“Take, for example, Wikipedia on your mobile phone – they’ll have different sections, and then if you click, they expand those sections, and there’s good usability reasons for doing that, so as long as you’re not trying to stuff something in in a hidden way that’s deceptive or trying to distort the rankings – as long as you’re just doing that for users, I think you’ll be in good shape.” As a user notes in the comments of the video, you have to click a button to reveal the video description on YouTube.
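The “show details” pattern Cutts describes can be done with plain markup so the expandable text stays in the page source and visible to crawlers. Here’s a minimal sketch using the native HTML `<details>`/`<summary>` elements (the product copy is invented for illustration):

```html
<!-- Product page with collapsible sections: the text is present in the
     HTML and intended for users, so it is not "hidden text" in the
     deceptive sense Cutts warns about. -->
<h1>Acme Widget</h1>
<p>$19.99 – free shipping.</p>

<details>
  <summary>Show manufacturer details</summary>
  <p>Made by Acme Corp. Two-year warranty.</p>
</details>

<details>
  <summary>Show specifications</summary>
  <ul>
    <li>Weight: 1.2 kg</li>
    <li>Material: anodized aluminum</li>
  </ul>
</details>
```

Many sites implement the same idiom with a JavaScript click handler that toggles a section’s visibility; Cutts’s answer covers that approach equally — what matters is that the text is meant for users, not stuffed in for rankings.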

Jul 22 2013

Matt Cutts Talks About Duplicate Content With Regards To Disclaimers, Terms/Conditions

Google’s Matt Cutts has put out a new Webmaster Help video once again discussing duplicate content. This time it’s about duplicate content with regards to how it relates to legally required content, such as disclaimers and terms and conditions. The exact question Cutts responds to is: How does duplicate copy that’s legally required (ie Terms & Conditions across multiple offers) affect performance in search? Cutts notes that there was a follow-up comment to the question, saying that some in the financial services industry are interested in the answer. “The answer is, I wouldn’t stress about this unless the content that you have is duplicated as spammy or keyword stuffing or something like that, you know, then we might be – an algorithm or a person might take action on – but if it’s legal boilerplate that’s sort of required to be there, we might, at most, might not want to count that, but it’s probably not going to cause you a big issue,” says Cutts. “We do understand that lots of different places across the web do need to have various disclaimers, legal information, terms and conditions, that sort of stuff, and so it’s the sort of thing where if we were to not rank that stuff well, then that would probably hurt our overall search quality, so I wouldn’t stress about it,” he says. So, long story short: don’t make your disclaimers and terms spammy, just like with any other content. As usual, if you play by the rules (Google’s quality guidelines), you should be fine.

Jul 17 2013

Google: You Probably Shouldn’t Link Your 20 Domains Together

You know how Google has everybody afraid of links? People are also afraid to link to their own stuff in certain ways, and Google’s Webmaster Help video today pretty much indicates that this is with good reason. If you have a bunch of domains, you need to be careful about linking them to each other, because Google’s not a fan. Matt Cutts took on the following question in the video: Should a customer with 20 domain names link it all together or not, and if he links it should he add nofollow to the links not to pass PageRank? “Well first off, why do you have 20 domain names?” Cutts begins. “You know, if it’s all ‘CheapOnlineCasinos’ or ‘MedicalMalpracticeInOhio,’ you know, that sort of stuff, having 20 domain names there can look pretty spammy, and I probably would not link them all together. On the other hand, if you have 20 domain names and they’re all versions of your domain in different countries, right – Google.co.za, Google.fr, Google.de – that sort of thing, then it can make a lot of sense to have some way to get from one version of the domain to a different version.” “But even then,” he adds, “I probably wouldn’t link all the domains even in the footer, all by themselves, because that’s a little bit strange. I’d probably have one link to a country locator page, which might even be on domain.com, and you might have flags or something like that, so there are ways to get to those other domains. And as long as there’s a good way for users to get there, then search engines will be able to follow those links as well. Just make sure that they’re normal static HTML links, and we’ll be able to follow, and the PageRank will flow, and all of that sort of thing.
So if there’s a really good reason for users to do it – maybe you could have a dropdown where you could pick your country or something like that – then it might make sense.” “But having the country top-level domains is one of the only areas where I can think of where you’d really need to have twenty different domains,” says Cutts. “In theory, you might have a blog network, but even then, you know, I’ve seen very large blog networks, and you’ve got that footer at the bottom that has a lot of unrelated domains, and at some point it gets pretty big. Even then, you’d probably only have like ten domains, and maybe a few posts on each domain that are linking to each other, so at the point where you have 20, unless there’s a really good reason, I would be a little bit leery of just doing some massive cross-linking scheme between all of them.” So it appears that even if you’re not trying to “scheme” per se, Google might view it as a scheme, if you link your own web properties together in a way that it doesn’t like. Of course, there’s always nofollow.
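Cutts’s suggestion — one link to a country locator page instead of a footer full of cross-links — might look something like this (all domain names below are placeholders, not from the video):

```html
<!-- Footer on each country site: a single crawlable static link to a
     locator page, rather than links to all 20 sister domains. -->
<footer>
  <a href="https://www.example.com/countries">Choose your country</a>
</footer>

<!-- The locator page on example.com: normal static HTML links that
     users can click and search engines can follow, so PageRank flows
     without the footer looking like a cross-linking scheme. -->
<ul>
  <li><a href="https://www.example.fr/">France</a></li>
  <li><a href="https://www.example.de/">Deutschland</a></li>
  <li><a href="https://www.example.co.za/">South Africa</a></li>
</ul>
```

The design point is simply that the links exist for users first; a dropdown country picker, as Cutts mentions, serves the same purpose as the list above.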

Jul 16 2013

No, Google Still Doesn’t Think Link Building Is Bad

For more than a year, webmasters have been receiving a great many messages from Google about unnatural links pointing to their sites. Sometimes it’s obvious which links Google does not like, but oftentimes, it’s not so clear. As we’ve discussed repeatedly in the past, people became afraid of links to the point where they’d go around and try to have legitimate links removed from legitimate sites in an effort to reverse old link building efforts for fear that Google would not approve and send a penalty their way. Has Google made you afraid to build links or to leave existing links on the web? Let us know in the comments. Since this phenomenon really started to run rampant, Google has given webmasters the Disavow Links tool, which lets webmasters tell Google specific links to ignore, but Google’s message with that has basically been that most people shouldn’t use it, and before using it, do everything you can to clean up the bad links you have out there. So, while it is perhaps a helpful tool, it hasn’t necessarily put all of the fear of link building to bed. But Google wants you to know that it doesn’t consider link building “illegal”. Google’s Matt Cutts did an interview with Stone Temple’s Eric Enge last week, which Cutts tweeted out to his followers as a reading recommendation. We discussed some of the things Cutts said, mainly surrounding guest posts, in another article, but link building was another big area of discussion. Enge, introducing the piece, notes that there are people who think link building is illegal. “No, link building is not illegal,” says Cutts.
“It’s funny because there are some types of link building that are illegal, but it’s very clear-cut: hacking blogs, that sort of thing is illegal.” But even beyond actual law, Cutts confirms that “not all link building is bad.” “The philosophy that we’ve always had is if you make something that’s compelling then it would be much easier to get people to write about it and to link to it,” Cutts tells Enge. “And so a lot of people approach it from a direction that’s backwards. They try to get the links first and then they want to be grandfathered in or think they will be a successful website as a result.” He notes that a link from a press release would “probably not count,” but if the press release convinces an editor or reporter to write a story about it, then the editorial decision counts for something. Cutts thinks a great way to build links is to build strong Twitter, Facebook and Google+ presences, and strong, engaged followings, then create great content that you push out to the audience, who will likely share it, and start doing other things that cause visibility and help it rank (these are actually Enge’s words, but Cutts “completely” agrees). In essence, you shouldn’t rely completely on Google, and should diversify your way of getting to your audience. If the Panda update taught the web one lesson, that was it. Ask Demand Media. When asked about authority as a ranking factor, Cutts tells Enge, “I would concentrate on the stuff that people write, the utility that people find in it, and the amount of times that people link to it. All of those are ways that implicitly measure how relevant or important somebody is to someone else. Links are still the best way that we’ve found to discover that, and maybe over time social or authorship or other types of markup will give us a lot more information about that.” On the subject of those link messages Google sends webmasters, people often say they want Google to give them more specific examples of bad links.
Google says it will try to give more in the future. This was the subject of a new Webmaster Help video Cutts put out this week. “We’re working on becoming more transparent, and giving more examples with messages as we can,” said Cutts. “I wouldn’t try to say, ‘Hey, give me examples in a reconsideration request,’ because a reconsideration request – we’ll read what you say, but we can really only give a small number of replies – basically ‘Yes, the reconsideration request has been granted,’ or ‘No, you still have work to do.’ There’s a very thin middle ground, which is, ‘Your request has been processed.’ That usually only applies if you have multiple webspam actions, and maybe one has been cleared, but you might have other ones left. But typically you’ll get a yes or no back.” He continued, “But there’s no field in that request to say – a live amount of text – to just say, ‘Okay, here’s some more examples.’ But we will work on trying to get more examples in the messages as they go out or some way where you…for example, it would be great if you could just log into Webmaster Tools and see some examples there.” “What I would say is that if you have gotten that message, feel free to stop by the Webmaster Forum, and see if you can ask for any examples, and if there’s any Googlers hanging out on the forum, maybe we can check the specific spam incident, and see whether we might be able to post or provide an example of links within that thread,” Cutts concludes. “But we’ll keep working on trying to improve things and making them more transparent.” I don’t think the audience was completely satisfied with Cutts’ video. The top YouTube comments as of the time of this writing are: “Great question. Very unsatisfying answer.” and “Matty, great non-answer. You should run for office!” Those were the two with the most upvotes. Have you been affected by Google’s link warnings? Do you think Google provides a sufficient amount of examples of what it considers to be bad links?
Have you altered your link building strategy over the past year? Let us know in the comments.
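For reference, the Disavow Links tool mentioned above takes a plain-text file uploaded through Webmaster Tools, with one entry per line: `#` lines are comments, `domain:` entries disavow every link from a site, and bare URLs disavow individual pages. The domains below are invented examples:

```
# Spammy directory – removal requests sent, no response
domain:spammy-directory.example

# Individual paid links we could not get taken down
http://blog.example/post-with-paid-link.html
http://blog.example/another-sponsored-post.html
```

Consistent with Google’s guidance in the article, this is a last resort: try to get the links removed first, and document those attempts in any reconsideration request.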

Jul 15 2013

Google Will Try To Get More Examples Of ‘Bad Links’ In Messages To Webmasters

Google says it will try to get more examples of so-called “bad links” in its messages to webmasters who have submitted reconsideration requests after being hit with webspam penalties. In a Webmaster Help video today, Google’s Matt Cutts responded to the submitted question: Client got unnatural links warning in Sept’ 12 without any example links, 90% links removed, asked for examples in every RR but no reply, shouldnt it be better to have live/cached “list” of bad links or penalties in GWT? Think about genuine businesses. “That’s fair feedback. We appreciate that,” says Cutts. “We’re working on becoming more transparent, and giving more examples with messages as we can. I wouldn’t try to say, ‘Hey, give me examples in a reconsideration request,’ because a reconsideration request – we’ll read what you say, but we can really only give a small number of replies – basically ‘Yes, the reconsideration request has been granted,’ or ‘No, you still have work to do.’ There’s a very thin middle ground, which is, ‘Your request has been processed.’ That usually only applies if you have multiple webspam actions, and maybe one has been cleared, but you might have other ones left. But typically you’ll get a yes or no back.” He continues, “But there’s no field in that request to say – a live amount of text – to just say, ‘Okay, here’s some more examples.’ But we will work on trying to get more examples in the messages as they go out or some way where you…for example, it would be great if you could just log into Webmaster Tools and see some examples there.” “What I would say is that if you have gotten that message, feel free to stop by the Webmaster Forum, and see if you can ask for any examples, and if there’s any Googlers hanging out on the forum, maybe we can check the specific spam incident, and see whether we might be able to post or provide an example of links within that thread,” Cutts concludes.
“But we’ll keep working on trying to improve things and making them more transparent.” How would you like to see Google approach this issue?

Jul 10 2013

Matt Cutts Talks About Site Downtime’s Impact On Rankings

Google has released a new Webmaster Help video with Matt Cutts addressing the question: If my site goes down for a day, does that affect my rankings? Sound familiar? I thought so too. Earlier this year, Cutts did a similar video addressing the question: How do I get my search rankings back after my site has been down? Here’s the new one: “Well, if it was just for a day, you should be in pretty good shape,” says Cutts. “You know, if your host is down for two weeks, then there’s a better indicator that the website is actually down, and we don’t want to send users to a website that’s actually down, but we do try to compensate for websites that are transiently or sporadically down, and you know, make a few allowances. We try to come back 24 hours later or something like that, so if it was only just a short period of down time, I wouldn’t really worry about that.” He adds that you might want to drop into the Google Webmaster forum and look around a little. He notes that there was recently a day where Googlebot itself was having trouble fetching pages. It usually has “pretty good reliability,” though, he says.

Jul 10 2013

Google On How Not To Do Guest Posts

Google’s view of guest blog posts has come up in industry conversation several times this week. As far as I can tell this started with an article at HisWebMarketing.com by Marie Haynes, and now Google’s Matt Cutts has been talking about it in a new interview with Eric Enge. Haynes’ post, titled, “Yes, high quality guest posts CAN get you penalized!” shares several videos of Googlers talking about the subject. The first is an old Matt Cutts Webmaster Help video that we’ve shared in the past. In that, Cutts basically said that it can be good to have a reputable, high quality writer do guest posts on your site, and that it can be a good way for some lesser-known writers to generate exposure, but… “Sometimes it gets taken to extremes. You’ll see people writing…offering the same blog post multiple times or spinning the blog posts, offering them to multiple outlets. It almost becomes like low-quality article banks.” “When you’re just doing it as a way to sort of turn the crank and get a massive number of links, that’s something where we’re less likely to want to count those links,” he said. The next video Haynes points to is a Webmaster Central Hangout from February: When someone in the video says they submit articles to the Huffington Post, and asks if they should nofollow the links to their site, Google’s John Mueller says, “Generally speaking, if you’re submitting articles for your website, or your clients’ websites and you’re including links to those websites there, then that’s probably something I’d nofollow because those aren’t essentially natural links from that website.” Finally, Haynes points to another February Webmaster Central hangout: In that one, when a webmaster asks if it’s okay to get links to his site through guest postings, Mueller says, “Think about whether or not this is a link that would be on that site if it weren’t for your actions there.
Especially when it comes to guest blogging, that’s something where you are essentially placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a linkbuilding point of view. I think sometimes it can make sense to guest blog on other people’s sites and drive some traffic to your site because people really liked what you are writing and they are interested in the topic and they click through that link to come to your website but those are probably the cases where you’d want to use something like a rel=nofollow on those links.” Barry Schwartz at Search Engine Land wrote about Haynes’ post, and now Enge has an interview out with Cutts, who elaborates more on Google’s philosophy when it comes to guest posts (among other things). Enge suggests that when doing guest posts, you create high-quality articles and get them published on “truly authoritative” sites that have a lot of editorial judgment, and Cutts agrees. He says, “The problem is that if we look at the overall volume of guest posting we see a large number of people who are offering guest blogs or guest blog articles where they are writing the same article and producing multiple copies of it and emailing out of the blue and they will create the same low quality types of articles that people used to put on article directory or article bank sites.” “If people just move away from doing article banks or article directories or article marketing to guest blogging and they don’t raise their quality thresholds for the content, then that can cause problems,” he adds. “On one hand, it’s an opportunity. On the other hand, we don’t want people to think guest blogging is the panacea that will solve all their problems.” Enge makes an interesting point about accepting guest posts too, suggesting that if you have to ask the author to share with their own social accounts, you shouldn’t accept the article. Again, Cutts agrees, saying, “That’s a good way to look at it.
There might be other criteria too, but certainly if someone is proud to share it, that’s a big difference than if you’re pushing them to share it.” Both agree that interviews are good ways to build links and authority. In a separate post on his Search Engine Roundtable blog, Schwartz adds: You can argue otherwise but if Google sees a guest blog post with a dofollow link and that person at Google feels the guest blog post is only done with the intent of a link, then they may serve your site a penalty. Or they may not – it depends on who is reviewing it. That being said, Google is not to blame. While guest blogging and writing is and can be a great way to get exposure for your name and your company name, it has gotten to the point of being heavily abused. He points to one SEO’s story in a Cre8asite forum thread about a site wanting to charge him nearly five grand for one post. Obviously this is the kind of thing Google would frown upon when it comes to link building and links that flow PageRank. Essentially, these are just paid links, and even if more subtle than the average advertorial (which Google has been cracking down on in recent months), in the end it’s still link buying. But there is plenty of guest blogging going on out there in which no money changes hands. Regardless of your intentions, it’s probably a good idea to just stick the nofollows on if you want to avoid getting penalized by Google. If it’s still something you want to do without the SEO value as a consideration, there’s a fair chance it’s the kind of content Google would want anyway.
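Mueller’s advice to nofollow guest post links comes down to a single attribute on the anchor tag. A sketch of a typical author bio (the name and URL below are illustrative, not from the interview):

```html
<!-- Author bio at the end of a guest post: rel="nofollow" tells
     Google not to pass PageRank through the link, so the post can
     still drive referral traffic without looking like link building. -->
<p>
  Jane Doe writes about personal finance at
  <a href="https://www.example.com/" rel="nofollow">example.com</a>.
</p>
```

Without `rel="nofollow"`, the same link passes PageRank and, per Schwartz’s point above, risks being read as a link placed “only with the intent of a link.”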

Jul 8 2013

Matt Cutts: People Searching By Voice Less Likely To Use Keywords

Google put out a pretty interesting Webmaster Help video today in which Matt Cutts discusses voice search’s impact on searcher behavior. In response to the question, “How has query syntax changed since voice search has become more popular?” Cutts talks about the trends that Google is seeing. “It’s definitely the case that if you have something coming in via voice, people are more likely to use natural language,” says Cutts. “They’re less likely to use like search operators and keywords and that sort of thing. And that’s a general trend that we see. Google wants to do better at conversational search, and just giving your answers directly if you’re asking in some sort of a conversational mode.” While search-by-voice is certainly a growing trend on mobile, Google, as you may know, recently launched its conversational search feature for the desktop, and improvements to that shouldn’t be far off. Cutts continues, “At some point, we probably have to change our mental viewpoint a little bit, because normally if you add words onto your query, you’re doing an ‘and’ between each of those words, and so as you do more and more words, you get fewer and fewer results, because fewer and fewer documents match those words. What you would probably want if you have spoken word queries is the more that you talk, the more results you get because we know more about it, and so you definitely have to change your viewpoint from ‘it’s an and of every single word’ to trying to extract the gist – you know, just summarize what they’re looking for, and then matching that overall idea.” Good luck trying to optimize for gist. “If you take it to a limit, you can imagine trying to do a query to Google using an entire document or you know, a thousand words or something like that,” Cutts adds. “And rather than match only the documents that had all thousand of those words, ideally, you’d say, ‘Okay, what is the person looking for?
Maybe they’re telling you an awful lot about this topic, but try to distill down what the important parts are, and search for that.’ And so it’s definitely the case that query syntax has changed. I think it will continue to change. You know, we allow people to query by images. You can search for related images by dragging and dropping a picture on Google Image Search. So people want to be able to search in all kinds of ways. They don’t want to think about keywords if they can avoid it, and I think over time, we’ll get better and better at understanding that user’s intent whenever we’re trying to match that up and find the best set of information or answers or documents – whatever it is the user’s looking for.” These days, Google is pretty hit and miss on the relevancy front when it comes to voice search, but I have no doubt that it will continue to improve rapidly. It’s already gotten significantly better than it was in earlier days.