Dec 20 2013

Google Says It’s Now Working To ‘Promote Good Guys’

Google’s Matt Cutts says Google is “now doing work on how to promote good guys.” More specifically, Google is working on changes to its algorithm that will make it better at promoting content from people it considers authoritative on certain subjects.

You may recall that earlier this year Cutts put out the following video talking about things Google would be working on this year. In that, he said, “We have also been working on a lot of ways to help regular webmasters. We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”

Apparently that’s something Google is working on right now. Cutts appeared in a “This Week In Google” video (via Search Engine Land / transcript via Craig Moore) in which he said:

“We have been working on a lot of different stuff. We are actually now doing work on how to promote good guys. So if you are an authority in a space, if you search for podcasts, you want to return something like Twit.tv. So we are trying to figure out who are the authorities in the individual little topic areas and then how do we make sure those sites show up, for medical, or shopping or travel or any one of thousands of other topics. That is to be done algorithmically, not by humans … So PageRank is sort of this global importance. The New York Times is important, so if they link to you then you must also be important. But you can start to drill down in individual topic areas and say, okay, if Jeff Jarvis (professor of journalism) links to me, he is an expert in journalism, and so therefore I might be a little bit more relevant in the journalistic field. We’re trying to measure those kinds of topics. Because you know you really want to listen to the experts in each area if you can.”

For quite a while now, authorship has given Google an important signal about individuals as they relate to the content they’re putting out. Interestingly, Google is scaling authorship back a bit.

Image: YouTube
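To make the topic-specific authority idea in that quote a little more concrete, here is a toy illustration. It bears no resemblance to Google’s actual systems; the sites, topics, and weights are made-up assumptions, purely to show how a link from a topical expert could count more within that topic than a generic link does.

```python
# Toy illustration only: nothing like Google's real ranking systems.
# A link from a site that is an authority on a topic counts more *for that
# topic* than a generic link. All sites, topics, and weights are hypothetical.
from collections import defaultdict

LINKER_AUTHORITY = {
    "buzzmachine.com": {"journalism": 0.9},      # hypothetical weights
    "twit.tv": {"podcasts": 0.95, "tech": 0.7},
    "random-directory.example": {},              # no recognized topical authority
}

def topical_scores(inbound_links):
    """Sum the per-topic authority of every site linking to a page."""
    scores = defaultdict(float)
    for linker in inbound_links:
        for topic, weight in LINKER_AUTHORITY.get(linker, {}).items():
            scores[topic] += weight
    return dict(scores)

if __name__ == "__main__":
    print(topical_scores(["buzzmachine.com", "random-directory.example"]))
    # {'journalism': 0.9} -> this page looks a bit more relevant for journalism
```

The point of the sketch is only the contrast Cutts draws: a single global importance score versus a score that is broken out per topic area.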

Dec 19 2013

Google: Your Various ccTLDs Will Probably Be Fine From The Same IP Address

Ever wondered if Google would mind if you had multiple ccTLD sites hosted from a single IP address? If you’re afraid they might not take kindly to that, you’re in for some good news. It’s not really that big a deal. Google’s Matt Cutts may have just saved you some time and money with this one. He takes on the following submitted question in the latest Webmaster Help video:

For one customer we have about a dozen individual websites for different countries and languages, with different TLDs under one IP number. Is this okay for Google or do you prefer one IP number per country TLD?

“In an ideal world, it would be wonderful if you could have, for every different .co.uk, .com, .fr, .de, if you could have a different, separate IP address for each one of those, and have them each placed in the UK, or France, or Germany, or something like that,” says Cutts. “But in general, the main thing is, as long as you have different country code top-level domains, we are able to distinguish between them. So it’s definitely not the end of the world if you need to put them all on one IP address. We do take the top-level domain as a very strong indicator.”

“So if it’s something where it’s a lot of money or it’s a lot of hassle to set that sort of thing up, I wouldn’t worry about it that much,” he adds. “Instead, I’d just go ahead and say, ‘You know what? I’m gonna go ahead and have all of these domains on one IP address, and just let the top-level domain give the hint about what country it’s in.’ I think it should work pretty well either way.”

While on the subject, you might want to listen to what Cutts had to say about location and ccTLDs earlier this year in another video.
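If you are curious what “all of these domains on one IP address” looks like in practice, here is a minimal sketch of name-based virtual hosting, where one server picks the right country site from the Host header. The domain names and content are hypothetical, and this is just an illustration under those assumptions, not anything Google provides or recommends.

```python
# Minimal sketch: several ccTLD sites served from one IP address via
# name-based virtual hosting. Domains and content are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

SITES = {
    "example.co.uk": b"<html><body>UK storefront</body></html>",
    "example.fr": b"<html><body>Vitrine France</body></html>",
    "example.de": b"<html><body>Deutscher Shop</body></html>",
}

class VirtualHostHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The Host header says which country site was requested, even though
        # every domain resolves to the same IP address.
        host = (self.headers.get("Host") or "").split(":")[0].lower()
        body = SITES.get(host)
        if body is None:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), VirtualHostHandler).serve_forever()
```

In this setup the country signal comes from the top-level domain itself, which is exactly the hint Cutts says Google relies on.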

Dec 17 2013

Google Has A Lot To Say About Duplicate Content These Days

Duplicate content has been an issue in search engine optimization for many years now, yet there is still a lot of confusion around what you can and can’t do with it, in terms of staying on Google’s good side. In fact, even in 2013, Google’s head of webspam Matt Cutts has had to discuss the issue in several of his regular Webmaster Help videos because people keep asking questions and looking for clarification.

Do you believe your site has been negatively impacted by duplicate content issues in the past? If so, what were the circumstances? Let us know in the comments.

Back in the summer, Cutts talked about duplicate content with regard to disclaimers and Terms and Conditions pages.

“The answer is, I wouldn’t stress about this unless the content that you have is duplicated as spammy or keyword stuffing or something like that, you know, then we might be – an algorithm or a person might take action on – but if it’s legal boilerplate that’s sort of required to be there, we might, at most, might not want to count that, but it’s probably not going to cause you a big issue,” Cutts said at the time.

“We do understand that lots of different places across the web do need to have various disclaimers, legal information, terms and conditions, that sort of stuff, and so it’s the sort of thing where if we were to not rank that stuff well, then that would probably hurt our overall search quality, so I wouldn’t stress about it,” he said.

The subject of duplicate content came up again in September, when Cutts took on a question about e-commerce sites that sell products with “ingredients lists” exactly like other sites selling the same product.

Cutts said, “Let’s consider an ingredients list, which is like food, and you’re listing the ingredients in that food and ingredients like, okay, it’s a product that a lot of affiliates have an affiliate feed for, and you’re just going to display that. If you’re listing something that’s vital, so you’ve got ingredients in food or something like that – specifications that are 18 pages long, but are short specifications, that probably wouldn’t get you into too much of an issue. However, if you just have an affiliate feed, and you have the exact same paragraph or two or three of text that everybody else on the web has, that probably would be more problematic.”

“So what’s the difference between them?” he continued. “Well, hopefully an ingredients list, as you’re describing it as far as the number of components or something probably relatively small – hopefully you’ve got a different page from all the other affiliates in the world, and hopefully you have some original content – something that distinguishes you from the fly-by-night sites that just say, ‘Okay, here’s a product. I got the feed and I’m gonna put these two paragraphs of text that everybody else has.’ If that’s the only value add you have then you should ask yourself, ‘Why should my site rank higher than all these hundreds of other sites when they have the exact same content as well?’”

He went on to note that if the majority of your content is the same content that appears everywhere else, and there’s nothing else to say, that’s probably something you should avoid. It all comes down to whether or not there’s added value, which is something Google has pretty much always stood by, and is reaffirmed in a newer video. Cutts took on the subject once again this week.
This time, it was in response to this question: How does Google handle duplicate content and what negative effects can it have on rankings from an SEO perspective?

“It’s important to realize that if you look at content on the web, something like 25 or 30 percent of all of the web’s content is duplicate content,” said Cutts. “There’s man pages for Linux, you know, all those sorts of things. So duplicate content does happen. People will quote a paragraph of a blog, and then link to the blog. That sort of thing. So it’s not the case that every single time there’s duplicate content, it’s spam. If we made that assumption, the changes that happened as a result would end up, probably, hurting our search quality rather than helping our search quality.”

“So the fact is, Google looks for duplicate content and where we can find it, we often try to group it all together and treat it as if it’s one piece of content,” he continued. “So most of the time, suppose we’re starting to return a set of search results, and we’ve got two pages that are actually kind of identical. Typically we would say, ‘Okay, you know what? Rather than show both of those pages (since they’re duplicates), let’s just show one of those pages, and we’ll crowd the other result out.’ And if you get to the bottom of the search results, and you really want to do an exhaustive search, you can change the filtering so that you can say, okay, I want to see every single page, and then you’d see that other page.”

“But for the most part, duplicate content is not really treated as spam,” he said. “It’s just treated as something that we need to cluster appropriately. We need to make sure that it ranks correctly, but duplicate content does happen. Now, that said, it’s certainly the case that if you do nothing but duplicate content, and you’re doing it in an abusive, deceptive, malicious or manipulative way, we do reserve the right to take action on spam.”

He mentions that someone on Twitter was asking how to do an RSS autoblog to a blog site, and not have that be viewed as spam.

“The problem is that if you are automatically generating stuff that’s coming from nothing but an RSS feed, you’re not adding a lot of value,” said Cutts. “So that duplicate content might be a little more likely to be viewed as spam. But if you’re just making a regular website, and you’re worried about whether you have something on the .com and the .co.uk, or you might have two versions of your Terms and Conditions – an older version and a newer version – or something like that. That sort of duplicate content happens all the time on the web, and I really wouldn’t get stressed out about the notion that you might have a little bit of duplicate content. As long as you’re not trying to massively copy for every city and every state in the entire United States, show the same boilerplate text… for the most part, you should be in very good shape, and not really have to worry about it.”

In case you’re wondering, quoting is not considered duplicate content in Google’s eyes. Cutts spoke on that late last year. As long as you’re just quoting, using an excerpt from something, and linking to the original source in a fair use kind of way, you should be fine. Doing this with entire articles (which happens all the time) is of course a different story.

Google, as you know, designs its algorithms to abide by its quality guidelines, and duplicate content is part of that, so this is something you’re always going to have to consider.
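To picture the “group it all together and show one” behavior Cutts describes, here is a toy sketch. It is nothing like Google’s actual systems; it simply clusters pages whose normalized text is identical and surfaces one URL per cluster. The URLs and content are made up.

```python
# Toy illustration (not Google's actual system): group near-identical pages by
# a normalized content hash and keep one representative per cluster.
import hashlib
import re

def fingerprint(html_text):
    """Collapse whitespace and case so trivially different copies hash the same."""
    normalized = re.sub(r"\s+", " ", html_text).strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def cluster_duplicates(pages):
    """Map each content fingerprint to the list of URLs that share it."""
    clusters = {}
    for url, text in pages.items():
        clusters.setdefault(fingerprint(text), []).append(url)
    return clusters

if __name__ == "__main__":
    pages = {
        "http://example.com/terms": "Terms and Conditions ...",
        "http://example.co.uk/terms": "Terms and   Conditions ...",
        "http://example.com/about": "About our company.",
    }
    for urls in cluster_duplicates(pages).values():
        representative, *crowded_out = sorted(urls)
        print("show:", representative, "| crowded out:", crowded_out)
```

The takeaway mirrors the quote: the duplicates are not punished, they are consolidated, and one version is shown unless you deliberately ask to see them all.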
It says right in the guidelines, “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.” They do, however, offer steps you can take to address any duplicate content issues that you do have. These include using 301s, being consistent, using top-level domains, syndicating “carefully,” using Webmaster Tools to tell Google how you prefer your site to be indexed, minimizing boilerplate repetition, avoiding publishing stubs (empty pages, placeholders), understanding your content management system and minimizing similar content.

Google advises against blocking its crawler from accessing your duplicate content, though, so think about that too. If it can’t crawl those pages, it won’t be able to detect when URLs point to the same content, and will have to treat them as separate pages. Use the canonical link element instead. (There’s a small sketch of a couple of these remedies at the end of this post.)

Have you been affected by how Google handles duplicate content in any way? Please share.
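As a rough illustration of two of the remedies listed above, the sketch below serves one preferred URL, 301-redirects known duplicate URLs to it, and marks the page with a rel="canonical" link. The URLs are hypothetical and this is not Google-provided code, just one way those two steps can look.

```python
# Minimal sketch: a 301 redirect from duplicate URLs to the preferred URL,
# plus a rel="canonical" hint on the page that stays. URLs are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL = "https://www.example.com/product"
DUPLICATE_PATHS = {"/product?ref=affiliate", "/Product", "/product/index.html"}

class DedupeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in DUPLICATE_PATHS:
            # Permanently redirect known duplicate URLs to the preferred one.
            self.send_response(301)
            self.send_header("Location", CANONICAL)
            self.end_headers()
            return
        body = (
            "<html><head>"
            f'<link rel="canonical" href="{CANONICAL}">'
            "</head><body>Product page</body></html>"
        ).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), DedupeHandler).serve_forever()
```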

Dec 16 2013

The Latest From Google On Guest Blogging

The subject of guest blogging has been coming up more and more lately in Google’s messaging to webmasters. Long story short: just don’t abuse it. Matt Cutts talked about it in response to a submitted question in a recent Webmaster Help video.

He said, “It’s clear from the way that people are talking about it that there are a lot of low-quality guest blogger sites, and there’s a lot of low-quality guest blogging going on. And anytime people are automating that or abusing that or really trying to make a bunch of links without really doing the sort of hard work that really earns links on the basis of merit or because they’re editorial, then it’s safe to assume that Google will take a closer look at that.”

“I wouldn’t recommend that you make it your only way of gathering links,” Cutts added. “I wouldn’t recommend that you send out thousands of blast emails offering to guest blog. I wouldn’t recommend that you guest blog with the same article on two different blogs. I wouldn’t recommend that you take one article and spin it lots of times. There’s definitely a lot of abuse and growing spam that we see in the guest blogging space, so regardless of the spam technique that people are using from month to month, we’re always looking at things that are starting to be more and more abused, and we’re always willing to respond to that and take the appropriate action to make sure that users get the best set of search results.”

But you already knew that, right?

Dec 16 2013

Google Goes After Yet Another Link Network

Earlier this month, Google revealed that it would be cracking down on more link networks, following a larger trend that has been taking place throughout the year. Google’s Matt Cutts hinted on Twitter that Google was taking action on the network Anglo Rank.

"There are absolutely NO footprints linking the websites together" Oh, Anglo Rank.
— Matt Cutts (@mattcutts) December 6, 2013

He went on to note that they’d be “rolling up a few.” On Friday, Cutts tweeted similarly:

"Our installation code/software used to publish the sold links is not detectable by the search engine bots." Au contraire!
— Matt Cutts (@mattcutts) December 13, 2013

This was apparently in reference to another network, BackLinks.com. No surprises really, but Google is making it quite clear that it’s going to continue to penalize these types of sites.

Hat tip to Search Engine Land.

Image: BackLinks.com

Dec 9 2013

Google Gives Advice On Speedier Penalty Recovery

Google has shared some advice in a new Webmaster Help video about recovering from Google penalties incurred as the result of a period of spammy links. Now, as we’ve seen, sometimes this happens to a company unintentionally. A business could have hired the wrong person/people to do their SEO work, and gotten their site banished from Google, without even realizing they were doing anything wrong. Remember when Google had to penalize its own Chrome landing page because a third-party firm bent the rules on its behalf?

Google is cautiously suggesting “radical” actions from webmasters, and sending a bit of a mixed message. How far would you go to get back in Google’s good graces? How important is Google to your business’ survival? Share your thoughts in the comments.

The company’s head of webspam, Matt Cutts, took on the following question:

How did Interflora turn their ban in 11 days? Can you explain what kind of penalty they had, how did they fix it, as some of us have spent months try[ing] to clean things up after an unclear GWT notification.

As you may recall, Interflora, a major UK flowers site, was hit with a Google penalty early this year. Google didn’t exactly call out the company publicly, but after reports of the penalty came out, the company mysteriously wrote a blog post warning people not to engage in the buying and selling of links. But you don’t have to buy and sell links to get hit with a Google penalty for webspam, and Cutts’ response goes beyond that. He declines to discuss a specific company because that’s typically not Google’s style, but proceeds to try and answer the question in more general terms.

“Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines, and if we see that happening multiple times – repeated times – then the actions that we take get more and more severe, so we’re more willing to take stronger action whenever we see repeat violations,” he says.

That’s the first thing to keep in mind if you’re trying to recover. Don’t try to recover by breaking the rules more, because that will just make Google’s vengeance all the greater when it inevitably catches you. Google continues to bring the hammer down on any black hat link network it can get its hands on, by the way. Just the other day, Cutts noted that Google has taken out a few of them, following a larger trend that has been going on throughout the year.

The second thing to keep in mind is that Google wants to know you’re taking its guidelines seriously, and that you really do want to get better – you really do want to play by the rules.

“If a company were to be caught buying links, it would be interesting if, for example, you knew that it started in the middle of 2012, and ended in March 2013 or something like that,” Cutts continues in the video. “If a company were to go back and disavow every single link that they had gotten in 2012, that’s a pretty monumentally epic, large action. So that’s the sort of thing where a company is willing to say, ‘You know what?
We might have had good links for a number of years, and then we just had really bad advice, and somebody did everything wrong for a few months – maybe up to a year, so just to be safe, let’s just disavow everything in that timeframe.’ That’s a pretty radical action, and that’s the sort of thing where if we heard back in a reconsideration request that someone had taken that kind of a strong action, then we could look, and say, ‘Okay, this is something that people are taking seriously.’”

Now, don’t go getting carried away. Google has been pretty clear since the Disavow Links tool launched that this isn’t something that most people want to do.

Cutts reiterates, “So it’s not something that I would typically recommend for everybody – to disavow every link that you’ve gotten for a period of years – but certainly when people start over with completely new websites they bought – we have seen a few cases where people will disavow every single link because they truly want to get a fresh start. It’s a nice looking domain, but the previous owners had just burned it to a crisp in terms of the amount of webspam that they’ve done. So typically what we see from a reconsideration request is people starting out, and just trying to prune a few links. A good reconsideration request is often using the ‘domain:’ query, and taking out large amounts of domains which have bad links.”

“I wouldn’t necessarily recommend going and removing everything from the last year or everything from the last year and a half,” he adds. “But that sort of large-scale action, if taken, can have an impact whenever we’re assessing a domain within a reconsideration request.”

In other words, if you’re willing to go to such great lengths and eliminate such a big number of links, Google’s going to notice. I don’t know that it’s going to get you out of the penalty box in eleven days (as the Interflora question mentions), but it will at least show Google that you mean business, and, in theory at least, help you get out of it.

Much of what Cutts has to say this time around echoes things he has mentioned in the past. Earlier this year, he suggested using the Disavow Links tool like a “machete”. He noted that Google sees a lot of people trying to go through their links with a fine-toothed comb, when they should really be taking broader swipes.

“For example, often it would help to use the ‘domain:’ operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links,” he said. “That’s one reason why we sometimes see it take a while to clean up those old, not-very-good links.”

On another occasion, he discussed some common mistakes he sees people making with the Disavow Links tool. The first time someone attempts a reconsideration request, they often take the scalpel (or “fine-toothed comb”) approach, rather than the machete approach.

“You need to go a little bit deeper in terms of getting rid of the really bad links,” he said. “So, for example, if you’ve got links from some very spammy forum or something like that, rather than trying to identify the individual pages, that might be the opportunity to do a ‘domain:’. So if you’ve got a lot of links that you think are bad from a particular site, just go ahead and do ‘domain:’ and the name of that domain. Don’t maybe try to pick out the individual links because you might be missing a lot more links.”

And remember, you need to make sure you’re using the right syntax.
You need to use the “domain:” query in the following format:

domain:example.com

Don’t add an “http” or a “www” or anything like that. Just the domain. (For one way a file like that might be put together, see the sketch at the end of this post.)

So, just to recap: radical, large-scale actions could be just what you need to take to make Google seriously reconsider your site, and could get things moving more quickly than trying to single out individual links from domains. But Google wouldn’t necessarily recommend doing it. Oh, Google. You and your crystal clear, never-mixed messaging.

As Max Minzer commented on YouTube (or is that Google+?), “everyone is going to do exactly that now…unfortunately.” Yes, this advice will no doubt lead many to unnecessarily obliterate many of the backlinks they’ve accumulated – including legitimate links – for fear of Google. Fear they won’t be able to make that recovery at all, let alone quickly. Hopefully the potential for overcompensation will be considered if Google decides to use Disavow Links as a ranking signal.

Would you consider having Google disavow all links from a year’s time? Share your thoughts in the comments.
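Here is the sketch mentioned above: a small, hypothetical helper (not a Google tool) that builds a disavow file using the “domain:” syntax described in this post, stripping any scheme or “www.” prefix so only the bare domain is submitted. The domains in the example are made up.

```python
# Hypothetical helper (not a Google tool): build a disavow file using the
# "domain:" syntax, stripping any scheme or "www." prefix as described above.
from urllib.parse import urlparse

def to_disavow_line(entry):
    """Turn 'http://www.example.com/spammy-page' into 'domain:example.com'."""
    entry = entry.strip()
    host = urlparse(entry).netloc if "//" in entry else entry.split("/")[0]
    host = host.split(":")[0]            # drop any port
    if host.startswith("www."):
        host = host[len("www."):]        # no "www", just the domain
    return f"domain:{host}"

def write_disavow_file(bad_links, path="disavow.txt"):
    lines = sorted({to_disavow_line(link) for link in bad_links})
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(lines) + "\n")

if __name__ == "__main__":
    write_disavow_file([
        "http://www.spammy-forum.example/thread/123",
        "spammy-directory.example/listing?id=9",
    ])
    # disavow.txt now contains lines like:
    #   domain:spammy-directory.example
    #   domain:spammy-forum.example
```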

Dec 7 2013

Google Takes Action On More Link Networks

Google has been cracking down on link networks, penalizing the networks and the sites that take advantage of them to artificially inflate their link profiles, all year. Google’s Matt Cutts hinted on Twitter that the search engine has taken action on yet another one – Anglo Rank:

"There are absolutely NO footprints linking the websites together" Oh, Anglo Rank.
— Matt Cutts (@mattcutts) December 6, 2013

While engaging with the Search Engine Land crew on Twitter, he noted they’ve been “rolling up a few”.

A giant ad for AngloRank can be seen at BlackHatWorld (h/t: Search Engine Land). It promises “high PR English links from the most exclusive and unique private networks on the web.” Back in May, Cutts announced that Google would continue to tackle link networks, and that in fact, they had just taken action on “several thousand linksellers”. More recently, Google took out the link network GhostRank 2.0.

The moral of the story is: stay away from these networks, because Google will figure it out, and make you pay. But you already knew that.

Image via BlackHatWorld

Dec 6 2013

Google Updates Toolbar PageRank After All

Just when you thought you were out, they’ve pulled you back in. Google has updated its data for Toolbar PageRank, after giving indication that it likely wouldn’t happen before the end of the year, if at all. Many of us assumed it was pretty much going away, because it had been so long since it was last updated, after years of regularity.

Reactions to the update are mixed. Some are happy to see the new(er) data, while others wish it would just go away once and for all. As those in the SEO industry have known for years, the data simply isn’t that useful as a day-to-day tool, mainly due to the time that passes between updates. Yet others obsess about it.

This is the first time Google has updated PageRank since February. Historically, they’ve updated it every three or four months. Google’s Matt Cutts tweeted in October that he’d be surprised if there was another PR update before 2014.

Shortly after that, Cutts discussed the topic in a Webmaster Help video: “Over time, the Toolbar PageRank is getting less usage just because recent versions of Internet Explorer don’t really let you install toolbars as easily, and Chrome doesn’t have the toolbar, so over time, the PageRank indicator will probably start to go away a little bit,” he said.

In another video earlier in the year, he said, “Maybe it will go away on its own or eventually we’ll reach the point where we say, ‘Okay, maintaining this is not worth the amount of work.’”

Apparently that time has not come yet, as we thought it probably had. Are you happy to see the PR update? Should Google continue to update this information?

Dec 4 2013

Matt Cutts Talks Content Stitching In New Video

Google has a new Webmaster Help video out about content that takes text from other sources. Specifically, Matt Cutts responds to this question:

Hi Matt, can a site still do well in Google if I copy only a small portion of content from different websites and create my own article by combining it all, considering I will mention the source of that content (by giving their URLs in the article)?

“Yahoo especially used to really hate this particular technique,” says Cutts. “They called it ‘stitching’. If it was like two or three sentences from one article, and two or three sentences from another article, and two or three sentences from another article, they really considered that spam. If all you’re doing is just taking quotes from everybody else, that’s probably not a lot of added value. So I would really ask yourself: are you doing this automatically? Why are you doing this? Why? People don’t just like to watch a clip show on TV. They like to see original content.”

I don’t know. SportsCenter is pretty popular, and I don’t think it’s entirely for all the glowing commentary. It’s also interesting that he’s talking about this from Yahoo’s perspective.

“They don’t just want to see an excerpt and one line, and then an excerpt and one line, and that sort of thing,” Cutts continues. “Now it is possible to pull together a lot of different sources, and generate something really nice, but you’re usually synthesizing. For example, Wikipedia will have stuff that’s notable about a particular topic, and they’ll have their sources noted, and they cite all of their sources there, and they synthesize a little bit, you know. It’s not like they’re just copying the text, but they’re sort of summarizing or presenting as neutral of a case as they can. That’s something that a lot of people really enjoy, and if that’s the sort of thing that you’re talking about, that would probably be fine, but if you’re just wholesale copying sections from individual articles, that’s probably going to be a higher risk area, and I might encourage you to avoid that if you can.”

If you’re creating good content that serves a valid purpose for your users, my guess is that you’ll be fine, but you know Google hates anything automated when it comes to content.