Mar 31 2014

Cutts Talks Evaluating Quality Of Google Algorithm Changes

  • Posted in Web Pro News

In the latest “Webmaster Help” video from Google, Matt Cutts talks about what goes into evaluating the quality of an algorithm change before they push it out. Here’s the question he was answering:

What are some of the metrics that Google uses to evaluate whether one iteration of the ranking algorithm is delivering better quality results to users than another?

“Whenever an engineer is evaluating a new search quality change, and they want to know whether it’s an improvement, one thing that’s useful is we have hundreds of quality raters, who have previously rated URLs as good or bad, spam, all these sorts of different things,” says Cutts. “So when you make a change, you can see the flux. You can say, ‘What moves up? What moves down?’ And you can look at example searches where the results changed a lot, for example, and you can say, ‘Okay, given the changed search results, take the URLs that moved up – were those URLs typically higher rated than the URLs that moved down by the search quality raters?’”

“And sometimes, since these are pre-computed numbers (as far as the ratings – we’ve already got a stored data bank of all those ratings from all the raters that we have), sometimes you’ll have question marks or these empty areas where things haven’t been rated,” he continues.
“So you can also send that out to the raters, get the results of either a side-by-side, or you could look at the individual URLs, and in a side-by-side they say, ‘This set of search results is better,’ or ‘This set is better,’ or they might say, ‘This URL is good,’ or ‘This URL is spam,’ and you use all that to assess whether you’re making good progress.”

“If you get further along, and you’re getting close to wanting to launch something, oftentimes you’ll launch what’s called a ‘live experiment,’ where you actually take two different algorithms (say the old algorithm and the new algorithm), and you actually take results that would be generated by one, and then the other, and for example, you might interweave them,” Cutts continues. “Then, if there are more clicks on the newer set of search results, then you tend to say, ‘You know what? This newer set of search results generated by this algorithm might be a little bit better than this other algorithm.’ And that’s great, except for example, in webspam, people love to click on spam, and so sometimes our metrics look a little bit worse in webspam, because people click on the spam, and we’re like, ‘Well, we got less spam, and therefore it looks like people don’t like the new algorithm as much.’ So you have to take all of those ratings with a little bit of a grain of salt because nothing replaces your judgment, and the judgment of the quality launch committee, but we do have a lot of different metrics.”

Here’s a video Google put out a few years ago showing how it makes “improvements” to its algorithm.

Image via YouTube
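The live-experiment setup Cutts describes — interweaving results from two algorithms and comparing clicks — can be sketched roughly like this. This is a simplified illustration under stated assumptions, not Google's actual implementation; the function names and scoring rule are invented for the sketch.

```python
def interleave(results_a, results_b):
    """Alternate results from two rankers, skipping duplicates."""
    merged, seen = [], set()
    for a, b in zip(results_a, results_b):
        for url in (a, b):
            if url not in seen:
                seen.add(url)
                merged.append(url)
    return merged

def preferred_algorithm(clicks, results_a, results_b):
    """Credit each click to whichever ranker placed that URL higher."""
    score_a = score_b = 0
    for url in clicks:
        rank_a = results_a.index(url) if url in results_a else len(results_a)
        rank_b = results_b.index(url) if url in results_b else len(results_b)
        if rank_a < rank_b:
            score_a += 1
        elif rank_b < rank_a:
            score_b += 1
    return "A" if score_a > score_b else "B" if score_b > score_a else "tie"
```

Note the caveat Cutts raises: a raw click count like this would favor an algorithm that surfaces clickable spam, which is why the ratings and human judgment still matter.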

Mar 27 2014

Cutts: Google Is Trying To Focus Less On Penalties, More On Positive Stuff


So you know how Google penalized a site over one link from a guest post that was on a topic that Google didn’t think belonged on the site (even though the site owner felt it did, and most other people can see the natural fit in topic)? Danny Sullivan wrote an article about that, which this guy shared on Twitter, saying that Google penalties have “jumped the shark”. Matt Cutts responded:

@dtunkelang @dannysullivan But based on my experience looking at a lot of links and sites over the years, I'm pretty happy w/where we are. — Matt Cutts (@mattcutts) March 27, 2014

Danny also jumped in, and Matt again:

@dtunkelang @mattcutts all for that, too. said so in my article, even. just think less focus on penalties, more on reward might be better — Danny Sullivan (@dannysullivan) March 27, 2014

@dannysullivan I think Google overall is trying to focus less on penalties and more on proactive, positive stuff like natural language, — Matt Cutts (@mattcutts) March 27, 2014

Aaron Wall shared a survey on that note:

@mattcutts @dannysullivan survey says? — aaron wall (@aaronwall) March 27, 2014

Here’s what it’s showing now:

Maybe perception would be different if Google hadn’t stopped putting out those monthly lists of algorithm updates, which might have illustrated some of that natural language-type stuff more. Maybe.

Image via PollDaddy

Mar 27 2014

Cutts On How Google Views “Sister Sites”

In the latest “Webmaster Help” video from Google, Matt Cutts takes on the following question:

Is there any way Google identifies “sister” websites? For example, relationships between and Does linking from one to the other taken as paid or unnatural? And I’m strictly talking about good, genuine ccTLDs for businesses?

“It is the case that we try to interpret as best we can the relationships there are on the web,” he says. “At the same time it’s very helpful if you can tell us a little bit about what your sites are so that we can return the correct content to users regardless of which country they’re coming from. So let’s look at the spectrum. On one hand, you’ve got and, and we need to know that those are somehow related, and then on the other hand, we’ve got all the way down to somebody who has a hundred different websites all about medical malpractice or something like that.”

On the ccTLD case, he adds, “It is the case that we try to figure out that those sites are related, but we are doing the best we can, and if we get a little bit more help, then we can say, ‘Oh, this is a German user. They should get or’ If it’s a French user, they should get the .fr version…that sort of thing. So the best way to help is to use something called hreflang. You can do that inside of a webpage, where you can mark up, ‘Hey, on, a French version of this page is over here, and the German version of this page is over here,’ or if you don’t want to have that in all the different pages on your site, you can also make a sitemap. And you can just say, ‘Okay, over here is one version for a country, here’s another version for a country.’”

He says doing this is really helpful because Google tries to determine where users are coming from, what their language is, and then show them the best version of your page. If you tell Google what the right versions are, they’re less likely to screw it up.
He cautions that they might or might not trust links between any given sites on “any given basis.” For the most part, however, he says he wouldn’t worry about them being seen as paid or unnatural, because it’s pretty normal. He does advise against linking to all versions of the site in the footer because it looks spammy. I’m pretty sure he’s covered all this before.

When the sites aren’t about different languages or countries, and you have a bunch of sites, then he says you should be a lot more careful about your linking.

Image via YouTube
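The in-page form of the hreflang markup Cutts mentions is a set of `<link rel="alternate" hreflang="…">` tags in each page's `<head>`, one per language/country version. A minimal sketch of generating them — the domains and URLs below are placeholders, not real sites:

```python
# Placeholder alternate versions of one page; in Cutts' example these would
# be the .de, .fr, and .com versions of the same content.
ALTERNATES = {
    "de": "http://www.example.de/seite",
    "fr": "http://www.example.fr/page",
    "en": "http://www.example.com/page",
}

def hreflang_links(alternates):
    """Build the <link rel="alternate" hreflang=...> tags for a page's <head>."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    )
```

Each version of the page carries the full set of tags, which is exactly why Cutts notes the sitemap alternative: it keeps these annotations out of every page.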

Mar 26 2014

Matt Cutts Gives SEO Tip For Disavow Links Tool

Google’s Matt Cutts randomly tweeted a tip about the Disavow Links tool. Don’t delete your old file if you upload a new one because it “confuses folks,” and the last thing you’d want to do is confuse Google if you’re trying to fix problematic links. Here’s what he said exactly:

Quick SEO tip: no need to delete your old disavow links file before uploading a new one. You can just upload the new one. — Matt Cutts (@mattcutts) March 25, 2014

Some SEOs have been deleting the file first, then uploading new, which creates two emails, and sometimes that confuses folks. — Matt Cutts (@mattcutts) March 25, 2014

The Disavow Links tool has come up in the SEO conversation several times this month. In early March, we heard about Google’s “completely clear” stance on disavowing “irrelevant” links. Then, a couple weeks ago, Cutts said that you should go ahead and disavow links even if you haven’t been penalized in some cases. Later still, Google’s John Mueller said that Google doesn’t use data from the tool against the sites whose URLs are being disavowed.

Image via YouTube
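Per Cutts' tip, each upload replaces the previous file outright, so the usual workflow is to regenerate one complete file and re-upload it. The disavow file format Google documents is plain text: `#` comment lines, `domain:` prefixes for whole domains, and one URL per line. A small sketch of assembling one (the entries are made up):

```python
def build_disavow(domains, urls, note=""):
    """Assemble the text of a disavow file: optional comment, then
    domain: entries, then individual URLs, one per line."""
    lines = []
    if note:
        lines.append(f"# {note}")
    lines += [f"domain:{d}" for d in domains]
    lines += list(urls)
    return "\n".join(lines)
```

You would then upload the resulting file in Webmaster Tools without deleting the old one first, exactly as the tweet advises.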

Mar 26 2014

PSA: The Topics You Include On Your Blog Must Please Google

It’s no secret now that Google has launched an attack against guest blogging. Since penalizing MyBlogGuest earlier this month, Google has predictably reignited the link removal hysteria. More people are getting manual penalties related to guest posts. SEO Doc Sheldon got one specifically for running one post that Google deemed to not be on-topic enough for his site, even though it was about marketing. Maybe there were more, but that’s the one Google pointed out. The message he received (via Search Engine Roundtable) was:

Google detected a pattern of unnatural, artificial, deceptive or manipulative outbound links on pages on this site. This may be the result of selling links that pass PageRank or participating in link schemes.

He shared this in an open letter to Matt Cutts, Eric Schmidt, Larry Page, Sergey Brin, et al. Cutts responded to that letter with this:

@DocSheldon what "Best Practices For Hispanic Social Networking" has to do with an SEO copywriting blog? Manual webspam notice was on point. — Matt Cutts (@mattcutts) March 24, 2014

To which Sheldon responded:

@mattcutts My blog is about SEO, marketing, social media, web dev…. I'd say it has everything to do – or I wouldn't have run it — DocSheldon (@DocSheldon) March 25, 2014

Perhaps that link removal craze isn’t so irrational. Irrational on Google’s part, perhaps, but who can really blame webmasters for succumbing to Google’s pressure to dictate what content they run on their sites when they rely on Google for traffic, and ultimately business?

@mattcutts So we can take this to mean that just that one link was the justification for a sitewide penalty? THAT sure sends a message! — DocSheldon (@DocSheldon) March 25, 2014

Here’s the article in question. Sheldon admitted it wasn’t the highest quality post in the world, but also added that it wasn’t totally without value, and noted that it wasn’t affected by the Panda update (which is supposed to handle the quality part algorithmically).
I have a feeling that link removal craze is going to be ramping up a lot more. Ann Smarty, who runs MyBlogGuest, weighed in on the conversation:

I don't have the words RT @DocSheldon @mattcutts one link was the justification for a sitewide penalty? THAT sure sends a message! — Ann Smarty (@seosmarty) March 25, 2014

Image via YouTube

Mar 24 2014

Cutts On Determining If You Were Hit By An Algorithmic Penalty

If you’ve ever lost your search engine rankings to a competing site, you may have wondered if you were suffering from an algorithmic penalty from Google or if your content simply wasn’t as good as your competitors’. You’re not the only one. Google’s Matt Cutts takes on this question in the latest “Webmaster Help” video:

How can you tell if your site is suffering from an algorithmic penalty, or you are simply being outgunned by better content?

First he addresses manual penalties. Make sure that’s not what you’re dealing with by checking Webmaster Tools. You’ll get a notification if so, and then you can go from there. He also notes you can learn about crawl errors in WMT. Look for that kind of stuff. But if all that seems well and good, then you might want to think about the algorithm.

“It’s tough because we don’t think as much or really much at all about algorithmic ‘penalties,’” Cutts says. “Really, the webspam team writes all sorts of code, but that goes into the holistic ranking that we do, and so if you’re affected by one algorithm, you call it a penalty, and if you’re affected by another algorithm, do you not call it a penalty? It’s a pretty tough call to make, especially when the webspam team is working on more and more general quality changes – not necessarily things specifically related to webspam – and sometimes general quality people work on things that are related to webspam, and so deciding which one to call which is kind of hard to do.”

Webmasters might get a better idea of what exactly they’re dealing with if Google still provided its monthly lists of algorithm changes, but they think the world was “bored” with those, so they’re not putting them out anymore.

“We rolled out some 665 different changes to how we rank search results in 2012,” Cutts continues. “So on any given day, the odds that we’re rolling out some algorithmic change are pretty good.
In fact, we might be rolling out a couple if you just look at the raw number of changes that we’re doing. However, when we see an algorithmic change that we think will have a pretty big impact, we do try to give people a heads up about that. So for example, the Penguin algorithm, which is targeted towards webspam, or the Panda algorithm, which is targeted towards quality content on the web…whenever we have large-scale changes that will affect things, then we tend to do an announcement that ‘Oh yeah, this changed,’ or ‘You should look at this particular date,’ and that can be a good indicator to know whether you’re affected by one of those sort of jolting algorithms that has a big impact.”

Lately, they’ve mostly been announcing manual penalties, such as on link networks and on guest blogging sites.

He continues, “What you’ve seen is, for example, Panda has become more and more integrated into indexing, and it’s had less of a jolting impact, and in fact we’ve gotten it so that it changes the index on a pretty regular basis, and it’s built into the index rather than rolling out on a certain day, and so it’s less useful to announce or talk about Panda launches at this point, whereas Penguin is still a switch that flips or is something that starts rolling out at a discrete time, and so we’re a little more willing to talk about those, and let people know and have a little heads up, ‘Hey, you might be affected by the Penguin algorithm.’”

I think people would still be interested in knowing just when Panda is rearing its head, even if it’s getting “softer” in its old age. Again, even those monthly lists would be helpful. Cutts did say recently that Panda updates happen roughly once a month.

“In general, if your site is not ranking where you want it to rank, the bad news is it’s a little hard and difficult to say whether you’d call it a penalty or not. It’s just part of ranking,” he says.
“The good news is it is algorithmic, and so if you modify your site…if you change your site…if you apply your best guess about what the other site is doing that you should be doing or that it is doing well, then it’s always possible for the algorithms to re-score your site or for us to re-crawl and re-index the site, and for it to start ranking highly again. It’s kind of tricky because we have a large number of algorithms that all interact…”

A large number of algorithms that webmasters used to get hints about via monthly lists of algorithm updates that Google is no longer providing.

Image via YouTube

Mar 19 2014

How Googlebot Treats Multiple Breadcrumbs On E-Commerce Pages


Google has a new “Webmaster Help” video out about e-commerce pages with multiple breadcrumb trails. This is the second video in a row to deal specifically with e-commerce sites. Last time, Matt Cutts discussed product pages for products that are no longer available. This time, he takes on the following question:

Many of my items belong to multiple categories on my eCommerce site. Can I place multiple breadcrumbs on a page? Do they confuse Googlebot? Do you properly understand the logical structure of my site?

“It turns out, if you do breadcrumbs, we will currently pick the first one,” he says. “I would try to get things in the right category or hierarchy as much as you can, but that said, if an item does belong to multiple areas within your hierarchy, it is possible to go ahead and have multiple breadcrumbs on a page, and in fact that can, in some circumstances, actually help Googlebot understand a little bit more about the site.”

“But don’t worry about it if it only fits in one, or if you’ve only got breadcrumbs for one,” Cutts continues. “That’s the way that most people do it. That’s the normal way to do it. We encourage that, but if you do have the taxonomy (the category, the hierarchy), you know, and it’s already there, and it’s not like twenty different spots within your categories…if it’s in a few spots, you know, two or three or four…something like that, it doesn’t hurt to have those other breadcrumbs on the page. And we’ll take the first one. That’s our current behavior, and then we might be able to do a little bit of deeper understanding over time about the overall structure of your site.”

For more about how Google treats breadcrumbs, you might want to take a look at this page in Google’s webmaster help center. In fact, it even gives an example of a page having more than one breadcrumb trail (Books > Authors > Stephen King and Books > Fiction > Horror).

Image via YouTube
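The behavior Cutts describes — a page carries several breadcrumb trails, and only the first one in the markup is currently used — can be illustrated with a toy sketch; the trails below mirror the example from Google's help center:

```python
def pick_breadcrumb(trails):
    """Return the first breadcrumb trail on the page, mirroring the
    'we will currently pick the first one' behavior, or None if empty."""
    return trails[0] if trails else None

def render_trail(trail):
    """Format a trail the way it is typically displayed."""
    return " > ".join(trail)

# A product page listed under two parts of the site's taxonomy.
trails = [
    ["Books", "Fiction", "Horror"],
    ["Books", "Authors", "Stephen King"],
]
```

So ordering matters: whichever trail you put first in the page is the one that currently counts, even though the extra trails may still help Googlebot understand the site.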

Mar 19 2014

Google Takes Action On Guest Blogging

Google has been warning webmasters about spammy guest blogging for quite a while, but now, the search engine is getting serious. Head of webspam Matt Cutts tweeted early this morning that Google has taken action on a large guest blog network, and reminded people about “the spam risks of guest blogging”.

Today we took action on a large guest blog network. A reminder about the spam risks of guest blogging: — Matt Cutts (@mattcutts) March 19, 2014

That link points to a post from January on Matt’s personal blog where he proclaimed that “guest blogging is done.” He later clarified that he meant guest blogging specifically for SEO. He didn’t specify which network Google just took action on, but Pushfire CEO Rae Hoffman suggested that MyBlogGuest appears to be the “winner”.

It looks like MyBlogGuest was the "winner" – not appearing on branded terms RT @mattcutts Today we took action on a large guest blog network — Rae Hoffman (@sugarrae) March 19, 2014

@gcharlton @mattcutts @patrickaltoft examples as in? "branded terms" – they ranked for their name yesterday, they don't today… — Rae Hoffman (@sugarrae) March 19, 2014

Still, from where we’re sitting, the site is in the top three for its name, appearing only under its own Twitter and Facebook pages. There has been no update from MyBlogGuest on the topic so far this morning.

Update: Smarty has confirmed that the network has been penalized.

[Official] Even though #myblogguest has been against paying for links (unlike other platforms), @mattcutts team decided to penalize us… — Ann Smarty (@seosmarty) March 19, 2014

I don’t think our publishers will be penalized, but let’s ask @mattcutts — Ann Smarty (@seosmarty) March 19, 2014

The site promises on its homepage, “We don’t allow in any way to manipulate Google Rankings or break any Google rules.” It does promise bloggers a way to build links, which everyone knows is a key signal in Google’s ranking algorithm (Cutts recently said links are still “super important”).
Barry Schwartz at Search Engine Land points out that Ann Smarty, who owns MyBlogGuest, wrote a blog post after Cutts’ January post, saying her network wouldn’t nofollow links, so it does seem like a likely target. She wrote:

MyBlogGuest is NOT going to allow nofollow links or paid guest blogging (even though Matt Cutts seems to be forcing us to for whatever reason). Instead we will keep promoting the pure and authentic guest blogging concept we believe in.

She went on to note that she is an SEO who stopped depending on organic rankings a long time ago. “I believe in the Internet and its ability of giving little people (like myself) the power of being heard. I can say, I don’t care about Google,” she wrote. “I don’t think Google is THE Internet.”

She’s right, and one can’t help but admire her attitude, but one also can’t help but wonder how many of those utilizing the network have that attitude. It stands to reason that Google is going to be going after more of them the way it has been doing with other link networks. Google isn’t the Internet, but how many of the people spending time and effort writing guest blog posts are depending on it?

Update: Apparently Smarty does care about Google. Bill Hartzer writes that she told him before Cutts made the announcement, “I really hope that they don’t target MyBlogGuest. There are other guest blogging networks that should be targeted, such as PostJoint, a paid guest blogging network. MyBlogGuest is not a paid network.”

Image via YouTube

Mar 17 2014

What You Should Do For Google On Product Pages For Products That Are No Longer Available


Google has a new “Webmaster Help” video out, which many ecommerce businesses may find useful. Head of webspam Matt Cutts discusses what to do on your product pages for products that are no longer available. Specifically, he answers this user-submitted question:

How would Google recommend handling eCommerce products that are no longer available? (Does this change as the number of discontinued products outnumbers the active products?)

He runs down a few different types of cases. He begins, “It does matter based on how many products you have and really what the throughput of those products is, how long they last, how long they’re active before they become inactive. So let’s talk about like three examples. On one example, suppose you’re a handmade furniture manufacturer – like each piece you make you handcraft, it’s a lot of work – so you only have ten, fifteen, twenty pages of different couches and tables, and those sorts of shelves that you make. In the middle, you might have a lot more product pages, and then all the way on the end, suppose you’re craigslist, right? So you have millions and millions of pages, and on any given day, a lot of those pages become inactive because they’re no longer, you know, as relevant or because the listing has expired. So on the one side, when you have a very small number of pages (a small number of products), it probably is worth, not just doing a true 404, and saying, you know, this page is gone forever, but sort of saying, ‘Okay, if you are interested in this, you know, cherry wood shelf, well maybe you’d be interested in this mahogany wood shelf that I have instead,’ and sort of showing related products. And that’s a perfectly viable strategy. It’s a great idea whenever something is sort of a lot of work, you know, whenever you’re putting a lot of effort into those individual product pages.”

“Then suppose you’ve got your average e-commerce site. You’ve got much more than ten pages or twenty pages,” Cutts continues.
“You’ve got hundreds or thousands of pages. For those sorts of situations, I would probably think about just going ahead and doing a 404 because those products have gone away. That product is not available anymore, and you don’t want to be known as the product site that whenever you visit it, it’s like, ‘Oh yeah, you can’t buy this anymore.’ Because users get just as angry getting an out-of-stock message as they do a ‘no results found’ message when they think that they’re going to find reviews. Now if it’s going to come back in stock then you can make clear that it’s temporarily out of stock, but if you really don’t have that product anymore, it’s kind of frustrating to just land on that page and see, ‘Yep, you can’t get it here.’”

He goes on to discuss the Craigslist case a little more, noting that Google has a meta tag that sites can use called “unavailable_after”. Here’s the original blog post where Google announced it in 2007, which discusses it more. The tag basically tells Google that after a certain date, the page is no longer relevant, so Google won’t show it in search results after that.

Image via YouTube
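The three cases Cutts walks through can be sketched as a simple decision rule. This is only an illustration with made-up thresholds (the 50-product cutoff and function name are invented, not anything Google prescribes): tiny catalogs point shoppers at related items, average catalogs 404 discontinued products, and high-churn listing sites can mark expiring pages with the “unavailable_after” meta tag.

```python
def discontinued_product_response(catalog_size, expires=None):
    """Pick a response for a discontinued product page, following the
    small / average / craigslist-scale cases from the video."""
    if catalog_size < 50:
        # Handful of handcrafted products: keep the page, show alternatives.
        return "200: show related products instead"
    if expires:
        # Expiring listing: serve the page with an unavailable_after hint
        # so it drops out of results after the given date.
        return f'<meta name="googlebot" content="unavailable_after: {expires}">'
    # Average e-commerce site: the product is gone for good, so 404.
    return "404: product gone for good"
```

The date format shown in Google's 2007 announcement was an RFC-850-style timestamp (e.g. `25-Aug-2007 15:00:00 EST`); treat the exact string here as illustrative.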

Mar 14 2014

Google: Go Ahead And Disavow Links Even If You Haven’t Been Penalized


Google suggests you go ahead and use its Disavow Links tool if you know of bad links you have out there, even if Google has not penalized you. If you only have a couple, don’t worry about it, but the more you have, the more you’ll want to do it (again, according to Google). Here’s what Google’s head of webspam Matt Cutts told Rae Hoffman about it on Twitter:

@sugarrae only thing I'd add is if it's 1-2 links, may not be a big deal. The more it gets close to "lots," the more worthwhile it may be. — Matt Cutts (@mattcutts) March 13, 2014

Okay, if you “know” you have “bad” links, why not? The problem is that people often don’t know which ones are actually “bad,” and as we’ve seen in the past, people go hog-wild on getting backlinks removed just because they’re afraid Google won’t like them, regardless of whether or not there is evidence of this. Cutts also said when the Disavow Links tool came out that most people shouldn’t use it. Here he is talking about when you should worry about your links. Should you spend time analyzing your links and trying to remove ones you didn’t create that look spammy? The “simple answer is no.” And here’s Google’s “completely clear” stance on disavowing irrelevant links.

Image via YouTube