May 22 2015

Google Has Replaced Matt Cutts

It looks like Matt Cutts has been officially replaced as the head of web spam at Google. We don’t know who his replacement is, and we might not anytime soon, but the company has confirmed the change nevertheless.

July will mark the one-year anniversary of Cutts announcing that he was taking leave from Google. The leave was originally supposed to last at least through October of last year. At the time, he wrote on his personal blog:

I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work. So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

At the end of October, Cutts revealed in a tweet that he was extending his leave into 2015:

I'm planning to extend my leave into 2015: https://t.co/T5adq50x4L — Matt Cutts (@mattcutts) November 1, 2014

In November, Cutts made some comments on a web chat show indicating that he might be interested in doing different work at Google when he decides to go back.

Search Engine Land is now reporting that Google has someone new in the Matt Cutts position of head of web spam, but that this person won’t be “the all-around spokesperson” that Cutts was, so they’re not naming who it is. Danny Sullivan writes:

Going forward, Google says to continue to expect what’s already been happening while Cutts has been away. Various individual Googlers will keep splitting the role of providing advice and answers to SEOs and publishers in online forums, conferences and other places.

So far, webmaster trends analyst John Mueller has been the most publicly visible voice on webmaster issues for Google, regularly hosting lengthy webmaster hangouts and discussing various Google updates.

Matt’s Twitter bio still says, “I’m the head of the webspam team at Google. (Currently on leave).”

Image via YouTube

Jul 7 2014

Matt Cutts Is Disappearing For A While

Just ahead of the holiday weekend, Google’s head of webspam Matt Cutts announced that he is taking leave from Google through at least October, which means we shouldn’t be hearing from him (at least about Google) for three months or so. That’s a significant amount of time when you consider how frequently Google makes announcements and changes things up. Is the SEO industry ready for three Matt Cutts-less months?

Cutts explains on his personal blog:

I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work. So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

He says he won’t be checking his work email at all while he’s on leave, but will have some of his outside email forwarded to “a small set of webspam folks,” noting that they won’t be replying.

Cutts is a frequent Twitter user, and didn’t say whether or not he’ll be staying off there, but either way, I wouldn’t expect him to tweet much about search during his leave. If you need to reach Google about a matter you would typically have taken to Matt Cutts, he suggests webmaster forums, Office Hours Hangouts, the Webmaster Central Twitter account, the Google Webmasters Google+ account, or trying other Googlers.

He did recently pin this tweet from 2010 to the top of his timeline:

When you've got 5 minutes to fill, Twitter is a great way to fill 35 minutes. — Matt Cutts (@mattcutts) May 11, 2010

So far, he hasn’t stopped tweeting, but his latest – from six hours ago – is just about his leave:

I got my inbox down to zero for a shiny moment, then unpinned and closed the tab with work email: http://t.co/o7zBOvskBE — Matt Cutts (@mattcutts) July 7, 2014

That would seem to suggest he doesn’t plan to waste much of his time off on Twitter.

So what will Matt be doing while he’s gone? Taking a ballroom dance class with his wife, trying a half-Ironman race, and going on a cruise. He says they might also do some additional traveling ahead of their fifteen-year wedding anniversary, and will spend more time with their parents.

Long story short, leave Cutts alone. He’s busy.

Image via YouTube

May 30 2014

Google’s Transparency Called Into Question Again

Though it’s back in Google’s results now, another company is making headlines for being penalized by Google. This time it’s Vivint, which produces smart thermostats and competes with Nest, which Google acquired earlier this year.

PandoDaily’s James Robinson wrote an article about it, noting that Vivint had received warnings from Google about external links that didn’t comply with its quality guidelines, but that Google didn’t confirm what the links were. Rather, the company was “left to fish in the dark to figure out what it had done to upset its rival.”

As Robinson correctly noted, Rap Genius was removed from Google’s search results last year for violating guidelines, and was back in business within two weeks. At the time, Google was accused by some of employing a double standard for letting the site recover so quickly compared to others.

Google’s Matt Cutts had some comments about the Pando article on Hacker News. He wrote:

It’s a shame that Pando’s inquiry didn’t make it to me, because the suggestion that Google took action on vivint.com because it was somehow related to Nest is silly. As part of a crackdown on a spammy blog posting network, we took action on vivint.com–along with hundreds of other sites at the same time that were attempting to spam search results. We took action on vivint.com because it was spamming with low-quality or spam articles…

He listed several example links, and continued:

and a bunch more links, not to mention 25,000+ links from a site with a paid relationship where the links should have been nofollowed. When we took webspam action, we alerted Vivint via a notice in Webmaster Tools about unnatural links to their site. And when Vivint had done sufficient work to clean up the spammy links, we granted their reconsideration request. This had nothing whatsoever to do with Nest. The webspam team caught Vivint spamming. We held them (along with many other sites using the same spammy guest post network) accountable until they cleaned the spam up. That’s all.

He said later in the thread that Google “started dissecting” the guest blog posting network in question in November, noting that Google didn’t acquire Nest until January. In case you’re wondering when acquisition talks began, Cutts said, “You know Larry Page doesn’t have me on speed dial for companies he’s planning to buy, right? No one involved with this webspam action (including me) knew about the Nest acquisition before it was publicly announced.”

“Vivint was link spamming (and was caught by the webspam team for spamming) before Google even acquired Nest,” he said.

Robinson, in a follow-up article, takes issue with Cutts calling Pando’s reporting “silly,” and mockingly says Cutts “wants you to know Google is totally transparent.” Here’s an excerpt:

“It’s a shame that Pando’s inquiry didn’t make it to me,” Cutts writes, insinuating we didn’t contact the company for comment. Pando had in fact reached out to Google’s press team and consulted in detail with the company spokesperson who was quoted in our story. It is now clear why Google didn’t pass on our questions to Cutts.

He goes on to say that Cutts’ assessment of Vivint’s wrongdoing is “exactly what we described in our article — no one is disputing that Vivint violated Google’s search rules.” He also calls Cutts’ comments “a slightly simplistic version of events, given the months-long frustration Vivint spoke of in trying to fix the problem.” Robinson concludes the article:

The point of our reporting is to highlight the unusual severity of the punishment (locked out for months, completely delisted from results until this week) given Vivint’s relationship to a Google-owned company and the lack of transparency Google offers in assisting offending sites. Multiple sources at Vivint told us that the company was told that it had “unnatural links” but was left to guess at what these were, having to repeatedly cut content blindly and ask for reinstatement from Google, until it hit upon the magic recipe. To these charges, Cutts has no answer. That’s a shame.

Now, I’m going to pull an excerpt from an article of my own from November because it seems highly relevant here:

Many would say that Google has become more transparent over the years. It gives users, businesses and webmasters access to a lot more information about its intentions and business practices than it did long ago, but is it going far enough? When it comes to its search algorithm and changes to how it ranks content, Google has arguably scaled back a bit on the transparency over the past year or so. Google, as a company, certainly pushes the notion that it is transparent. Just last week, Google updated its Transparency Report for the eighth time, showing government requests for user information (which have doubled over three years, by the way). That’s one thing. For the average online business that relies on Internet visibility for customers, however, these updates are of little comfort. A prime example of where Google has reduced its transparency is the monthly lists of algorithm changes it used to put out, but stopped. Cutts said the “world got bored” with those. Except it really didn’t as far as we can tell.

Image via YouTube

May 27 2014

Google: Panda 4.0 Brings in ‘Softer Side,’ Lays Groundwork For Future

Back in March, Google’s Matt Cutts spoke at the Search Marketing Expo, and said that Google was working on the next generation of the Panda update, which he said would be softer and more friendly to small sites and businesses. Last week, Google pushed Panda 4.0, which Cutts reiterated is a bit softer than previous versions, and also said will “lay the groundwork” for future iterations.

@Marie_Haynes think of it like P4 is a new architecture. Brings in some of the softer side, but also lays groundwork for future iteration. — Matt Cutts (@mattcutts) May 23, 2014

Barry Schwartz at SMX sister site Search Engine Land, who was in attendance at the session in which Cutts spoke about the update, gave a recap of his words at the time:

Cutts explained that this new Panda update should have a direct impact on helping small businesses do better. One Googler on his team is specifically working on ways to help small web sites and businesses do better in the Google search results. This next generation update to Panda is one specific algorithmic change that should have a positive impact on the smaller businesses.

It’s interesting that Google even announced the update at all, as it had pretty much stopped letting people know when new Panda refreshes were launched. The world is apparently not bored enough with Panda updates for Google to stop announcing them entirely.

Here’s a look at Searchmetrics’ attempt to identify the top winners and losers of Panda 4.0.

Image via YouTube

May 21 2014

Google Launches Two Algorithm Updates Including New Panda

Google makes changes to its algorithm every day (sometimes multiple changes in one day). When the company actually announces them, you know they’re bigger than the average update, and when one of them is named Panda, it’s going to get a lot of attention.

Have you been affected either positively or negatively by new Google updates? Let us know in the comments.

Google’s head of webspam Matt Cutts tweeted about the updates on Tuesday night:

Google is rolling out our Panda 4.0 update starting today. — Matt Cutts (@mattcutts) May 20, 2014

This past weekend we started rolling out a ranking update for very spammy queries: http://t.co/NpUZRqpnBI — Matt Cutts (@mattcutts) May 21, 2014

Panda has been refreshed on a regular basis for quite some time now, and Google has indicated in the past that it no longer requires announcements because of that. At one point, it was actually softened. But now we have a clear announcement and a new version number (4.0), so it must be significant. For one thing, this indicates that the algorithm was actually updated as opposed to just refreshed, opening up the possibility of some big shuffling of rankings. The company told Search Engine Land that the new Panda affects different languages to different degrees, and impacts roughly 7.5% of queries in English to a degree regular users might notice.

The other update is a new version of what is sometimes referred to as the “payday loans” update. The first one was launched just a little more than a year ago. Cutts discussed it in this video before launching it:

“We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’ on Google.co.uk,” he said. “So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”

He also discussed it at SMX Advanced last year. As Barry Schwartz reported at the time:

Matt Cutts explained this goes after unique link schemes, many of which are illegal. He also added this is a world-wide update and is not just being rolled out in the U.S. but being rolled out globally. This update impacted roughly 0.3% of the U.S. queries but Matt said it went as high as 4% for Turkish queries where web spam is typically higher.

That was then. This time, according to Schwartz, who has spoken with Cutts, it impacts English queries by about 0.2% to a noticeable degree.

Sites are definitely feeling the impact of Google’s new updates. Here are a few comments from various webmasters on the WebmasterWorld forum:

We’ve seen a nice jump in Google referrals and traffic over the past couple of days, with the biggest increase on Monday (the announced date of the Panda 4.0 rollout). Our Google referrals on Monday were up by 130 percent….

… I am pulling out my hair. I’ve worked hard the past few months to overcome the Panda from March and was hoping to come out of it with the changes I made. Absolutely no change at all in the SERPS. I guess I’ll have to start looking for work once again.

… While I don’t know how updates are rolled out, my site that has had Panda problems since April 2011 first showed evidence of a traffic increase at 5 p.m. (central, US) on Monday (5/19/2014).

… This is the first time I have seen a couple sites I deal with actually get a nice jump in rankings after a Panda…

It appears that eBay has taken a hit. Dr. Peter J. Meyers at Moz found that eBay lost rankings on a variety of keywords, and that the main eBay subdomain fell out of Moz’s “Big 10,” its metric of the ten domains with the most real estate in the top 10.

“Over the course of about three days, eBay fell from #6 in our Big 10 to #25,” he writes. “Change is the norm for Google’s SERPs, but this particular change is clearly out of place, historically speaking. eBay has been #6 in our Big 10 since March 1st, and prior to that primarily competed with Twitter.com for either the #6 or #7 place. The drop to #25 is very large. Overall, eBay has gone from right at 1% of the URLs in our data set down to 0.28%, dropping more than two-thirds of the ranking real-estate they previously held.”

He goes on to highlight specific key phrases where eBay lost rankings. It lost two top ten rankings for three separate phrases: “fiber optic christmas tree,” “tongue rings,” and “vermont castings”. Each of these, according to Meyers, was a category page on eBay. eBay also fell out of the top ten, according to this report, for queries like “beats by dr dre,” “honeywell thermostat,” “hooked on phonics,” “batman costume,” “lenovo tablet,” “george foreman grill,” and many others.

It’s worth noting that eBay tended to be on the lower end of the top ten rankings for these queries. It isn’t dropping out of the number one spot, apparently. Either way, this isn’t exactly good news for eBay sellers. Of course, it’s unlikely that Google was specifically targeting eBay with either update, and the site could certainly bounce back.

Have you noticed any specific types of sites (or specific sites) that have taken a noticeable hit? Do Google’s results look better in general? Let us know in the comments.

Image via Thinkstock

May 19 2014

Google Responds To Link Removal Overreaction

People continue to needlessly ask sites that have legitimately linked to theirs to remove links, because they’re afraid Google won’t like those links or because they simply want to be cautious about what Google may find questionable at any given time. With Google’s algorithms and manual penalty focuses changing on an ongoing basis, it’s hard to say what will get you in trouble with the search engine down the road. Guest blogging, for example, didn’t used to be much of a concern, but in recent months, Google has people freaking out about that.

Have you ever felt compelled to have a natural link removed? Let us know in the comments.

People take different views on specific types of links, whether they’re from guest blog posts, directories, or something else entirely, but things have become so bass-ackwards that people seek to have completely legitimate links to their sites removed. Natural links.

The topic is getting some attention once again thanks to a blog post from Jeremy Palmer called “Google is Breaking the Internet.” He talks about getting an email from a site his site linked to.

“In short, the email was a request to remove links from our site to their site,” he says. “We linked to this company on our own accord, with no prior solicitation, because we felt it would be useful to our site visitors, which is generally why people link to things on the Internet.”

“For the last 10 years, Google has been instilling and spreading irrational fear into webmasters,” he writes. “They’ve convinced site owners that any link, outside of a purely editorial link from an ‘authority site’, could be flagged as a bad link, and subject the site to ranking and/or index penalties. This fear, uncertainty and doubt (FUD) campaign has webmasters everywhere doing unnatural things, which is what Google claims they’re trying to stop.”

It’s true. We’ve seen similar emails, and perhaps you have too. A lot of sites have. Barry Schwartz at Search Engine Roundtable says he gets quite a few of them, and has just stopped responding.

It’s gotten so bad that people even ask StumbleUpon to remove links. You know, StumbleUpon – one of the biggest drivers of traffic on the web.

“We typically receive a few of these requests a week,” a spokesperson for the company told WebProNews last year. “We evaluate the links based on quality and if they don’t meet our user experience criteria we take them down. Since we drive a lot of traffic to sites all over the Web, we encourage all publishers to keep and add quality links to StumbleUpon. Our community votes on the content they like and don’t like so the best content is stumbled and shared more often while the less popular content is naturally seen less frequently.”

Palmer’s post made its way to Hacker News, and got the attention of a couple of Googlers, including Matt Cutts himself. It actually turned into quite a lengthy conversation. Cutts wrote:

Note that there are two different things to keep in mind when someone writes in and says “Hey, can you remove this link from your site?” Situation #1 is by far the most common. If a site gets dinged for linkspam and works to clean up their links, a lot of them send out a bunch of link removal requests on their own prerogative. Situation #2 is when Google actually sends a notice to a site for spamming links and gives a concrete link that we believe is part of the problem. For example, we might say “we believe site-a.com has a problem with spam or inorganic links. An example link is site-b.com/spammy-link.html.” The vast majority of the link removal requests that a typical site gets are for the first type, where a site got tagged for spamming links and now it’s trying hard to clean up any links that could be considered spammy.

He also shared a video discussion he recently had with Leo Laporte and Gina Trapani.

Cutts later said in the Hacker News thread, “It’s not a huge surprise that some sites which went way too far spamming for links will sometimes go overboard when it’s necessary to clean the spammy links up. The main thing I’d recommend for a site owner who gets a fairly large number of link removal requests is to ask ‘Do these requests indicate a larger issue with my site?’ For example, if you run a forum and it’s trivially easy for blackhat SEOs to register for your forum and drop a link on the user profile page, then that’s a loophole that you probably want to close. But if the links actually look organic to you or you’re confident that your site is high-quality or doesn’t have those sorts of loopholes, you can safely ignore these requests unless you’re feeling helpful.”

Side note: Cutts mentioned in the thread that Google hasn’t been using the disavow links tool as a reason not to trust a source site.

Googler Ryan Moulton weighed in on the link removal discussion in the thread, saying, “The most likely situation is that the company who sent the letter hired a shady SEO. That SEO did spammy things that got them penalized. They brought in a new SEO to clean up the mess, and that SEO is trying to undo all the damage the previous one caused. They are trying to remove every link they can find since they didn’t do the spamming in the first place and don’t know which are causing the problem.”

That’s a fair point that has gone largely overlooked. Either way, it is indeed clear that sites are overreacting in getting links removed from other sites. Natural links. Likewise, some sites are afraid to link out naturally for similar reasons.

After the big guest blogging bust of 2014, Econsultancy, a reasonably reputable digital marketing and ecommerce resource site, announced that it was adding nofollow to links in the bios of guest authors as part of a “safety first approach”. Keep in mind, they only accept high quality posts in the first place, and have strict guidelines. A minimal sketch of what that kind of change looks like in practice follows at the end of this article.

Econsultancy’s Chris Lake wrote at the time, “Google is worried about links in signatures. I guess that can be gamed, on less scrupulous blogs. It’s just that our editorial bar is very high, and all outbound links have to be there on merit, and justified. From a user experience perspective, links in signatures are entirely justifiable. I frequently check out writers in more detail, and wind up following people on the various social networks. But should these links pass on any linkjuice? It seems not, if you want to play it safe (and we do).”

Of course, Google is always talking about how important the user experience is.

Are people overreacting with link removals? Should the sites doing the linking respond to irrational removal requests? Share your thoughts in the comments.

Image via Twit.tv
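For readers curious what the “safety first” nofollow change mentioned above might look like under the hood, here is a minimal, hypothetical sketch in Python using the BeautifulSoup library. It is not Econsultancy’s actual implementation; the author-bio class name and the sample markup are made up for illustration.

```python
# Hypothetical sketch: add rel="nofollow" to outbound links inside a guest
# author bio block, leaving editorial links elsewhere in the post untouched.
# Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

html = """
<div class="post-body">
  <p>Great editorial content with a <a href="https://example.org/source">source link</a>.</p>
  <div class="author-bio">
    Jane Doe writes about ecommerce.
    <a href="http://example.com/janes-site">Visit her site</a> or
    <a href="https://twitter.com/janedoe">follow her on Twitter</a>.
  </div>
</div>
"""

soup = BeautifulSoup(html, "html.parser")

# Only links inside the bio block get the nofollow hint; the body link is left alone.
for bio in soup.find_all("div", class_="author-bio"):
    for link in bio.find_all("a"):
        link["rel"] = "nofollow"  # tells search engines not to pass link equity

print(soup.prettify())
```

The narrow scope is the point of the approach described above: only the guest author signature links are nofollowed, while links added on editorial merit in the body of the post continue to pass value as usual.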

May 12 2014

What Have Google’s Biggest Mistakes Been?

Do you feel like Google makes many mistakes when it comes to trying to improve its search results? Do you think it has gone overboard, or not far enough, with regard to some aspect of spam fighting? In the latest Google Webmaster Help video, head of webspam Matt Cutts talks about what he views as mistakes that he has made. He discusses two in particular, both of which involve things he thinks Google just didn’t address quickly enough: paid links and content farms.

What do you think is the biggest mistake Google has made? Share your thoughts in the comments.

The exact viewer-submitted question Cutts responds to is: “Was there a key moment in your spam fighting career where you made a mistake that you regret, related to spam?”

Cutts recalls, “I remember talking to a very well-known SEO at a search conference in San Jose probably seven years ago (give or take), and that SEO said, ‘You know what? Paid links are just too prevalent. They’re too common. There’s no way that you guys would be able to crack down on them, and enforce that, and come up with good algorithms or take manual action to sort of put the genie back in the bottle,’ as he put it. That was when I realized I’d made a mistake that we’d allowed paid links that pass PageRank to go a little bit too far and become a little bit too common on the web.”

“So in the early days of 2005, 2006, you’d see Google cracking down a lot more aggressively, and taking a pretty hard line on our rhetoric about paid links that pass PageRank,” he continues. “At this point, most people know that Google disapproves of it, it probably violates the Federal Trade Commission’s guidelines, all those sorts of things. We have algorithms to target it. We take spam reports about it, and so for the most part, people realize, it’s not a good idea, and if they do that, they might face the consequences, and so for the most part, people try to steer clear of paid links that pass PageRank at this point. But we probably waited too long before we started to take a strong stand on that particular issue.”

Yes, most people who engage in paid links are probably aware of Google’s stance on this. In most cases, gaming Google is probably the ultimate goal. That doesn’t mean they’re not doing it, though, and it also doesn’t mean Google is catching most of those who do. How would we know? We’re not going to hear about them unless they do get caught, but who’s to say there aren’t still many, many instances of paid links influencing search results as we speak?

The other mistake Cutts talks about will be fun for anyone who has ever been affected by the Panda update (referred to repeatedly as the “farmer” update in its early days).

Cutts continues, “Another mistake that I remember is there was a group of content farms, and we were getting some internal complaints where people said, ‘Look, this website or that website is really bad. It’s just poor quality stuff. I don’t know whether you’d call it spam or low-quality, but it’s a really horrible user experience.’ And I had been to one particular page on one of these sites because at one point my toilet was running, and I was like, ‘Ok, how do you diagnose a toilet running?’ and I had gotten a good answer from that particular page, and I think I might have over-generalized a little bit, and been like, ‘No, no. There’s lots of great quality content on some of these sites because look, here was this one page that helped solve the diagnostic of why does your toilet run, and how do you fix it, and all that sort of stuff.’”

“And the mistake that I made was judging from that one anecdote, and not doing larger scale samples and listening to the feedback, or looking at more pages on the site,” he continues. “And so I think it took us a little bit longer to realize that some of these lower-quality sites or content farms or whatever you want to call them were sort of mass-creating pages rather than really solving users’ needs with fantastic content. And so as a result, I think we did wake up to that, and started working on it months before it really became wide-scale in terms of complaints, but we probably could’ve been working on it even earlier.”

The complaints were pretty loud and frequent by the time the Panda update was first pushed, but it sounds like it could have been rolled out (and hurt more sites) a lot earlier than it eventually was. You have to wonder how that would have changed things. Would the outcome have been different if it had been pushed out months before it was?

“Regardless, we’re always looking for good feedback,” says Cutts. “We’re always looking for what are we missing? What do we need to do to make our web results better quality, and so anytime we roll something out, there’s always the question of, ‘Could you have thought of some way to stop that or to take better action or a more clever algorithm, and could you have done it sooner?’ I feel like Google does a lot of great work, and that’s very rewarding, and we feel like, ‘Okay, we have fulfilled our working hours with meaningful work,’ and yet at the same time, you always wonder could you be doing something better. Could you find a cleaner way to do it – a more elegant way to do it – something with higher precision – higher recall, and that’s okay. It’s healthy for us to be asking ourselves that.”

It’s been a while since Google pushed out any earth-shattering algorithm updates. Is there something Google is missing right now that Cutts will look back on, wondering why Google didn’t act earlier?

Would you say that Google’s results are better as a result of its actions against paid links and content farms? What do you think Google’s biggest mistake has been? Let us know in the comments.