Jul 7 2014

Matt Cutts Is Disappearing For A While

Just ahead of the holiday weekend, Google’s head of webspam Matt Cutts announced that he is taking leave from Google through at least October, which means we shouldn’t be hearing from him (at least about Google) for three months or so. That’s a significant stretch when you consider how frequently Google makes announcements and changes things up. Is the SEO industry ready for three Matt Cutts-less months?

Cutts explains on his personal blog:

I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work. So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

He says he won’t be checking his work email at all while he’s on leave, but will have some of his outside email forwarded to “a small set of webspam folks,” noting that they won’t be replying.

Cutts is a frequent Twitter user, and didn’t say whether or not he’ll be staying off there, but either way, I wouldn’t expect him to tweet much about search during his leave. If you need to reach Google on a matter you would typically have taken to Matt Cutts, he suggests webmaster forums, Office Hours Hangouts, the Webmaster Central Twitter account, the Google Webmasters Google+ account, or trying other Googlers.

He did recently pin this tweet from 2010 to the top of his timeline:

When you've got 5 minutes to fill, Twitter is a great way to fill 35 minutes. — Matt Cutts (@mattcutts) May 11, 2010

So far, he hasn’t stopped tweeting, but his latest – from six hours ago – is just about his leave:

I got my inbox down to zero for a shiny moment, then unpinned and closed the tab with work email: http://t.co/o7zBOvskBE — Matt Cutts (@mattcutts) July 7, 2014

That would seem to suggest he doesn’t plan to waste much of his time off on Twitter.

So what will Matt be doing while he’s gone? Taking a ballroom dance class with his wife, trying a half-Ironman race, and going on a cruise. He says they might also do some additional traveling ahead of their fifteen-year wedding anniversary, and will spend more time with their parents.

Long story short, leave Cutts alone. He’s busy.

Image via YouTube

May 30 2014

Google’s Transparency Called Into Question Again

Though it’s back in Google’s results now, another company is making headlines for being penalized by Google. This time it’s Vivint, which produces smart thermostats and competes with Nest, which Google acquired earlier this year.

PandoDaily’s James Robinson wrote an article about it, noting that Vivint had received warnings from Google about external links that didn’t comply with its quality guidelines, but that Google didn’t confirm what the links were. Rather, the company was “left to fish in the dark to figure out what it had done to upset its rival.”

As Robinson correctly noted, Rap Genius was removed from Google’s search results last year for violating guidelines, and was back in business within two weeks. At the time, Google was accused by some of employing a double standard for letting the site recover so quickly compared to others.

Google’s Matt Cutts had some comments about the Pando article on Hacker News. He wrote:

It’s a shame that Pando’s inquiry didn’t make it to me, because the suggestion that Google took action on vivint.com because it was somehow related to Nest is silly. As part of a crackdown on a spammy blog posting network, we took action on vivint.com–along with hundreds of other sites at the same time that were attempting to spam search results. We took action on vivint.com because it was spamming with low-quality or spam articles…

He listed several example links, and continued:

and a bunch more links, not to mention 25,000+ links from a site with a paid relationship where the links should have been nofollowed. When we took webspam action, we alerted Vivint via a notice in Webmaster Tools about unnatural links to their site. And when Vivint had done sufficient work to clean up the spammy links, we granted their reconsideration request. This had nothing whatsoever to do with Nest. The webspam team caught Vivint spamming. We held them (along with many other sites using the same spammy guest post network) accountable until they cleaned the spam up. That’s all.

He said later in the thread that Google “started dissecting” the guest blog posting network in question in November, noting that Google didn’t acquire Nest until January. In case you’re wondering when acquisition talks began, Cutts said, “You know Larry Page doesn’t have me on speed dial for companies he’s planning to buy, right? No one involved with this webspam action (including me) knew about the Nest acquisition before it was publicly announced.”

“Vivint was link spamming (and was caught by the webspam team for spamming) before Google even acquired Nest,” he said.

Robinson, in a follow-up article, takes issue with Cutts calling Pando’s reporting “silly,” and mockingly says Cutts “wants you to know Google is totally transparent.” Here’s an excerpt:

“It’s a shame that Pando’s inquiry didn’t make it to me,” Cutts writes, insinuating we didn’t contact the company for comment. Pando had in fact reached out to Google’s press team and consulted in detail with the company spokesperson who was quoted in our story. It is now clear why Google didn’t pass on our questions to Cutts.

He goes on to say that Cutts’ assessment of Vivint’s wrongdoing is “exactly what we described in our article — no one is disputing that Vivint violated Google’s search rules.” He also calls Cutts’ comments “a slightly simplistic version of events, given the months-long frustration Vivint spoke of in trying to fix the problem.” Robinson concludes the article:

The point of our reporting is to highlight the unusual severity of the punishment (locked out for months, completely delisted from results until this week) given Vivint’s relationship to a Google-owned company and the lack of transparency Google offers in assisting offending sites. Multiple sources at Vivint told us that the company was told that it had “unnatural links” but was left to guess at what these were, having to repeatedly cut content blindly and ask for reinstatement from Google, until it hit upon the magic recipe. To these charges, Cutts has no answer. That’s a shame.

Now, I’m going to pull an excerpt from an article of my own from November because it seems highly relevant here:

Many would say that Google has become more transparent over the years. It gives users, businesses and webmasters access to a lot more information about its intentions and business practices than it did long ago, but is it going far enough? When it comes to its search algorithm and changes to how it ranks content, Google has arguably scaled back a bit on the transparency over the past year or so. Google, as a company, certainly pushes the notion that it is transparent. Just last week, Google updated its Transparency Report for the eighth time, showing government requests for user information (which have doubled over three years, by the way). That’s one thing. For the average online business that relies on Internet visibility for customers, however, these updates are of little comfort.

A prime example of where Google has reduced its transparency is the monthly lists of algorithm changes it used to put out, but stopped. Cutts said the “world got bored” with those. Except it really didn’t, as far as we can tell.

Image via YouTube

May 19 2014

Google Responds To Link Removal Overreaction

People continue to needlessly ask sites that have legitimately linked to theirs to remove links, because they’re afraid Google won’t like these links or because they simply want to be cautious about what Google may find questionable at any given time. With Google’s algorithms and manual penalty focuses changing on an ongoing basis, it’s hard to say what will get you in trouble with the search engine down the road. Guest blogging, for example, wasn’t much of a concern until recent months, when Google got people freaking out about it. Have you ever felt compelled to have a natural link removed? Let us know in the comments.

People take different views on specific types of links, whether they’re from guest blog posts, directories, or something else entirely, but things have become so bass ackwards that people seek to have completely legitimate links to their sites removed. Natural links.

The topic is getting some attention once again thanks to a blog post from Jeremy Palmer called “Google is Breaking the Internet.” He talks about getting an email from a site his site linked to. “In short, the email was a request to remove links from our site to their site,” he says. “We linked to this company on our own accord, with no prior solicitation, because we felt it would be useful to our site visitors, which is generally why people link to things on the Internet.”

“For the last 10 years, Google has been instilling and spreading irrational fear into webmasters,” he writes. “They’ve convinced site owners that any link, outside of a purely editorial link from an ‘authority site’, could be flagged as a bad link, and subject the site to ranking and/or index penalties. This fear, uncertainty and doubt (FUD) campaign has webmasters everywhere doing unnatural things, which is what Google claims they’re trying to stop.”

It’s true. We’ve seen similar emails, and perhaps you have too. A lot of sites have. Barry Schwartz at Search Engine Roundtable says he gets quite a few of them, and has just stopped responding.

It’s gotten so bad that people even ask StumbleUpon to remove links. You know, StumbleUpon – one of the biggest drivers of traffic on the web. “We typically receive a few of these requests a week,” a spokesperson for the company told WebProNews last year. “We evaluate the links based on quality and if they don’t meet our user experience criteria we take them down. Since we drive a lot of traffic to sites all over the Web, we encourage all publishers to keep and add quality links to StumbleUpon. Our community votes on the content they like and don’t like so the best content is stumbled and shared more often while the less popular content is naturally seen less frequently.”

Palmer’s post made its way to Hacker News, and got the attention of a couple of Googlers, including Matt Cutts himself. It actually turned into quite a lengthy conversation. Cutts wrote:

Note that there are two different things to keep in mind when someone writes in and says “Hey, can you remove this link from your site?” Situation #1 is by far the most common. If a site gets dinged for linkspam and works to clean up their links, a lot of them send out a bunch of link removal requests on their own prerogative. Situation #2 is when Google actually sends a notice to a site for spamming links and gives a concrete link that we believe is part of the problem. For example, we might say “we believe site-a.com has a problem with spam or inorganic links. An example link is site-b.com/spammy-link.html.” The vast majority of the link removal requests that a typical site gets are for the first type, where a site got tagged for spamming links and now it’s trying hard to clean up any links that could be considered spammy.

He also shared this video discussion he recently had with Leo Laporte and Gina Trapani.

Cutts later said in the Hacker News thread, “It’s not a huge surprise that some sites which went way too far spamming for links will sometimes go overboard when it’s necessary to clean the spammy links up. The main thing I’d recommend for a site owner who gets a fairly large number of link removal requests is to ask ‘Do these requests indicate a larger issue with my site?’ For example, if you run a forum and it’s trivially easy for blackhat SEOs to register for your forum and drop a link on the user profile page, then that’s a loophole that you probably want to close. But if the links actually look organic to you or you’re confident that your site is high-quality or doesn’t have those sorts of loopholes, you can safely ignore these requests unless you’re feeling helpful.”

Side note: Cutts mentioned in the thread that Google hasn’t been using the disavow links tool as a reason not to trust a source site.

Googler Ryan Moulton weighed in on the link removal discussion in the thread, saying, “The most likely situation is that the company who sent the letter hired a shady SEO. That SEO did spammy things that got them penalized. They brought in a new SEO to clean up the mess, and that SEO is trying to undo all the damage the previous one caused. They are trying to remove every link they can find since they didn’t do the spamming in the first place and don’t know which are causing the problem.”

That’s a fair point that has gone largely overlooked. Either way, it is indeed clear that sites are overreacting in getting links removed from sites. Natural links. Likewise, some sites are afraid to link out naturally for similar reasons.

After the big guest blogging bust of 2014, Econsultancy, a reasonably reputable digital marketing and ecommerce resource site, announced that it was adding nofollow to links in the bios of guest authors as part of a “safety first approach” (see the markup sketch at the end of this article). Keep in mind, they only accept high-quality posts in the first place, and have strict guidelines. Econsultancy’s Chris Lake wrote at the time, “Google is worried about links in signatures. I guess that can be gamed, on less scrupulous blogs. It’s just that our editorial bar is very high, and all outbound links have to be there on merit, and justified. From a user experience perspective, links in signatures are entirely justifiable. I frequently check out writers in more detail, and wind up following people on the various social networks. But should these links pass on any linkjuice? It seems not, if you want to play it safe (and we do).” Of course, Google is always talking about how important the user experience is.

Are people overreacting with link removals? Should the sites doing the linking respond to irrational removal requests? Share your thoughts in the comments.
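Here’s roughly what Econsultancy’s “safety first” treatment might look like in markup – a minimal, hypothetical sketch (the author name, URL, and bio text are invented for illustration):

<!-- Hypothetical guest author bio at the end of a post -->
<div class="author-bio">
  <p>Jane Doe is a digital marketing consultant. Find her at
    <!-- rel="nofollow" tells Google not to pass PageRank through this link -->
    <a href="http://www.example.com/" rel="nofollow">example.com</a>.
  </p>
</div>

Readers can still click through, so the user experience Lake describes is preserved; the link simply passes no link juice.

Image via Twit.tv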

Mar 26 2014

PSA: The Topics You Include On Your Blog Must Please Google

It’s no secret now that Google has launched an attack against guest blogging. Since penalizing MyBlogGuest earlier this month, Google has predictably reignited the link removal hysteria. More people are getting manual penalties related to guest posts.

SEO Doc Sheldon got one specifically for running a single post that Google deemed not to be on-topic enough for his site, even though it was about marketing. Maybe there were more, but that’s the one Google pointed out. The message he received (via Search Engine Roundtable) was:

Google detected a pattern of unnatural, artificial, deceptive or manipulative outbound links on pages on this site. This may be the result of selling links that pass PageRank or participating in link schemes.

He shared this in an open letter to Matt Cutts, Eric Schmidt, Larry Page, Sergey Brin, et al. Cutts responded to that letter with this:

@DocSheldon what "Best Practices For Hispanic Social Networking" has to do with an SEO copywriting blog? Manual webspam notice was on point. — Matt Cutts (@mattcutts) March 24, 2014

To which Sheldon responded:

@mattcutts My blog is about SEO, marketing, social media, web dev…. I'd say it has everything to do – or I wouldn't have run it — DocSheldon (@DocSheldon) March 25, 2014

Perhaps that link removal craze isn’t so irrational after all. Irrational on Google’s part, perhaps, but who can really blame webmasters for bowing to Google’s pressure over what content they run on their sites when they rely on Google for traffic and, ultimately, business?

@mattcutts So we can take this to mean that just that one link was the justification for a sitewide penalty? THAT sure sends a message! — DocSheldon (@DocSheldon) March 25, 2014

Here’s the article in question. Sheldon admitted it wasn’t the highest quality post in the world, but also added that it wasn’t totally without value, and noted that it wasn’t affected by the Panda update (which is supposed to handle the quality part algorithmically).

I have a feeling that link removal craze is going to be ramping up a lot more. Ann Smarty, who runs MyBlogGuest, weighed in on the conversation:

I don't have the words RT @DocSheldon @mattcutts one link was the justification for a sitewide penalty? THAT sure sends a message! — Ann Smarty (@seosmarty) March 25, 2014

Image via YouTube

Mar 19 2014

Google Takes Action On Guest Blogging

Google has been warning webmasters about spammy guest blogging for quite a while, but now the search engine is getting serious. Head of webspam Matt Cutts tweeted early this morning that Google has taken action on a large guest blog network, and reminded people about “the spam risks of guest blogging”:

Today we took action on a large guest blog network. A reminder about the spam risks of guest blogging: http://t.co/rc9O82fjfn — Matt Cutts (@mattcutts) March 19, 2014

That link points to a post from January on Matt’s personal blog where he proclaimed that “guest blogging is done.” He later clarified that he meant guest blogging specifically for SEO.

He didn’t specify which network Google just took action on, but Pushfire CEO Rae Hoffman suggested that MyBlogGuest appears to be the “winner”:

It looks like MyBlogGuest was the "winner" – not appearing on branded terms RT @mattcutts Today we took action on a large guest blog network — Rae Hoffman (@sugarrae) March 19, 2014

@gcharlton @mattcutts @patrickaltoft examples as in? "branded terms" – they ranked for their name yesterday, they don't today… — Rae Hoffman (@sugarrae) March 19, 2014

Still, from where we’re sitting, the site is in the top three for its name, appearing only under its own Twitter and Facebook pages. There has been no update from MyBlogGuest on the topic so far this morning.

Update: MyBlogGuest owner Ann Smarty has confirmed that the network has been penalized:

[Official] Even though #myblogguest has been against paying for links (unlike other platforms), @mattcutts team decided to penalize us… — Ann Smarty (@seosmarty) March 19, 2014

I don’t think our publishers will be penalized, but let’s ask @mattcutts — Ann Smarty (@seosmarty) March 19, 2014

The site promises on its homepage, “We don’t allow in any way to manipulate Google Rankings or break any Google rules.” It does promise bloggers a way to build links, which everyone knows are a key signal in Google’s ranking algorithm (Cutts recently said links are still “super important”).

Barry Schwartz at Search Engine Land points out that Smarty wrote a blog post after Cutts’ January post, saying her network wouldn’t nofollow links, so it does seem like a likely target. She wrote:

MyBlogGuest is NOT going to allow nofollow links or paid guest blogging (even though Matt Cutts seems to be forcing us to for whatever reason). Instead we will keep promoting the pure and authentic guest blogging concept we believe in.

She went on to note that she is an SEO who stopped depending on organic rankings a long time ago. “I believe in the Internet and its ability of giving little people (like myself) the power of being heard. I can say, I don’t care about Google,” she wrote. “I don’t think Google is THE Internet.”

She’s right, and one can’t help but admire her attitude, but one also can’t help but wonder how many of those utilizing the network share it. It stands to reason that Google is going to go after more of these networks the way it has been doing with other link networks. Google isn’t the Internet, but how many of the people spending time and effort writing guest blog posts are depending on it?

Update: Apparently Smarty does care about Google. Bill Hartzer writes that she told him before Cutts made the announcement, “I really hope that they don’t target MyBlogGuest. There are other guest blogging networks that should be targeted, such as PostJoint, a paid guest blogging network. MyBlogGuest is not a paid network.”

Image via YouTube

Mar 12 2014

Matt Cutts On How To Get Google To Recognize Your Mobile Pages

Google has a new “Webmaster Help” video out. This time Matt Cutts discusses optimizing for the mobile web. Specifically, he takes on this submitted question:

Is there a way to tell Google there is a mobile version of a page, so it can show the alternate page in mobile search results? Or similarly, that a page is responsive and the same URL works on mobile?

Cutts says this is a very popular question. Google has plenty of information on the subject out there already, but obviously people still aren’t quite grasping it.

“Don’t block JavaScript and CSS. That actually makes a pretty big difference,” he says. “If we’re able to fetch the JavaScript and CSS, we can basically try to figure out whether it’s responsive design on our side. So my advice – and I’m going to keep hitting this over and over and over again – is never block JavaScript and CSS. Go ahead and let Googlebot fetch those, interpret those, figure out whether a site is responsive, and do all sorts of other things like executing or rendering JavaScript to find new links, and being able to crawl your website better.”

“The other way to do it is to have one version of the page for desktop and another version of the page for regular mobile smartphone users, and to have separate URLs,” he continues. “So how do you handle that case correctly? Well, first off, you want to make sure that on your desktop page, you do a rel=alternate that points to the smartphone version of your page. That basically lets Google know, ‘Yes, these two versions of the same page are related to each other, because this is the smartphone version, and this is the desktop version.’ Likewise, on the smartphone version, you want to do a rel=canonical to the desktop version. What that does is it tells Googlebot, ‘Hey, even though this is its own separate URL – while the content is the same – it should really be glommed together with the desktop version.’ And so as long as you have those bi-directional links (a rel=alternate pointing from the desktop to the smartphone and a rel=canonical pointing from the smartphone to the desktop), then Google will be able to suss out the difference between those, and be able to return the correct version to the correct user.”

“Now there’s one other thing to bear in mind, which is that you can also just make sure that you redirect any smartphone agents from the desktop version to the smartphone version,” Cutts adds. “So we look for that. If we crawl with Googlebot Mobile, and we see that we get redirected to a separate URL, then we start to interpret, and say, ‘Ah, it looks like most people are doing that – where they have a desktop version of the page, a smartphone user agent comes in, and they get redirected to a different URL.’ Of course, the thing to bear in mind is – just like earlier, we said not to block JavaScript and CSS – one common mistake that we see is blocking Googlebot Mobile or blocking Googlebot whenever it tries to fetch the smartphone version of the page. Don’t block Googlebot in either case. Just make sure that you return the correct things, and treat Googlebot Mobile like you would treat a smartphone user agent, and treat Googlebot (regular) just like you would treat a desktop user.”

As long as you follow these best practices, he says, Google will figure it out. The video page points to this page on building smartphone-optimized websites, which includes an overview of Google’s recommendations, common mistakes, and more info about various elements of the subject.
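To make the separate-URLs setup concrete, here is a minimal sketch of the bidirectional annotations Cutts describes, using hypothetical example.com URLs (the media attribute value follows Google’s published recommendation for smartphone-targeted pages):

<!-- On the desktop page, http://www.example.com/page-1 -->
<!-- rel=alternate points Google to the smartphone version of this URL -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1">

<!-- On the smartphone page, http://m.example.com/page-1 -->
<!-- rel=canonical tells Google to treat both URLs as one page, with the desktop URL as canonical -->
<link rel="canonical" href="http://www.example.com/page-1">

With those two tags in place, plus user-agent redirects that don’t block Googlebot or Googlebot Mobile, Google can pair the two URLs and serve the right one to each searcher.

Image via YouTube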

Mar 11 2014

Google Launches Official Google+ Page For Webmasters

There is now an official Google+ page for Google Webmasters. Matt Cutts tweeted a link to it, and the page made its first post last night.

So far, that’s all it has to offer, but we can probably expect similar (if not the same) posts as what we see from the Google Webmasters Twitter account (@googlewmc): links to Webmaster Central blog posts, links to Google Webmaster Help videos, and various other updates that webmasters should know about. If you’re a Google+ junkie, you now have another way to keep up with all this stuff.

The Google+ page only has 3,700 followers so far. That’s compared to the 111,000 on Twitter.

Image via Google+

Mar 6 2014

Now There’s A Matt Cutts Whack-A-Mole Game

So remember that Matt Cutts Donkey Kong game from a few weeks ago? That has inspired a new Matt Cutts Whack-A-Mole game, in which (you guessed it) you get to whack Matt Cutts with a gavel over and over again.

Jan 10 2014

Google Tweaks Guidance On Link Schemes

Google has made a subtle but noteworthy change to its help center article on link schemes, which is part of its quality guidelines dissuading webmasters from engaging in spammy SEO tactics.

Google put out a video last summer about adding rel=”nofollow” to links that are included in widgets. In that, Matt Cutts, Google’s head of webspam, said, “I would not rely on widgets and infographics as your primary way to gather links, and I would recommend putting a nofollow, especially on widgets, because most people, when they just copy and paste a segment of code, don’t realize what all is going on with that, and it’s usually not as much of an editorial choice, because they might not see the links that are embedded in that widget.”

“Depending on the scale of the stuff that you’re doing with infographics, you might consider putting a rel nofollow on infographic links as well,” he continued. “The value of those things might be branding. They might be to drive traffic. They might be to sort of let people know that your site or your service exists, but I wouldn’t expect a link from a widget to necessarily carry the same weight as an editorial link freely given where someone is recommending something and talking about it in a blog post. That sort of thing.”

In Google’s guidance on link schemes, it gives “common examples of unnatural links that may violate our guidelines.” It used to include: “Links embedded in widgets that are distributed across various sites.” As Search Engine Land brings to our attention, that part now reads: “Keyword-rich, hidden or low-quality links embedded in widgets that are distributed across various sites.”

That’s a little more specific, and seems to indicate that the previous guidance cast a broader net over such links than what Google really frowns upon. That’s worth noting. You’d do well to pay attention to what Google thinks about link schemes, as the search engine has made a big point of cracking down on them lately (even if some have gotten off lightly).
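For illustration, here is roughly what the nofollow treatment Cutts recommends might look like in a widget’s embed code – a hypothetical snippet, not taken from any real widget:

<!-- Hypothetical embed code a site owner copies and pastes onto their page -->
<script src="http://widgets.example.com/weather-widget.js"></script>
<!-- The credit link carries rel="nofollow", so it passes no PageRank -->
<a href="http://www.example.com/" rel="nofollow">Weather widget by Example.com</a>

Under the revised guideline, it’s a keyword-rich or hidden version of that credit link, left followed, that would run afoul of Google; a plain, nofollowed credit like the one above should not.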

Dec 20 2013

Google Says It’s Now Working To ‘Promote Good Guys’

Google’s Matt Cutts says Google is “now doing work on how to promote good guys.” More specifically, Google is working on changes to its algorithm that will make it better at promoting content from people it considers authoritative on certain subjects.

You may recall earlier this year when Cutts put out the following video talking about things Google would be working on this year. In that, he said, “We have also been working on a lot of ways to help regular webmasters. We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”

Apparently that’s something Google is working on right now. Cutts appeared in a “This Week In Google” video (via Search Engine Land / transcript via Craig Moore) in which he said:

We have been working on a lot of different stuff. We are actually now doing work on how to promote good guys. So if you are an authority in a space, if you search for podcasts, you want to return something like Twit.tv. So we are trying to figure out who are the authorities in the individual little topic areas and then how do we make sure those sites show up, for medical, or shopping or travel or any one of thousands of other topics. That is to be done algorithmically, not by humans… So PageRank is sort of this global importance. The New York Times is important, so if they link to you then you must also be important. But you can start to drill down in individual topic areas and say, okay, if Jeff Jarvis (professor of journalism) links to me, he is an expert in journalism and so therefore I might be a little bit more relevant in the journalistic field. We’re trying to measure those kinds of topics. Because you know you really want to listen to the experts in each area if you can.

For quite a while now, authorship has given Google an important signal about individuals as they relate to the content they’re putting out. Interestingly, Google is scaling authorship back a bit.

Image: YouTube