Nov 6 2014

Is The Matt Cutts Era Over?

It’s not 100% clear yet, but it’s looking like for webmasters and SEOs, the era of Matt Cutts is a thing of the past. His career at Google may continue, but it doesn’t sound like he’ll be the head of webspam going forward. Would you like to see Matt Cutts return to the role he’s held for years, or do you look forward to change in the search department? Share your thoughts in the comments.

It’s a pretty interesting time in search right now. Matt Cutts, who has been the go-to guy for webmaster help and Q&A related to Google search for quite a few years, has been on leave from the company since July. Meanwhile, his counterpart over at Bing has been let go from his duties at Microsoft.

@DuaneForrester sending you good thoughts today. Thanks for providing info to so many people and tough love when needed. — Matt Cutts (@mattcutts) October 30, 2014

When Cutts announced his leave, he didn’t really make it sound like he wouldn’t be back, but rather like he would be taking a nice, long, much-deserved vacation. He wrote on his blog:

I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work. So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

Scheduled to be gone through October. See? Pretty much sounds like a vacation. As you know, October has since come and gone.
On October 31, Cutts provided another update, saying he was extending his leave, and wouldn’t be back at Google this year.

I'm planning to extend my leave into 2015: — Matt Cutts (@mattcutts) November 1, 2014

Ok, fine. Cutts has been at Google for fourteen years, and can probably take a considerable amount of time off with no problem. But he’d be back in the swing of things in the new year, right? Well, he might be back, but what he’ll be doing remains to be seen.

Cutts appeared on the web chat show This Week in Google, hosted by Leo Laporte, who asked him if he’ll go back to the same role, or if this is a chance for him to try something different. This part of the conversation starts at about 9 minutes and 50 seconds into the video below (h/t: Search Engine Roundtable).

“Well, I really have been impressed with how well everyone else on the team is doing, and it’s created a little bit of an opportunity for them to try new things, explore different stuff, you know, approach problems from a different way, and so we’ll have to see how it goes,” Cutts responded. “I loved the part of my job that dealt with keeping an eye on what important news was happening related to Google, but you know, it’s not clear that having me as a lightning rod, you know for, you know unhappy black hat SEOs or something is the best use of anybody’s time compared to working on other things that could be making the world better for Google or in general. So we’ll see how it all works.”

It doesn’t really sound like he intends to go back to the classic Matt Cutts role. In fact, later in the discussion, he referred to the initial leave as the “official” leave, implying that the one he’s now on is open-ended. Laporte asked him if he has the ability at the company to just do something different if he wants to.
He said, “The interesting thing is that at Google they try to get you and go do different projects, so the product managers, they encourage you to rotate every two or three years, and so it’s relatively rare to find people who have been around forever in a specific area. You’ll find Amit [Singhal] in search, Sridhar [Ramaswamy], you know, some of these people that are really, really senior, you know – higher ranking than me for sure – they do stick around in one area, but a lot of other people jump to different parts of the company to furnish different skills and try different things, which is a pretty good idea, I think.” Again, it sounds like he would really like to do something different within the company. He also reiterated his confidence in the current webspam team. On his “colleagues” (he prefers that term to “minions”), he said, “I just have so much admiration for you know, for example, last year, there was a real effort on child porn because of some stuff that happened in the United Kingdom, and a lot of people chipped in, and that is not an easy job at all. So you really have to think hard about how you’re gonna try to tackle this kind of thing.” Jeff Jarvis, who was also on the show, asked Cutts what other things interest him. Cutts responded, “Oh man, I was computer graphics and actually inertial trackers and accelerometers in grad school. At one point I said, you know, you could use commodity hardware, but as a grad student, you don’t have access to influence anybody’s minds, so why don’t I just go do something else for ten years, and somebody else will come up with all these sensors, and sure enough, you’ve got Kinect, you have the Wii, you know, the iPhone. Now everybody’s got a computer in their pocket that can do 3D sensing as long as write the computer programs well. So there’s all kinds of interesting stuff you could do.” Will we see Matt working on the Android team? 
As a matter of fact, Laporte followed that up by mentioning Andy Rubin – the guy who created Android and brought it to Google – leaving the company. News of that came out last week.

Matt later said, “I’ll always have a connection and soft spot for Google…” That’s a somewhat mysterious comment. I don’t want to put any words in the guy’s mouth, but to me, that sounds like he’s not married to the company for the long haul.

Either way, webmasters are already getting used to getting updates and helpful videos from Googlers like Pierre Far and John Mueller. We’ve already seen Google roll out new Panda and Penguin updates since Cutts has been on leave, and the SEO world hasn’t come crumbling down.

I’m guessing Cutts is getting less hate mail these days. He must have been getting tired of disgruntled website owners bashing him online all the time. It’s got to be nice not to have to deal with that.

As I said at the beginning of the article, it’s really not clear what Matt’s future holds, so all we can really do is listen to what he’s said, and look for him to update people further on his plans. In the meantime, if you miss him, you can peruse the countless webmaster videos and comments he’s made over the years that we’ve covered here.

Do you expect Matt Cutts to return to search in any capacity? Do you expect him to return to Google? Should he? Do you miss him already? Let us know what you think.

May 30 2014

Google’s Transparency Called Into Question Again

Though it’s back in Google’s results now, another company is making headlines for being penalized by Google. This time it’s Vivint, which produces smart thermostats, and competes with Nest, which Google acquired earlier this year.

PandoDaily’s James Robinson wrote an article about it, noting that Vivint had received warnings from Google about external links that didn’t comply with its quality guidelines, but didn’t confirm what the links were. Rather, the company was “left to fish in the dark to figure out what it had done to upset its rival.”

As Robinson correctly noted, Rap Genius was removed from Google’s search results last year for violating guidelines, and was back in business within two weeks. At the time, Google was accused by some of employing a double standard for letting the site recover so quickly compared to others.

Google’s Matt Cutts had some comments about the Pando article on Hacker News. He wrote:

It’s a shame that Pando’s inquiry didn’t make it to me, because the suggestion that Google took action on because it was somehow related to Nest is silly. As part of a crackdown on a spammy blog posting network, we took action on–along with hundreds of other sites at the same time that were attempting to spam search results. We took action on because it was spamming with low-quality or spam articles…

He listed several example links, and continued:

and a bunch more links, not to mention 25,000+ links from a site with a paid relationship where the links should have been nofollowed. When we took webspam action, we alerted Vivint via a notice in Webmaster Tools about unnatural links to their site. And when Vivint had done sufficient work to clean up the spammy links, we granted their reconsideration request. This had nothing whatsoever to do with Nest. The webspam team caught Vivint spamming. We held them (along with many other sites using the same spammy guest post network) accountable until they cleaned the spam up. That’s all.
He said later in the thread that Google “started dissecting” the guest blog posting network in question in November, noting that Google didn’t acquire Nest until January. In case you’re wondering when acquisition talks began, Cutts said, “You know Larry Page doesn’t have me on speed dial for companies he’s planning to buy, right? No one involved with this webspam action (including me) knew about the Nest acquisition before it was publicly announced.”

“Vivint was link spamming (and was caught by the webspam team for spamming) before Google even acquired Nest,” he said.

Robinson, in a follow-up article, takes issue with Cutts calling Pando’s reporting “silly,” and mockingly says Cutts “wants you to know Google is totally transparent.” Here’s an excerpt:

“It’s a shame that Pando’s inquiry didn’t make it to me,” Cutts writes, insinuating we didn’t contact the company for comment. Pando had in fact reached out to Google’s press team and consulted in detail with the company spokesperson who was quoted in our story. It is now clear why Google didn’t pass on our questions to Cutts.

He goes on to say that Cutts’ assessment of Vivint’s wrongdoing is “exactly what we described in our article — no one is disputing that Vivint violated Google’s search rules.” He also calls Cutts’ comments “a slightly simplistic version of events, given the months-long frustration Vivint spoke of in trying to fix the problem.” Robinson concludes the article:

The point of our reporting is to highlight the unusual severity of the punishment (locked out for months, completely delisted from results until this week) given Vivint’s relationship to a Google-owned company and the lack of transparency Google offers in assisting offending sites. Multiple sources at Vivint told us that the company was told that it had “unnatural links” but was left to guess at what these were, having to repeatedly cut content blindly and ask for reinstatement from Google, until it hit upon the magic recipe.
To these charges, Cutts has no answer. That’s a shame.

Now, I’m going to pull an excerpt from an article of my own from November, because it seems highly relevant here:

Many would say that Google has become more transparent over the years. It gives users, businesses and webmasters access to a lot more information about its intentions and business practices than it did long ago, but is it going far enough? When it comes to its search algorithm and changes to how it ranks content, Google has arguably scaled back a bit on the transparency over the past year or so. Google, as a company, certainly pushes the notion that it is transparent. Just last week, Google updated its Transparency Report for the eighth time, showing government requests for user information (which have doubled over three years, by the way). That’s one thing. For the average online business that relies on Internet visibility for customers, however, these updates are of little comfort.

A prime example of where Google has reduced its transparency is the monthly lists of algorithm changes it used to put out, but stopped. Cutts said the “world got bored” with those. Except it really didn’t, as far as we can tell.

Image via YouTube
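Cutts’ point about paid links that “should have been nofollowed” is easy to check mechanically on your own pages. Below is a minimal sketch – not anything Google publishes – using Python’s standard html.parser to list the outbound links on a page that still pass PageRank (i.e., that lack rel="nofollow"). The page snippet and URLs are made up for illustration.

```python
from html.parser import HTMLParser

class OutboundLinkAudit(HTMLParser):
    """Collect <a href> links that pass PageRank (i.e. lack rel='nofollow')."""

    def __init__(self):
        super().__init__()
        self.followed: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold several tokens, e.g. "nofollow sponsored"
        rel = (attrs.get("rel") or "").lower().split()
        if "href" in attrs and "nofollow" not in rel:
            self.followed.append(attrs["href"])

# Hypothetical page: one undisclosed paid link, one properly nofollowed one.
page = """
<a href="https://sponsor.example/offer">sponsored</a>
<a href="https://sponsor.example/other" rel="nofollow sponsored">disclosed</a>
"""

audit = OutboundLinkAudit()
audit.feed(page)
print(audit.followed)  # only the first link still passes PageRank
```

Running a pass like this over guest posts before they go live would have caught exactly the pattern Cutts describes.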

Mar 26 2014

PSA: The Topics You Include On Your Blog Must Please Google

It’s no secret now that Google has launched an attack against guest blogging. Since penalizing MyBlogGuest earlier this month, Google has predictably reignited the link removal hysteria. More people are getting manual penalties related to guest posts.

SEO Doc Sheldon got one specifically for running a single post that Google deemed not on-topic enough for his site, even though it was about marketing. Maybe there were more, but that’s the one Google pointed out. The message he received (via Search Engine Roundtable) was:

Google detected a pattern of unnatural, artificial, deceptive or manipulative outbound links on pages on this site. This may be the result of selling links that pass PageRank or participating in link schemes.

He shared this in an open letter to Matt Cutts, Eric Schmidt, Larry Page, Sergey Brin, et al. Cutts responded to that letter with this:

@DocSheldon what "Best Practices For Hispanic Social Networking" has to do with an SEO copywriting blog? Manual webspam notice was on point. — Matt Cutts (@mattcutts) March 24, 2014

To which Sheldon responded:

@mattcutts My blog is about SEO, marketing, social media, web dev…. I'd say it has everything to do – or I wouldn't have run it — DocSheldon (@DocSheldon) March 25, 2014

Perhaps that link removal craze isn’t so irrational after all. Irrational on Google’s part, perhaps, but who can really blame webmasters for giving in to Google’s pressure over what content they run on their own sites, when they rely on Google for traffic, and ultimately business?

@mattcutts So we can take this to mean that just that one link was the justification for a sitewide penalty? THAT sure sends a message! — DocSheldon (@DocSheldon) March 25, 2014

Here’s the article in question. Sheldon admitted it wasn’t the highest quality post in the world, but also added that it wasn’t totally without value, and noted that it wasn’t affected by the Panda update (which is supposed to handle the quality part algorithmically).
I have a feeling that link removal craze is going to be ramping up a lot more. Ann Smarty, who runs MyBlogGuest, weighed in on the conversation:

I don't have the words RT @DocSheldon @mattcutts one link was the justification for a sitewide penalty? THAT sure sends a message! — Ann Smarty (@seosmarty) March 25, 2014

Image via YouTube

Dec 4 2013

Matt Cutts Talks Content Stitching In New Video

Google has a new Webmaster Help video out about content that takes text from other sources. Specifically, Matt Cutts responds to this question: Hi Matt, can a site still do well in Google if I copy only a small portion of content from different websites and create my own article by combining it all, considering I will mention the source of that content (by giving their URLs in the article)? “Yahoo especially used to really hate this particular technique,” says Cutts. “They called it ‘stitching’. If it was like two or three sentences from one article, and two or three sentences from another article, and two or three sentences from another article, they really considered that spam. If all you’re doing is just taking quotes from everybody else, that’s probably not a lot of added value. So I would really ask yourself: are you doing this automatically? Why are you doing this? Why? People don’t just like to watch a clip show on TV. They like to see original content.” I don’t know. SportsCenter is pretty popular, and I don’t think it’s entirely for all the glowing commentary. It’s also interesting that he’s talking about this from Yahoo’s perspective. “They don’t just want to see an excerpt and one line, and then an excerpt and one line, and that sort of thing,” Cutts continues. “Now it is possible to pull together a lot of different sources, and generate something really nice, but you’re usually synthesizing. For example, Wikipedia will have stuff that’s notable about a particular topic, and they’ll have their sources noted, and they cite all of their sources there, and they synthesize a little bit, you know. It’s not like they’re just copying the text, but they’re sort of summarizing or presenting as neutral of a case as they can. 
That’s something that a lot of people really enjoy, and if that’s the sort of thing that you’re talking about, that would probably be fine, but if you’re just wholesale copying sections from individual articles, that’s probably going to be a higher risk area, and I might encourage you to avoid that if you can.” If you’re creating good content that serves a valid purpose for your users, my guess is that you’ll be fine, but you know Google hates anything automated when it comes to content.
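For what it’s worth, the kind of “stitching” Cutts describes – a few sentences lifted from each of several sources – is exactly what classic near-duplicate detection catches. Here’s a sketch in Python of w-shingling, a standard technique from the information-retrieval literature; it is not a description of Google’s (or Yahoo’s) actual system, and the sample texts are invented. It measures what fraction of an article’s word windows also appear in a source text:

```python
def shingles(text: str, w: int = 4) -> set[tuple[str, ...]]:
    """All w-word windows ('shingles') in the text, case-folded."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def copied_fraction(article: str, source: str, w: int = 4) -> float:
    """Fraction of the article's shingles that also appear in the source."""
    a = shingles(article, w)
    return len(a & shingles(source, w)) / len(a) if a else 0.0

source = "the quick brown fox jumps over the lazy dog near the river bank"
stitched = "as one writer put it the quick brown fox jumps over the lazy dog"

print(round(copied_fraction(stitched, source), 2))  # 0.55
```

A stitched article scores high against its sources even when the copied sentences are reordered or lightly framed, which is why quoting with attribution alone (as the question asker proposes) doesn’t add the “synthesis” Cutts is talking about.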

Oct 4 2012

Google Continues To Tinker With Freshness In Recent Algorithm Adjustments

Is Google getting close to where it wants to be in terms of how it handles freshness of content in search results? This has been one major area of focus for Google for the past year or so. Last November, Google launched the Freshness update, and since then, it has periodically been making various tweaks to how it handles different things related to freshness. Google has been releasing regular lists of algorithm changes it makes from month to month all year, and some of these lists have been quite heavy on the freshness factor.

On Thursday, Google released its lists for changes made in August and September. Somewhat surprisingly, “freshness” is only mentioned twice. Two changes were made (at least changes that Google is disclosing) under the “Freshness” project banner. We actually already discussed one of them in another article, as it is also related to how Google deals with domains (which Google seems to be focusing on more these days). That would be this list entry:

#83761. [project “Freshness”] This change helped you find the latest content from a given site when two or more documents from the same domain are relevant for a given search query.

That change was made in September. The other one was made in August:

Imadex. [project “Freshness”] This change updated handling of stale content and applies a more granular function based on document age.

Quite frankly, I’m not sure what you can really do with that information other than to consider the freshness of your content, in cases where freshness is relevant to quality. This is actually a topic Google’s Matt Cutts discussed in a Webmaster Help video released this week. “If you’re not in an area about news – you’re not in sort of a niche or topic area that really deserves a lot of fresh stuff, then that’s probably not something you need to worry about at all,” he said in the video.
I’ve been a fairly vocal critic of how Google has handled freshness, as I’ve found the signal to get in the way of the information I’m actually seeking far too often. Plenty of readers have agreed, but this is clearly an area where Google is still tinkering. Do you think Google is getting better at how it handles freshness? Feel free to share your thoughts in the comments.
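Google hasn’t said what its “more granular function based on document age” actually looks like. Purely as an illustration of the idea, here’s one common heuristic – exponential decay with a configurable half-life – sketched in Python. The function and the half-life value are my assumptions, not anything from the change log:

```python
def freshness_score(age_days: float, half_life_days: float = 30.0) -> float:
    """Exponential decay by document age: 1.0 when brand new, 0.5 at the half-life.

    Illustrative only -- Google's change log says a 'more granular function
    based on document age' exists, but not which function it is.
    """
    return 0.5 ** (age_days / half_life_days)

for age in (0, 30, 90):
    print(age, freshness_score(age))
```

The appeal of a shape like this is exactly the “granularity” the log entry hints at: instead of a binary fresh/stale cutoff, every extra day of age costs a document a little ranking weight, and topics that don’t deserve freshness can simply be given a very long half-life.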

Oct 3 2012

Google EMD Update Was Accompanied By At Least One Other Update

As you probably know by now, Google’s Matt Cutts announced an algorithm change on Friday – the EMD update. The change was designed to reduce low-quality exact match domains in search results. Cutts deemed the change “small” and tweeted about it as a “minor” weather report. Based on all of the complaints we’re seeing (you can read plenty of them in the comments of this article), it may not have been all that minor.

Cutts said that the change affects 0.6% of English-US queries to a noticeable degree, and noted that it was unrelated to Panda or Penguin. Still, based on all of these sites claiming to have been hit, you would think it was Panda or Penguin. Some webmasters claim to have been hit, but not necessarily on sites with exact match domains. So why would they have taken such a hit?

Well, it’s not news that Google launches various changes to its algorithm on a day to day basis. The company often gives the “over 500 a year” number. This time is no different. Search Engine Roundtable is pointing to a reply Cutts gave to one person on Twitter about the situation, where he noted that he knows of one change that was also released during the same timeframe as the EMD update. Here’s the exchange (with another interesting one about Google’s struggle with quality thrown in):

Matt Cutts (@mattcutts): Minor weather report: small upcoming Google algo change will reduce low-quality “exact-match” domains in search results.

George Fischer (@SeoLair): @mattcutts How are these sites ranking high currently if they are “low quality”? The EMD’s aren’t the real issue. #seo #google

Matt Cutts: @SeoLair all web sites and results exist on a continuum of quality. A big challenge in search is how/when to trade topicality vs. quality.

Gregory Smith (@GregrySmith): @mattcutts As an Authority Local SEO Company all sites we reviewed that was hit by EMD update was not EMD’s at all. Can you explain this?

Matt Cutts: @GregrySmith yes. Multiple algos are rolling out all the time. Likely those sites weren’t affected by EMD update but by another algo.

Gregory Smith: @mattcutts Thank you matt but all this happened during the past 3 days. Has another update happened during this time?

Matt Cutts: @GregrySmith yes. 500+ algo launches/year mean 1-2 a day. I know of at least one other algo rolling out over same timeframe for example.

While it’s not that interesting that Google launched another change at the same time as the EMD update (again, it’s common knowledge that Google pushes changes every day), it is interesting that so many people are complaining about being hit when the update Cutts tweeted about was said to be so small, and that many of those claiming to have been hit were not dealing with exact match domains. If another change had an impact as big as the EMD update’s – or bigger, or anywhere close to it – why wouldn’t Google announce that one?

Meanwhile, we’re still waiting on Google to be “transparent” about the changes it has made over the course of August and September, with its monthly (at least they used to be) lists. All of that, combined with new updates to Google’s Webmaster Guidelines, should be enough to keep webmasters busy for a bit.
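To make the “exact match domain” idea concrete: an EMD is a domain whose name is essentially the target query itself. Here’s a toy Python check – my own rough approximation for illustration, since Google never published how the EMD update identifies such domains, and the real signal clearly weighs quality as well:

```python
import re

def is_exact_match_domain(domain: str, query: str) -> bool:
    """True if the domain name is the query with spaces dropped or hyphenated.

    A deliberately crude approximation of the 'exact-match' notion.
    """
    name = domain.lower().split(".")[0]               # strip the TLD
    words = re.findall(r"[a-z0-9]+", query.lower())
    return name in ("".join(words), "-".join(words))

print(is_exact_match_domain("buycheapshoes.com", "buy cheap shoes"))    # True
print(is_exact_match_domain("buy-cheap-shoes.net", "buy cheap shoes"))  # True
print(is_exact_match_domain("zappos.com", "buy cheap shoes"))           # False
```

By this definition, many of the complaining sites in the comments wouldn’t qualify as EMDs at all – which is consistent with Cutts’ point that they were likely hit by a different algorithm rolling out over the same timeframe.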

May 11 2012

Google Penguin Update Recovery: Matt Cutts Says Watch These 2 Videos

Danny Sullivan at Search Engine Land put up a great Penguin article with some new quotes from Matt Cutts. We’ve referenced some of the points made in other articles, but one important thing to note from the whole thing is that Cutts pointed to two very specific videos that people should watch if they want to clean up their sites and recover from the Penguin update. We often share Google’s Webmaster Help videos, which feature Cutts giving advice based on user-submitted questions (or sometimes his own questions). I’m sure we’ve run these in the past, but according to Sullivan, these are the two Cutts pointed to.

Guess what: in both videos, he talks about Google’s quality guidelines. That is your recovery manual, as far as Google is concerned. Here are some articles we’ve posted recently specifically on different aspects of the guidelines:

Google Penguin Update: Don’t Forget About Duplicate Content
Google Penguin Update: A Lesson In Cloaking
Google Penguin Update Recovery: Hidden Text And Links
Recover From Google Penguin Update: Get Better At Links
Google Penguin Update: 12 Tips Directly From Google
Google Penguin Update Recovery: Getting Better At Keywords
Google Penguin Update: Seriously, Avoid Doorway Pages
Google Penguin Update And Affiliate Programs

So, in your recovery plan, take all of this into account, along with the tips that Cutts lent his seal of approval to. And when all else fails, according to Cutts, you might want to just start over with a new site.

May 2 2012

Google Penguin Update Recovery: Hidden Text And Links

There’s been a lot of discussion about Google’s Penguin update since it was launched. The update, if you haven’t been following, is about decreasing the rankings of sites that are violating Google’s quality guidelines. With that in mind, it seems like a good idea to take a good look at these guidelines. We’ve already posted articles about cloaking, links, and keyword stuffing.

The guidelines have been around for a long time, and Google has enforced them for just as long. In that regard, the Penguin update is nothing new. It’s just that Google thinks it has a new way to better enforce the guidelines. You should expect that Google will only continue to improve, so your best bet is to simply abide. That is, if you care about your Google rankings.

Another one of Google’s guidelines is: Avoid hidden text or hidden links. This means, don’t use white text on a white background. Don’t hide text behind images. Don’t use CSS to hide text. Don’t set the font size to 0. These are specific examples of what not to do, straight from Google’s help center page on the topic. When you do these things, Google deems your site untrustworthy, and will go out of its way not to rank your site well – it will most likely de-index it completely.

“If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages,” Google says. “When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site. Are any text or links there solely for search engines rather than visitors?”

Google defines hidden links as links that are only intended to be crawled by Googlebot, but are unreadable to humans, because they consist of hidden text, CSS has been used to make them as small as one pixel high, or they are hidden in a small character (such as a hyphen in the middle of a paragraph).
“If you’re using text to try to describe something search engines can’t access – for example, Javascript, images, or Flash files – remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either,” Google says. “Using descriptive text for these items will improve the accessibility of your site. You can test accessibility by turning off Javascript, Flash, and images in your browser, or by using a text-only browser such as Lynx.”

Here are a couple of relevant videos from Matt Cutts that you should probably watch if you have any questions about how Google handles hidden text. They’re short, so don’t worry.

Probably the best takeaway from Google’s advice on hidden text and links is that you should either remove them or make them more easily viewable (in the case that they’re actually relevant). As with other quality guidelines violations, you can submit a reconsideration request once you’re sure you’ve got everything in compliance. Google has tips for doing that too.

More Penguin Update coverage here.

Image: Batman TV Series (ABC)
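If you want to audit your own pages for the specific tricks named above, a few crude pattern checks over inline styles go a long way. This is an illustrative sketch, not a replica of Google’s detection (which is far more sophisticated, and also looks at external stylesheets, rendered layout, and intent); the patterns and sample markup are my own:

```python
import re

# Crude patterns matching the specific tricks Google's help page names:
# zero font sizes, CSS-hidden elements, and white-on-white text.
HIDDEN_PATTERNS = [
    r"font-size:\s*0(?:px)?\s*[;\"']",                           # font size zero
    r"display:\s*none",                                          # hidden outright
    r"visibility:\s*hidden",
    r"color:\s*#fff(?:fff)?[^\"']*background(?:-color)?:\s*#fff(?:fff)?",
]

def flag_hidden_text(html: str) -> list[str]:
    """Return the patterns found in inline styles -- a heuristic, not a verdict."""
    return [p for p in HIDDEN_PATTERNS if re.search(p, html, re.IGNORECASE)]

sample = '<p style="font-size:0">buy cheap widgets widgets widgets</p>'
print(flag_hidden_text(sample))
```

A pass like this only catches the naive cases – which is rather the point of the guideline: if a script this simple can flag it, Google certainly can.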

Apr 26 2012

Google Penguin Update: The New Name For The WebSpam Update

Here we go. Get ready for a barrage of Penguin articles to complement the Panda articles, just as the Penguin update complements the Panda update in bringing quality to Google’s search results (or at least trying to). Yes, the Webspam Update has reportedly been named the Penguin Update.

According to Danny Sullivan, whose word is pretty credible within the search industry, Google has officially named the Webspam Update the Penguin update. Sullivan had previously reported that Google’s Matt Cutts specifically called it the Webspam algorithm update, but has now altered his article, saying Google is officially calling it the Penguin update. Matt Cutts tweeted this Instagram photo (why no Google+?), which would seem to confirm the name.

At least it will be easier to find stock images of penguins (as opposed to webspam) for future articles. And it’s better than the “viagra update” (arguably).

More coverage on the algorithm (and not the silly name) here:

Webspam And Panda Updates: Does SEO Still Matter?
Google Webspam Algorithm Update Draws Mixed Reviews From Users
Google Webspam Update: Where’s The Viagra? [Updated]
Google Webspam Update: “Make Money Online” Query Yields Less Than Quality Result
Google Webspam Update: Losers & Winners, According To Searchmetrics [Updated]
How Much Of Google’s Webspam Efforts Come From These Patents?
Google Panda Update: Data Refresh Hit Last Week

Apr 25 2012

Matt Cutts Talks About How Google Handles Ajax

Google’s Matt Cutts put up a new Webmaster Help video, discussing how Google deals with Ajax. He takes on the following user-submitted question: How effective is Google now at handling content supplied via Ajax, is this likely to improve in the future? “Well, let me take Ajax, which is Asynchronous Javascript, and make it just Javascript for the time being,” says Cutts. “Google is getting more effective over time, so we actually have the ability not just to scan in strings of Javascript to look for URLs, but to actually process some of the Javascript. And so that can help us improve our crawl coverage quite a bit, especially if people use Javascript to help with navigation or drop-downs or those kinds of things. So Asynchronous Javascript is a little bit more complicated, and that’s maybe further down the road, but the common case is Javascript.” “And we’re getting better, and we’re continuing to improve how well we’re able to process Javascript,” he continues. “In fact, let me just take a little bit of time and mention, if you block Javascript or CSS in your robots.txt, where Googlebot can’t crawl it, I would change that. I would recommend making it so that Googlebot can crawl the Javascript and can crawl the CSS, because that makes it a lot easier for us to figure out what’s going on if we’re processing the Javascript or if we’re seeing and able to process and get a better idea of what the page is like.” As a matter of fact, Cutts actually put out a separate video about this last month, in which he said, “If you block Googlebot from crawling javascript or CSS, please take a few minutes and take that out of the robots.txt and let us crawl the javascript. 
Let us crawl the CSS, and get a better idea of what’s going on on the page.” “So I absolutely would recommend trying to check through your robots.txt, and if you have disallow slash Javascript, or star JS, or star CS, go ahead and remove that, because that helps Googlebot get a better idea of what’s going on on the page,” he reiterates in the new video. In another new video , Cutts talks about why Google won’t remove pages from its index at your request.
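You can verify Cutts’ advice against your own robots.txt with Python’s standard urllib.robotparser before touching anything. The robots.txt content and URLs below are hypothetical examples of exactly the disallow rules he’s telling you to remove:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking script and stylesheet directories --
# the pattern Cutts recommends taking out.
ROBOTS_TXT = """\
User-agent: *
Disallow: /js/
Disallow: /css/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for path in ("/js/nav.js", "/css/site.css", "/index.html"):
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"{path}: {'crawlable' if allowed else 'blocked'}")
```

For a live site you would call parser.set_url("https://example.com/robots.txt") and parser.read() instead of parsing a string; if the script reports your Javascript or CSS paths as blocked, that’s the disallow line to delete.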