Nov 6 2014

Is The Matt Cutts Era Over?

It’s not 100% clear yet, but it’s looking like for webmasters and SEOs, the era of Matt Cutts is a thing of the past. His career at Google may continue, but it doesn’t sound like he’ll be the head of webspam going forward. Would you like to see Matt Cutts return to the role he’s held for years, or do you look forward to change in the search department? Share your thoughts in the comments.

It’s a pretty interesting time in search right now. Matt Cutts, who has been the go-to guy for webmaster help and Q&A related to Google search for quite a few years, has been on leave from the company since July. Meanwhile, his counterpart over at Bing has been let go from his duties at Microsoft.

@DuaneForrester sending you good thoughts today. Thanks for providing info to so many people and tough love when needed. — Matt Cutts (@mattcutts) October 30, 2014

When Cutts announced his leave, he didn’t really make it sound like he wouldn’t be back, but rather like he would be taking a nice, long, much-deserved vacation. He wrote on his blog:

I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work. So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

Scheduled to be gone through October. See? Pretty much sounds like a vacation. As you know, October has since come and gone.
On October 31, Cutts provided another update, saying he was extending his leave, and wouldn’t be back at Google this year.

I'm planning to extend my leave into 2015: — Matt Cutts (@mattcutts) November 1, 2014

Ok, fine. Cutts has been at Google for fourteen years, and can probably take a considerable amount of time off with no problem. But he’d be back in the swing of things in the new year, right? Well, he might be back, but what he’ll be doing remains to be seen.

Cutts appeared on the web chat show This Week in Google, hosted by Leo Laporte, who asked him if he’ll go back to the same role, or if this is a chance for him to try something different. This part of the conversation starts at about 9 minutes and 50 seconds into the video below (h/t: Search Engine Roundtable).

“Well, I really have been impressed with how well everyone else on the team is doing, and it’s created a little bit of an opportunity for them to try new things, explore different stuff, you know, approach problems from a different way, and so we’ll have to see how it goes,” Cutts responded. “I loved the part of my job that dealt with keeping an eye on what important news was happening related to Google, but you know, it’s not clear that having me as a lightning rod, you know for, you know unhappy black hat SEOs or something is the best use of anybody’s time compared to working on other things that could be making the world better for Google or in general. So we’ll see how it all works.”

It doesn’t really sound like he intends to go back to the classic Matt Cutts role. In fact, later in the discussion, he referred to the initial leave as the “official” leave, implying that the one he’s now on is open-ended. Laporte asked him if he has the ability at the company to just do something different if he wants to.
He said, “The interesting thing is that at Google they try to get you to go do different projects, so the product managers, they encourage you to rotate every two or three years, and so it’s relatively rare to find people who have been around forever in a specific area. You’ll find Amit [Singhal] in search, Sridhar [Ramaswamy], you know, some of these people that are really, really senior, you know – higher ranking than me for sure – they do stick around in one area, but a lot of other people jump to different parts of the company to furnish different skills and try different things, which is a pretty good idea, I think.”

Again, it sounds like he would really like to do something different within the company. He also reiterated his confidence in the current webspam team. On his “colleagues” (he prefers that term to “minions”), he said, “I just have so much admiration for you know, for example, last year, there was a real effort on child porn because of some stuff that happened in the United Kingdom, and a lot of people chipped in, and that is not an easy job at all. So you really have to think hard about how you’re gonna try to tackle this kind of thing.”

Jeff Jarvis, who was also on the show, asked Cutts what other things interest him. Cutts responded, “Oh man, I was [into] computer graphics and actually inertial trackers and accelerometers in grad school. At one point I said, you know, you could use commodity hardware, but as a grad student, you don’t have access to influence anybody’s minds, so why don’t I just go do something else for ten years, and somebody else will come up with all these sensors, and sure enough, you’ve got Kinect, you have the Wii, you know, the iPhone. Now everybody’s got a computer in their pocket that can do 3D sensing as long as [you] write the computer programs well. So there’s all kinds of interesting stuff you could do.”

Will we see Matt working on the Android team?
As a matter of fact, Laporte followed that up by mentioning Andy Rubin – the guy who created Android and brought it to Google – leaving the company. News of that came out last week.

Matt later said, “I’ll always have a connection and soft spot for Google…” That’s actually a bit of a mysterious comment. I don’t want to put any words in the guy’s mouth, but to me, that sounds like he’s not married to the company for the long haul.

Either way, webmasters are already getting used to getting updates and helpful videos from Googlers like Pierre Far and John Mueller. We’ve already seen Google roll out new Panda and Penguin updates since Cutts has been on leave, and the SEO world hasn’t come crumbling down. I’m guessing Cutts is getting less hate mail these days. He must have been getting tired of disgruntled website owners bashing him online all the time. It’s got to be nice to not have to deal with that constantly.

As I said at the beginning of the article, it’s really not clear what Matt’s future holds, so all we can really do is listen to what he’s said, and look for him to update people further on his plans. In the meantime, if you miss him, you can peruse the countless webmaster videos and comments he’s made over the years that we’ve covered here.

Do you expect Matt Cutts to return to search in any capacity? Do you expect him to return to Google? Should he? Do you miss him already? Let us know what you think.

Jun 4 2014

Here’s Another Matt Cutts Floating Head Video (About The Most Common SEO Mistake)


We’ll just keep this one short like the video itself. The most common SEO mistake you can make, according to Matt Cutts, is not having a website. Hopefully you feel you’ve gotten your money’s worth on that one. Once again, Cutts uses the ol’ floating head trick. I wonder how many more of these things he’s got. Image via YouTube

May 21 2014

Google Launches Two Algorithm Updates Including New Panda

Google makes changes to its algorithm every day (sometimes multiple changes in one day). When the company actually announces them, you know they’re bigger than the average update, and when one of them is named Panda, it’s going to get a lot of attention. Have you been affected either positively or negatively by new Google updates? Let us know in the comments.

Google’s head of webspam Matt Cutts tweeted about the updates on Tuesday night:

Google is rolling out our Panda 4.0 update starting today. — Matt Cutts (@mattcutts) May 20, 2014

This past weekend we started rolling out a ranking update for very spammy queries: — Matt Cutts (@mattcutts) May 21, 2014

Panda has been refreshed on a regular basis for quite some time now, and Google has indicated in the past that it no longer requires announcements because of that. At one point, it was actually softened. But now, we have a clear announcement about it, and a new version number (4.0), so it must be significant. For one, this indicates that the algorithm was actually updated as opposed to just refreshed, opening up the possibility for some big shuffling of rankings. The company told Search Engine Land that the new Panda affects different languages to different degrees, and impacts roughly 7.5% of queries in English to the degree regular users might notice.

The other update is a new version of what is sometimes referred to as the “payday loans” update. The first one was launched just a little more than a year ago. Cutts discussed it in this video before launching it:

“We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’,” he said. “So we have two different changes that try to tackle those kinds of queries in a couple different ways.
We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”

He also discussed it at SMX Advanced last year. As Barry Schwartz reported at the time:

Matt Cutts explained this goes after unique link schemes, many of which are illegal. He also added this is a world-wide update and is not just being rolled out in the U.S. but being rolled out globally. This update impacted roughly 0.3% of the U.S. queries but Matt said it went as high as 4% for Turkish queries, where web spam is typically higher.

That was then. This time, according to Schwartz, who has spoken with Cutts, it impacts English queries by about 0.2% to a noticeable degree. Sites are definitely feeling the impact of Google’s new updates. Here are a few comments from the WebmasterWorld forum from various webmasters:

We’ve seen a nice jump in Google referrals and traffic over the past couple of days, with the biggest increase on Monday (the announced date of the Panda 4.0 rollout). Our Google referrals on Monday were up by 130 percent….

… I am pulling out my hair. I’ve worked hard the past few months to overcome the Panda from March and was hoping to come out of it with the changes I made. Absolutely no change at all in the SERPS. I guess I’ll have to start looking for work once again.

… While I don’t know how updates are rolled out, my site that has had panda problems since April 2011 first showed evidence of a traffic increase at 5 p.m. (central, US) on Monday (5/19/2014).
… This is the first time I have seen a couple sites I deal with actually get a nice jump in rankings after a Panda…

It appears that eBay has taken a hit. Dr. Peter J. Meyers at Moz found that eBay lost rankings on a variety of keywords, and that the main eBay subdomain fell out of Moz’s “Big 10,” which is its metric of the ten domains with the most real estate in the top 10.

“Over the course of about three days, eBay fell from #6 in our Big 10 to #25,” he writes. “Change is the norm for Google’s SERPs, but this particular change is clearly out of place, historically speaking. eBay has been #6 in our Big 10 since March 1st, and prior to that primarily competed with for either the #6 or #7 place. The drop to #25 is very large. Overall, eBay has gone from right at 1% of the URLs in our data set down to 0.28%, dropping more than two-thirds of the ranking real-estate they previously held.”

He goes on to highlight specific key phrases where eBay lost rankings. It lost top ten rankings for three separate phrases: “fiber optic christmas tree,” “tongue rings,” and “vermont castings”. Each of these, according to Meyers, was a category page on eBay. eBay also fell out of the top ten, according to this report, for queries like “beats by dr dre,” “honeywell thermostat,” “hooked on phonics,” “batman costume,” “lenovo tablet,” “george foreman grill,” and many others. It’s worth noting that eBay tended to be on the lower end of the top ten rankings for these queries. They’re not dropping out of the number one spot, apparently. Either way, this isn’t exactly good news for eBay sellers. Of course, it’s unlikely that Google was specifically targeting eBay with either update, and they could certainly bounce back.

Have you noticed any specific types of sites (or specific sites) that have taken a noticeable hit? Do Google’s results look better in general? Let us know in the comments. Image via Thinkstock

May 5 2014

Google: Links Will Become Less Important

Links are becoming less important as Google gets better at understanding the natural language of users’ queries. That’s the message we’re getting from Google’s latest Webmaster Help video. It will be a while before links become completely irrelevant, but the signal that Google’s algorithm was essentially built upon is going to play less and less of a role as time goes on. Do you think Google should de-emphasize links in its algorithm? Do you think they should count as a strong signal even now? Share your thoughts.

In the video, Matt Cutts takes on this user-submitted question:

Google changed the search engine market in the 90s by evaluating a website’s backlinks instead of just the content, like others did. Updates like Panda and Penguin show a shift in importance towards content. Will backlinks lose their importance?

“Well, I think backlinks have many, many years left in them, but inevitably, what we’re trying to do is figure out how an expert user would say this particular page matched their information needs, and sometimes backlinks matter for that,” says Cutts. “It’s helpful to find out what the reputation of a site or of a page is, but for the most part, people care about the quality of the content on that particular page – the one that they landed on. So I think over time, backlinks will become a little less important.
If we could really be able to tell, you know, Danny Sullivan wrote this article or Vanessa Fox wrote this article – something like that, that would help us understand, ‘Okay, this is something where it’s an expert – an expert in this particular field – and then even if we don’t know who actually wrote something, Google is getting better and better at understanding actual language.”

“One of the big areas that we’re investing in for the coming few months is trying to figure out more like how to do a Star Trek computer, so conversational search – the sort of search where you can talk to a machine, and it will be able to understand you, where you’re not just using keywords,” he adds. You know, things like this:

Cutts continues, “And in order to understand what someone is saying, like, ‘How tall is Justin Bieber?’ and then, you know, ‘When was he born?’ to be able to know what that’s referring to, ‘he’ is referring to Justin Bieber – that’s the sort of thing where in order to do that well, we need to understand natural language more. And so I think as we get better at understanding who wrote something and what the real meaning of that content is, inevitably over time, there will be a little less emphasis on links. But I would expect that for the next few years we will continue to use links in order to assess the basic reputation of pages and of sites.”

Links have always been the backbone of the web. Before Google, they were how you got from one page to the next. One site to the next. Thanks to Google, however (or at least thanks to those trying desperately to game Google, depending on how you look at it), linking is broken. It’s broken as a signal because of said Google gaming, which the search giant continues to fight on an ongoing basis. The very concept of linking is broken as a result of all of this too. Sure, you can still link however you want to whoever you want.
You don’t have to please Google if you don’t care about it, but the reality is, most sites do care, because Google is how the majority of people discover content. As a result of various algorithm changes and manual actions against some sites, many are afraid of the linking that they would have once engaged in. We’ve seen time after time that sites are worried about legitimate sites linking to them because they’re afraid Google might not like it. We’ve seen sites afraid to naturally link to other sites in the first place because they’re afraid Google might not approve. No matter how you slice it, linking isn’t what it used to be, and that’s largely because of Google.

But regardless of what Google does, the web is changing, and much of that is going mobile. That’s a large part of why Google must adapt with features like natural language search. Asking your phone a question is simply a common way of searching. Texting the types of queries you’ve been doing from the desktop for years is just annoying, and when your phone has that nice little microphone icon, which lets you ask Google a question, it’s just the easier choice (in appropriate locations at least). Google is also adapting to this mobile world by indexing content within apps as it does links, so if you’re searching on your phone, you can open content right in the app rather than in the browser.

Last week, Facebook made an announcement taking this concept to another level when it introduced App Links. This is an open source standard (assuming it becomes widely adopted) for apps to link to one another, enabling users to avoid the browser and traditional links altogether by jumping from app to app. It’s unclear how Google will treat App Links, but it would make sense to treat them the same as other links. The point is that linking itself is both eroding and evolving at the same time. It’s changing, and Google has to deal with that as it comes.
As Cutts said, linking will still play a significant role for years to come, but how well Google is able to adapt to the changes in linking remains to be seen. Will it be able to deliver the best content based on links if some of that content is not being linked to because others are afraid to link to it? Will it acknowledge App Links, and if so, what about the issues it’s having? Here’s the “standard” breaking the web, as one guy put it:

What if this does become a widely adopted standard, but proves to be buggy as shown above?

Obviously, Google is trying to give you the answers to your queries on its own with the Knowledge Graph when it can. Other times it’s trying to fill in the gaps in that knowledge with similarly styled answers from websites. It’s unclear how much links fit into the significance of these answers. We’ve seen two examples in recent weeks where Google was turning to parked domains. Other times, the Knowledge Graph just provides erroneous information. As Cutts said, Google will get better and better at natural language, but it’s clear this is the type of search results it wants to provide whenever possible. The problem is it’s not always reliable, and in some cases, the better answer comes from good old-fashioned organic search results (of the link-based variety). We saw an example of this recently, which Google ended up changing after we wrote about it (not saying it was because we wrote about it).

So if backlinks will become less important over time, does that mean traditional organic results will continue to become a less significant part of the Google search experience? It’s certainly already trended in that direction over the years. What do you think? How important should links be to Google’s ranking? Share your thoughts in the comments. Images via YouTube, Google

Apr 8 2014

An Update (Kind Of) On How Google Handles JavaScript

The latest Google Webmaster Help video provides an update on where Google is on handling JavaScript and AJAX. Well, an update on where they were nearly a year ago at least. Matt Cutts responds to this question:

JavaScript is being used more and more to progressively enhance content on page & improve usability. How does Googlebot handle content loaded (AJAX) or displayed (JS & CSS) by JavaScript on pageload, on click?

“Google is pretty good at indexing JavaScript, and being able to render it, and bring it into our search results. So there’s multiple stages that have to happen,” Cutts says. “First off, we try to fetch all the JavaScript, CSS – all those sorts of resources – so that we can put the page under the microscope, and try to figure out, ‘Okay, what parts of this page should be indexed? What are the different tokens or words that should be indexed?’ that sort of thing. Next, you have to render or execute the JavaScript, and so we actually load things up, and we try to pretend as if a real browser is sort of loading that page, and what would that real browser do? Along the way, there are various events you could trigger or fire. There’s the page on load. You could try to do various clicks and that sort of thing, but usually there’s just the JavaScript that would load as you start to load up the page, and that would execute there.”

“Once that JavaScript has all been loaded, which is the important reason why you should always let Google crawl the JavaScript and the CSS – all those sorts of resources – so that we can execute the page,” he continues. “Once we’ve fetched all those resources, we try to render or execute that JavaScript, and then we extract the tokens – the words that we think should be indexed – and we put that into our index.”

“As of today, there’s still a few steps left,” Cutts notes. “For example, that’s JavaScript on the page. What if you have JavaScript that’s injected via an iframe?
We’re still working on pulling in indexable tokens from JavaScript that are accessible via iframes, and we’re getting pretty close to that. As of today, I’d guess that we’re maybe a couple months away although things can vary depending on engineering resources, and timelines, and schedules, and that sort of thing. But at that point, you’ll be able to have even included JavaScript that can add a few tokens to the page or that we can otherwise index.”

It’s worth noting that this video was recorded almost a year ago (May 8th, 2013). That’s how long it can take for Google to release these things sometimes. Cutts notes that his explanation reflects that particular point in time. We’re left to wonder how far Google has really come since then. There’s that transparency we’re always hearing about.

He also notes that Google’s not the only search engine, so you may want to think about what other search engines are able to do. He also says Google reserves the right to put limits on how much it’s going to index or how much time it will spend processing a page. Image via YouTube
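The fetch-then-render pipeline Cutts describes can be illustrated with a toy example (a hypothetical sketch; the page and the naive token extraction are invented for illustration): a crawler that only fetches HTML, without executing JavaScript, sees none of the content a script injects on page load.

```python
import re

# Raw HTML exactly as a non-rendering crawler fetches it. The visible
# content is injected by JavaScript on page load, so it never appears
# in the initial HTML payload.
raw_html = """
<html><body>
<div id="content"></div>
<script>
  document.getElementById('content').innerHTML =
    'Hand-made oak furniture, free shipping';
</script>
</body></html>
"""

def naive_tokens(html):
    """Extract indexable words the way a fetch-only crawler might:
    drop script blocks, strip tags, split on whitespace."""
    html = re.sub(r'<script.*?</script>', ' ', html, flags=re.DOTALL)
    text = re.sub(r'<[^>]+>', ' ', html)
    return text.split()

# Without executing the JavaScript, there is nothing to index:
print(naive_tokens(raw_html))  # []

# A rendering crawler, as Cutts describes, executes the script first
# and extracts tokens from the resulting DOM instead:
rendered_text = 'Hand-made oak furniture, free shipping'
print(rendered_text.split())
```

The gap between the two outputs is exactly why Cutts says you should let Googlebot crawl your JavaScript and CSS: the tokens only exist after rendering.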

Apr 2 2014

Matt Cutts Talks About Coming Google Algorithm Changes

Google seems to have announced some coming changes to its algorithm in the latest “Webmaster Help” video. Head of webspam Matt Cutts said the search engine is working on some changes that will help it better determine when a site is an authority on a topic. He didn’t give any specific dates or anything, but says he’s “looking forward to those rolling out.” Do you think Google is good at determining which sites are authorities on certain topics right now? Do you expect these changes to lead to better results? Let us know what you think in the comments.

The topic came up when Blind Five Year Old asked Cutts, “As Google continues to add social signals to the algorithm, how do you separate simple popularity from true authority?” Cutts says in the video that the first part of that question makes an “assumption” in that Google is using social signals in its ranking algorithm. The rest of the time, he talks about authority vs. popularity more generally, and doesn’t really get into social signals at all. He did recently talk about Facebook and Twitter signals in another video. More on that here. CEO Larry Page has also talked about social signals in search in the past.

Regarding popularity versus authority, Cutts says, “We’ve actually thought about this quite a bit because from the earliest days it would get us really kind of frustrated when we would see reporters talk about PageRank, and say, ‘PageRank is a measure of popularity of websites,’ because that’s not true.” He goes on to talk about how porn sites are popular because a lot of people go to them, but not a lot of people link to them, and how on the other hand, a lot of people link to government websites, but not as many go to them. They want the government sites to have authority, but porn sites not so much.

“You can separate simple popularity from reputation or authority, but now how do we try to figure out whether you’re a good match for a given query?” Cutts continues.
“Well, it turns out you can say, take PageRank for example – if you wanted to do a topical version of PageRank, you could look at the links to a page, and you could say, ‘OK, suppose it’s Matt Cutts. How many of my links actually talk about Matt Cutts?’ And if there are a lot of links or a large fraction of the links, then I’m pretty topical. I’m maybe an authority for the phrase Matt Cutts.”

“It’s definitely the case that you can think about not only taking popularity, and going to something like reputation, which is PageRank, but you could also imagine more topical…’Oh, you’re an authority in the medical space’ or ‘You’re an authority in the travel space’ or something like that. By looking at extra signals where you could say, ‘Oh, you know what? As a percentage of the sorts of things we see you doing well for or whatever, it turns out that your links might be including more anchor text about travel or about medical queries or something like that,’ so it is difficult, but it’s a lot of fun.”

Then we get to the part about the upcoming algorithm changes. “We actually have some algorithmic changes that try to figure out, ‘Hey, this site is a better match for something like a medical query, and I’m looking forward to those rolling out, because a lot of people have worked hard so that you don’t just say, ‘Hey, this is a well-known site, therefore it should match for this query.’ It’s ‘this is a site that actually has some evidence that it should rank for something related to medical queries,’ and that’s something where we can improve the quality of the algorithms even more.”

If they actually work, these changes could indeed provide a boost to search result quality. In fact, this is just the kind of thing that it seemed like the Panda update was originally designed to do. Remember how it was initially referred to as the “farmer” update because it was going after content farms, which were saturating the search results?
Many of those articles from said farms were drowning out authoritative sites on various topics. There is supposed to be a “next generation” Panda update hitting sometime as well, though Cutts didn’t really suggest in the video that this was directly related to that. That one, he said, could help small businesses and small sites.

After the initial Panda update, Google started placing a great deal of emphasis on freshness, which led to a lot of newer content ranking for any given topic. This, in my opinion, didn’t help things much on the authority side of things. Sometimes more authoritative (or frankly more relevant) content was again getting pushed down in favor of newer, less helpful content. I do think things have gotten a bit better on that front over maybe the past year or so, but there’s always room for improvement.

It’s interesting that Google is looking more at authority by topic now, because Cutts has also been suggesting that blogs stay on topic (I guess whatever topic Google thinks you should be writing about) at least when it comes to guest blog posts. As you may know, Google has been cracking down on guest blog posts, and when one site was penalized, Cutts specifically suggested that the topic of one post wasn’t relevant to the blog (even though most people seem to disagree with that). Either way, this is another clue that Google really is looking at authority by topic. It seems like it might be as good a time as any to be creating content geared toward a specific niche.

Do you think these algorithm changes will help or hurt your site? Will they improve Google’s search results? Let us know what you think in the comments.
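The anchor-text idea Cutts sketches above can be shown as a toy calculation. This is a hypothetical sketch only (the domains, anchors, and scoring function are invented for illustration, not Google's actual algorithm): score a page's topical authority by the fraction of its inbound links whose anchor text mentions the topic.

```python
# Toy illustration of the "topical PageRank" idea Cutts describes:
# a page looks more topically authoritative for a phrase when a large
# fraction of its inbound links use anchor text about that phrase.
# The domains and anchors below are invented for this example.

inbound_anchors = {
    "mattcutts.com": [
        "matt cutts", "matt cutts on seo", "matt cutts video",
        "google webspam lead matt cutts", "click here",
    ],
    "general-news.example": [
        "breaking news", "matt cutts interview", "sports scores", "weather",
    ],
}

def topical_score(page, topic):
    """Fraction of inbound links whose anchor text mentions the topic."""
    anchors = inbound_anchors[page]
    matching = sum(1 for anchor in anchors if topic in anchor.lower())
    return matching / len(anchors)

print(topical_score("mattcutts.com", "matt cutts"))         # 0.8
print(topical_score("general-news.example", "matt cutts"))  # 0.25
```

On this toy measure, the first site looks like an authority for the phrase while the second merely mentioned it once, which is the popularity-versus-topical-authority distinction Cutts is drawing.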

Feb 26 2014

Google: You Don’t Have To Dumb Your Content Down ‘That Much’


Google’s Matt Cutts answers an interesting question in a new “Webmaster Help” video: “Should I write content that is easier to read or more scientific? Will I rank better if I write for 6th graders?” Do you think Google should give higher rankings to content that is as well-researched as possible, or content that is easier for the layman to understand? Share your thoughts in the comments.

This is a great question as we begin year three of the post-Panda Google. “This is a really interesting question,” says Cutts. “I spent a lot more time thinking about it than I did a lot of other questions today. I really feel like the clarity of what you write matters a lot.”

He says, “I don’t know if you guys have ever had this happen, but you land on Wikipedia, and you’re searching for information – background information – about a topic, and it’s way too technical. It uses all the scientific terms or it’s talking about a muscle or whatever, and it’s really hyper-scientific, but it’s not all that understandable, and so you see this sort of revival of people who are interested in things like ‘Explain it to me like I’m a five-year-old,’ right?
And you don’t have to dumb it down that much, but if you are erring on the side of clarity, and on the side of something that’s going to be understandable, you’ll be in much better shape because regular people can get it, and then if you want to, feel free to include the scientific terms or the industry jargon, the lingo…whatever it is, but if somebody lands on your page, and it’s just an opaque wall of scientific stuff, you need to find some way to pull people in to get them interested, to get them enticed in trying to pick up whatever concept it is you want to explain.”

Okay, it doesn’t sound so bad the way Cutts describes it, and perhaps I’m coming off a little sensational here, but it’s interesting that Cutts used the phrase, “You don’t have to dumb it down that much.” This is a topic that we discussed last fall when Googler Ryan Moulton said in a conversation on Hacker News, “There’s a balance between popularity and quality that we try to be very careful with. Ranking isn’t entirely one or the other. It doesn’t help to give people a better page if they aren’t going to click on it anyways.” He then elaborated:

Suppose you search for something like [pinched nerve ibuprofen]. The top two results currently are… and… Almost anyone would agree that the mayoclinic result is higher quality. It’s written by professional physicians at a world renowned institution. However, getting the answer to your question requires reading a lot of text. You have to be comfortable with words like “Nonsteroidal anti-inflammatory drugs,” which a lot of people aren’t. Half of people aren’t literate enough to read their prescription drug labels. The answer on yahoo answers is provided by “auntcookie84.” I have no idea who she is, whether she’s qualified to provide this information, or whether the information is correct. However, I have no trouble whatsoever reading what she wrote, regardless of how literate I am. That’s the balance we have to strike.
You could imagine that the most accurate and up to date information would be in the midst of a recent academic paper, but ranking that #1 wouldn’t actually help many people.

This makes for a pretty interesting debate. Should Google bury the most well-researched and accurate information just to help people find something they can read more easily, even if it’s not as high quality? Doesn’t this kind of go against the guidelines Google set forth after the Panda update? You know, like these specific questions Google suggested you ask about your content:

“Would you trust the information presented in this article?” (What’s more trustworthy, the scientific explanation from a reputable site or auntcookie’s take on Yahoo Answers?)

“Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?” (Uh…)

“Does the article provide original content or information, original reporting, original research, or original analysis?” (Original research and analysis, to me, suggests that someone is going to know and use the lingo.)

“Does the page provide substantial value when compared to other pages in search results?” (Couldn’t value include educating me about the terminology I might not otherwise understand?)

“Is the site a recognized authority on its topic?” (You mean the type of authority that would use the terminology associated with the topic?)

“For a health related query, would you trust information from this site?” (Again, are you really trusting auntcookie on Yahoo Answers over Mayo Clinic?)

“Does this article provide a complete or comprehensive description of the topic?” (Hmm. Complete and comprehensive. You mean as opposed to dumbed down for the layman?)

“Does this article contain insightful analysis or interesting information that is beyond obvious?” (I’m not making this up. Here’s Google’s blog post listing these right here.)

“Are the pages produced with great care and attention to detail vs. less attention to detail?” (You get the idea.)

Maybe I’m missing something, but it seems like Google has been encouraging people to make their content as thorough, detailed, and authoritative as possible. I don’t see “Is your content dumbed down for clarity’s sake?” on the list. Of course, that was nearly three years ago.

If quality is really the goal (as Google has said over and over again in the past), doesn’t the responsibility for additional research and additional clicking of links rest with the searcher? If I don’t understand what the most accurate and relevant result is saying, isn’t it my responsibility to continue to educate myself, perhaps by looking at other sources of information and looking up the things I don’t understand? But that would go against Google trying to get users answers as quickly as possible. That must be why Google is trying to give you the answers itself rather than sending you to third-party sites. Too bad those answers aren’t always reliable.

Cutts continues in the video, “So I would argue first and foremost, you need to explain it well, and then if you can manage to do that while talking about the science or being scientific, that’s great, but the clarity of what you do, and how you explain it often matters almost as much as what you’re actually saying because if you’re saying something important, but you can’t get it across, then sometimes you never got it across in the first place, and it ends up falling on deaf ears.”

Okay, sure, but isn’t this just going to encourage publishers to dumb down content at the risk of educating users less? I don’t think that’s what Cutts is trying to say here, but people are going to do anything they can to get their sites ranked better. At least he suggests trying to use both layman’s terms and the more scientific stuff. “It varies,” he says.
“If you’re talking only to industry professionals – exterminators who are talking about the scientific names of bugs, and your audience is only, you know, exterminator experts – sure, then that might make sense, but in general, I would try to make things as natural sounding as possible – even to the degree that when I’m writing a blog post, I’ll sometimes read it out loud to try to catch what the snags are where things are gonna be unclear. Anything you do like that, you’ll end up with more polished writing, and that’s more likely to stand the test of time than scientific mumbo jumbo that you just spit out really quickly.”

I’m not sure where the spitting stuff out really quickly thing comes into play here. The “scientific mumbo jumbo” (otherwise known as facts and actual terminology of things) tends to appear in lengthy, texty content, like Moulton suggested, no?

Google, of course, is trying to get better at natural language with updates like Hummingbird and various other acquisitions and tweaks. It should only help if you craft your content around that.

“It’s not going to make that much of a difference as far as ranking,” Cutts says. “I would think about the words that a user is going to type, which is typically going to be the layman’s terms – the regular words rather than the super scientific stuff – but you can find ways to include both of them, but I would try to err on the side of clarity if you can.”

So yeah, dumb it down. But not too much. Just enough. But also include the smart stuff. Just don’t make it too smart.

What do you think? Should Google dumb down search results to give users things that are easier to digest, or should it be the searcher’s responsibility to do further research if they don’t understand the accurate and well-researched information they’re consuming? Either way, isn’t this kind of a mixed message compared to the guidance Google has always given regarding “quality” content?
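The submitted question asks about writing for 6th graders, and readability formulas like Flesch-Kincaid estimate exactly that kind of grade level. A minimal sketch of the idea in Python is below; the syllable counter is a crude vowel-group heuristic, and the sample sentences are made up for illustration, not taken from any of the pages discussed:

```python
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels.
    groups = re.findall(r"[aeiouy]+", word.lower())
    count = len(groups)
    # A trailing silent 'e' usually doesn't add a syllable.
    if word.lower().endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    # Split into sentences and words with simple regexes.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade-level formula:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

plain = "Ibuprofen can ease the pain of a pinched nerve. Take it with food."
jargon = ("Nonsteroidal anti-inflammatory pharmacotherapy attenuates "
          "radiculopathic nociception secondary to neural impingement.")
print(flesch_kincaid_grade(plain))
print(flesch_kincaid_grade(jargon))
```

The plain sentence scores around an elementary-school grade level while the jargon-heavy one scores far higher, which is the gap Moulton's Mayo Clinic vs. Yahoo Answers example is really about.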
Share your thoughts. For the record, I have nothing against auntcookie. I know nothing about auntcookie, but that’s kind of the point.

Dec 19 2013

Google: Your Various ccTLDs Will Probably Be Fine From The Same IP Address

Ever wondered if Google would mind if you had multiple ccTLD sites hosted from a single IP address? If you’re afraid they might not take kindly to that, you’re in for some good news. It’s not really that big a deal. Google’s Matt Cutts may have just saved you some time and money with this one. He takes on the following submitted question in the latest Webmaster Help video:

For one customer we have about a dozen individual websites for different countries and languages, with different TLDs under one IP number. Is this okay for Google or do you prefer one IP number per country TLD?

“In an ideal world, it would be wonderful if you could have, for every different .com, .fr, .de, a different, separate IP address for each one of those, and have them each placed in the UK, or France, or Germany, or something like that,” says Cutts. “But in general, the main thing is, as long as you have different country code top-level domains, we are able to distinguish between them. So it’s definitely not the end of the world if you need to put them all on one IP address. We do take the top-level domain as a very strong indicator.”

“So if it’s something where it’s a lot of money or it’s a lot of hassle to set that sort of thing up, I wouldn’t worry about it that much,” he adds. “Instead, I’d just go ahead and say, ‘You know what? I’m gonna go ahead and have all of these domains on one IP address, and just let the top-level domain give the hint about what country it’s in.’ I think it should work pretty well either way.”

While on the subject, you might want to listen to what Cutts had to say about location and ccTLDs earlier this year in another video.
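In practice, the setup Cutts is blessing here is ordinary name-based virtual hosting: one server, one IP, with the web server routing requests by hostname. A sketch of what that might look like in an nginx configuration (the domain names and paths are hypothetical placeholders, not from the video):

```nginx
# Hypothetical: three ccTLD sites served from a single IP address.
# Per Cutts, Google uses the ccTLD itself as a strong country signal,
# so separate per-country IP addresses aren't required.
server {
    listen 80;
    server_name example.com;    # international / .com audience
    root /var/www/example.com;
}
server {
    listen 80;
    server_name example.fr;     # French audience
    root /var/www/example.fr;
}
server {
    listen 80;
    server_name example.de;     # German audience
    root /var/www/example.de;
}
```

Each `server` block is matched by the `Host` header of the incoming request, so all three ccTLDs can resolve to the same address without Google confusing their country targeting.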

Dec 9 2013

Google Gives Advice On Speedier Penalty Recovery

Google has shared some advice in a new Webmaster Help video about recovering from Google penalties incurred as the result of a period of spammy links. Now, as we’ve seen, sometimes this happens to a company unintentionally. A business could have hired the wrong person/people to do their SEO work, and gotten their site banished from Google without even realizing they were doing anything wrong. Remember when Google had to penalize its own Chrome landing page because a third-party firm bent the rules on its behalf?

Google is cautiously suggesting “radical” actions from webmasters, and sending a bit of a mixed message. How far would you go to get back in Google’s good graces? How important is Google to your business’ survival? Share your thoughts in the comments.

The company’s head of webspam, Matt Cutts, took on the following question:

How did Interflora turn their ban in 11 days? Can you explain what kind of penalty they had, how did they fix it, as some of us have spent months try[ing] to clean things up after an unclear GWT notification.

As you may recall, Interflora, a major UK flowers site, was hit with a Google penalty early this year. Google didn’t exactly call out the company publicly, but after reports of the penalty came out, the company mysteriously wrote a blog post warning people not to engage in the buying and selling of links. But you don’t have to buy and sell links to get hit with a Google penalty for webspam, and Cutts’ response goes beyond that. He declines to discuss a specific company because that’s typically not Google’s style, but proceeds to try and answer the question in more general terms.

“Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines, and if we see that happening multiple times – repeated times – then the actions that we take get more and more severe, so we’re more willing to take stronger action whenever we see repeat violations,” he says.
That’s the first thing to keep in mind if you’re trying to recover. Don’t try to recover by breaking the rules more, because that will just make Google’s vengeance all the greater when it inevitably catches you. Google continues to bring the hammer down on any black hat link network it can get its hands on, by the way. Just the other day, Cutts noted that Google has taken out a few of them, following a larger trend that has been going on throughout the year.

The second thing to keep in mind is that Google wants to know you’re taking its guidelines seriously, and that you really do want to get better – you really do want to play by the rules.

“If a company were to be caught buying links, it would be interesting if, for example, you knew that it started in the middle of 2012, and ended in March 2013 or something like that,” Cutts continues in the video. “If a company were to go back and disavow every single link that they had gotten in 2012, that’s a pretty monumentally epic, large action. So that’s the sort of thing where a company is willing to say, ‘You know what? We might have had good links for a number of years, and then we just had really bad advice, and somebody did everything wrong for a few months – maybe up to a year, so just to be safe, let’s just disavow everything in that timeframe.’ That’s a pretty radical action, and that’s the sort of thing where if we heard back in a reconsideration request that someone had taken that kind of a strong action, then we could look, and say, ‘Okay, this is something that people are taking seriously.’”

Now, don’t go getting carried away. Google has been pretty clear since the Disavow Links tool launched that this isn’t something that most people want to do.
Cutts reiterates, “So it’s not something that I would typically recommend for everybody – to disavow every link that you’ve gotten for a period of years – but certainly when people start over with completely new websites they bought – we have seen a few cases where people will disavow every single link because they truly want to get a fresh start. It’s a nice looking domain, but the previous owners had just burned it to a crisp in terms of the amount of webspam that they’ve done. So typically what we see from a reconsideration request is people starting out, and just trying to prune a few links. A good reconsideration request is often using the ‘domain:’ query, and taking out large amounts of domains which have bad links.”

“I wouldn’t necessarily recommend going and removing everything from the last year or everything from the last year and a half,” he adds. “But that sort of large-scale action, if taken, can have an impact whenever we’re assessing a domain within a reconsideration request.”

In other words, if you’re willing to go to such great lengths and eliminate such a large number of links, Google’s going to notice. I don’t know that it’s going to get you out of the penalty box in eleven days (as the Interflora question mentions), but it will at least show Google that you mean business, and, in theory at least, help you get out of it.

Much of what Cutts has to say this time around echoes things he has mentioned in the past. Earlier this year, he suggested using the Disavow Links tool like a “machete”. He noted that Google sees a lot of people trying to go through their links with a fine-toothed comb, when they should really be taking broader swipes. “For example, often it would help to use the ‘domain:’ operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links,” he said.
“That’s one reason why we sometimes see it take a while to clean up those old, not-very-good links.”

On another occasion, he discussed some common mistakes he sees people making with the Disavow Links tool. The first time someone attempts a reconsideration request, people take the scalpel (or “fine-toothed comb”) approach, rather than the machete approach.

“You need to go a little bit deeper in terms of getting rid of the really bad links,” he said. “So, for example, if you’ve got links from some very spammy forum or something like that, rather than trying to identify the individual pages, that might be the opportunity to do a ‘domain:’. So if you’ve got a lot of links that you think are bad from a particular site, just go ahead and do ‘domain:’ and the name of that domain. Don’t maybe try to pick out the individual links because you might be missing a lot more links.”

And remember, you need to make sure you’re using the right syntax. You need to use the “domain:” query in the format domain:example.com. Don’t add an “http” or a “www” or anything like that. Just the domain.

So, just to recap: Radical, large-scale actions could be just what you need to take to make Google seriously reconsider your site, and could get things moving more quickly than trying to single out links from domains. But Google wouldn’t necessarily recommend doing it. Oh, Google. You and your crystal clear, never-mixed messaging.

As Max Minzer commented on YouTube (or is that Google+?), “everyone is going to do exactly that now…unfortunately.” Yes, this advice will no doubt lead many to unnecessarily obliterate many of the backlinks they’ve accumulated – including legitimate links – for fear of Google. Fear they won’t be able to make that recovery at all, let alone quickly. Hopefully the potential for overcompensation will be considered if Google decides to use Disavow Links as a ranking signal.

Would you consider having Google disavow all links from a year’s time?
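To make the syntax Cutts describes concrete, a disavow file mixing the broad “domain:” operator with individual URLs might look like this (all domains and URLs here are hypothetical placeholders):

```text
# Hypothetical disavow file. One entry per line.
# The "domain:" form is the "machete": it disavows every link
# from that domain. No scheme, no "www" -- just the bare domain.
domain:spammy-link-network.example
domain:paid-links.example

# Individual URLs (the "scalpel") can also be listed in full,
# though Cutts suggests the broader domain: form for bad sites.
http://forum.example/thread?id=123
```

The file is uploaded through the Disavow Links tool rather than placed on your site, and as the article stresses, it's a last resort, not routine maintenance.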
Share your thoughts in the comments .