Nov 6 2014

Is The Matt Cutts Era Over?

It’s not 100% clear yet, but it’s looking like, for webmasters and SEOs, the era of Matt Cutts is a thing of the past. His career at Google may continue, but it doesn’t sound like he’ll be the head of webspam going forward. Would you like to see Matt Cutts return to the role he’s held for years, or do you look forward to change in the search department? Share your thoughts in the comments.

It’s a pretty interesting time in search right now. Matt Cutts, who has been the go-to guy for webmaster help and Q&A related to Google search for quite a few years, has been on leave from the company since July. Meanwhile, his counterpart over at Bing has been let go from his duties at Microsoft.

@DuaneForrester sending you good thoughts today. Thanks for providing info to so many people and tough love when needed. — Matt Cutts (@mattcutts) October 30, 2014

When Cutts announced his leave, he didn’t really make it sound like he wouldn’t be back, but rather like he would be taking a nice, long, much-deserved vacation. He wrote on his blog:

I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later, I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work. So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

Scheduled to be gone through October. See? Pretty much sounds like a vacation. As you know, October has since come and gone. On October 31, Cutts provided another update, saying he was extending his leave and wouldn’t be back at Google this year.

I'm planning to extend my leave into 2015: https://t.co/T5adq50x4L — Matt Cutts (@mattcutts) November 1, 2014

Ok, fine. Cutts has been at Google for fourteen years, and can probably take a considerable amount of time off with no problem. But he’d be back in the swing of things in the new year, right? Well, he might be back, but what he’ll be doing remains to be seen.

Cutts appeared on the web chat show This Week in Google, hosted by Leo Laporte, who asked him if he’ll go back to the same role, or if this is a chance for him to try something different. This part of the conversation starts at about 9 minutes and 50 seconds into the video below (h/t: Search Engine Roundtable).

“Well, I really have been impressed with how well everyone else on the team is doing, and it’s created a little bit of an opportunity for them to try new things, explore different stuff, you know, approach problems from a different way, and so we’ll have to see how it goes,” Cutts responded. “I loved the part of my job that dealt with keeping an eye on what important news was happening related to Google, but you know, it’s not clear that having me as a lightning rod, you know for, you know unhappy black hat SEOs or something is the best use of anybody’s time compared to working on other things that could be making the world better for Google or in general. So we’ll see how it all works.”

It doesn’t really sound like he intends to go back to the classic Matt Cutts role.
In fact, later in the discussion, he referred to the initial leave as the “official” leave, implying that the one he’s now on is open-ended. Laporte asked him if he has the ability at the company to just do something different if he wants to.

He said, “The interesting thing is that at Google they try to get you to go do different projects, so the product managers, they encourage you to rotate every two or three years, and so it’s relatively rare to find people who have been around forever in a specific area. You’ll find Amit [Singhal] in search, Sridhar [Ramaswamy], you know, some of these people that are really, really senior, you know – higher ranking than me for sure – they do stick around in one area, but a lot of other people jump to different parts of the company to furnish different skills and try different things, which is a pretty good idea, I think.”

Again, it sounds like he would really like to do something different within the company. He also reiterated his confidence in the current webspam team. On his “colleagues” (he prefers that term to “minions”), he said, “I just have so much admiration for, you know, for example, last year, there was a real effort on child porn because of some stuff that happened in the United Kingdom, and a lot of people chipped in, and that is not an easy job at all. So you really have to think hard about how you’re gonna try to tackle this kind of thing.”

Jeff Jarvis, who was also on the show, asked Cutts what other things interest him. Cutts responded, “Oh man, I was [into] computer graphics and actually inertial trackers and accelerometers in grad school. At one point I said, you know, you could use commodity hardware, but as a grad student, you don’t have access to influence anybody’s minds, so why don’t I just go do something else for ten years, and somebody else will come up with all these sensors, and sure enough, you’ve got Kinect, you have the Wii, you know, the iPhone. Now everybody’s got a computer in their pocket that can do 3D sensing as long as [you] write the computer programs well. So there’s all kinds of interesting stuff you could do.”

Will we see Matt working on the Android team? As a matter of fact, Laporte followed that up by mentioning Andy Rubin – the guy who created Android and brought it to Google – leaving the company. News of that came out last week.

Matt later said, “I’ll always have a connection and soft spot for Google…” That’s a bit more mysterious of a comment. I don’t want to put any words in the guy’s mouth, but to me, that sounds like he’s not married to the company for the long haul.

Either way, webmasters are already growing accustomed to getting updates and helpful videos from Googlers like Pierre Far and John Mueller. We’ve already seen Google roll out new Panda and Penguin updates since Cutts has been on leave, and the SEO world hasn’t come crumbling down.

I’m guessing Cutts is getting less hate mail these days. He must have been getting tired of disgruntled website owners bashing him online all the time. It’s got to be nice not to have to deal with that.

As I said at the beginning of the article, it’s really not clear what Matt’s future holds, so all we can really do is listen to what he’s said, and look for him to update people further on his plans. In the meantime, if you miss him, you can peruse the countless webmaster videos and comments he’s made over the years that we’ve covered here.

Do you expect Matt Cutts to return to search in any capacity? Do you expect him to return to Google? Should he?
Do you miss him already? Let us know what you think.

May 12 2014

What Have Google’s Biggest Mistakes Been?

Do you feel like Google makes many mistakes when it comes to trying to improve its search results? Do you think they’ve gone overboard or not far enough with regard to some aspect of spam-fighting? In the latest Google Webmaster Help video, head of webspam Matt Cutts talks about what he views as mistakes that he has made. He discusses two particular mistakes, which both involve things he thinks Google just didn’t address quickly enough: paid links and content farms. What do you think is the biggest mistake Google has made? Share your thoughts in the comments.

The exact viewer-submitted question Cutts responds to is: “Was there a key moment in your spam fighting career where you made a mistake that you regret, related to spam?”

Cutts recalls, “I remember talking to a very well-known SEO at a search conference in San Jose probably seven years ago (give or take), and that SEO said, ‘You know what? Paid links are just too prevalent. They’re too common. There’s no way that you guys would be able to crack down on them, and enforce that, and come up with good algorithms or take manual action to sort of put the genie back in the bottle,’ as he put it. That was when I realized I’d made a mistake – that we’d allowed paid links that pass PageRank to go a little bit too far and become a little bit too common on the web.”

“So in the early days of 2005, 2006, you’d see Google cracking down a lot more aggressively, and taking a pretty hard line on our rhetoric about paid links that pass PageRank,” he continues. “At this point, most people know that Google disapproves of it, it probably violates the Federal Trade Commission’s guidelines, all those sorts of things. We have algorithms to target it. We take spam reports about it, and so for the most part, people realize it’s not a good idea, and if they do that, they might face the consequences, and so for the most part, people try to steer clear of paid links that pass PageRank at this point. But we probably waited too long before we started to take a strong stand on that particular issue.”

Yes, most people who engage in paid links are probably aware of Google’s stance on this. In most cases, gaming Google is probably the ultimate goal. That doesn’t mean they’re not doing it, though, and it also doesn’t mean that Google’s catching most of those doing it. How would we know? We’re not going to hear about them unless they do get caught, but who’s to say there aren’t still many, many instances of paid links influencing search results as we speak?

The other mistake Cutts talks about will be fun for anyone who has ever been affected by the Panda update (referred to repeatedly as the “farmer” update in its early days).

Cutts continues, “Another mistake that I remember is there was a group of content farms, and we were getting some internal complaints where people said, ‘Look, this website or that website is really bad. It’s just poor quality stuff. I don’t know whether you’d call it spam or low-quality, but it’s a really horrible user experience.’ And I had been to one particular page on one of these sites because at one point my toilet was running, and I was like, ‘Ok, how do you diagnose a toilet running?’ and I had gotten a good answer from that particular page, and I think I might have over-generalized a little bit, and been like, ‘No, no. There’s lots of great quality content on some of these sites because look, here was this one page that helped solve the diagnostic of why does your toilet run, and how do you fix it, and all that sort of stuff.’”

“And the mistake that I made was judging from that one anecdote, and not doing larger scale samples and listening to the feedback, or looking at more pages on the site,” he continues. “And so I think it took us a little bit longer to realize that some of these lower-quality sites or content farms or whatever you want to call them were sort of mass-creating pages rather than really solving users’ needs with fantastic content. And so as a result, I think we did wake up to that, and started working on it months before it really became wide-scale in terms of complaints, but we probably could’ve been working on it even earlier.”

The complaints were pretty loud and frequent by the time the Panda update was first pushed, but it sounds like it could have been rolled out (and hurt more sites) a lot earlier than it eventually was. You have to wonder how that would have changed things. Would the outcome have been different if it had been pushed out months before it was?

“Regardless, we’re always looking for good feedback,” says Cutts. “We’re always looking for what are we missing? What do we need to do to make our web results better quality? And so anytime we roll something out, there’s always the question of, ‘Could you have thought of some way to stop that or to take better action or a more clever algorithm, and could you have done it sooner?’ I feel like Google does a lot of great work, and that’s very rewarding, and we feel like, ‘Okay, we have fulfilled our working hours with meaningful work,’ and yet at the same time, you always wonder could you be doing something better. Could you find a cleaner way to do it – a more elegant way to do it – something with higher precision, higher recall? And that’s okay. It’s healthy for us to be asking ourselves that.”

It’s been a while since Google pushed out any earth-shattering algorithm updates. Is there something Google is missing right now that Cutts is going to look back on, and wonder why Google didn’t do something earlier? Would you say that Google’s results are better as a result of its actions against paid links and content farms? What do you think Google’s biggest mistake has been? Let us know in the comments.
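Precision and recall, which Cutts mentions in passing, are the standard yardsticks for any spam-fighting algorithm. Here’s a minimal sketch of how the trade-off he describes gets measured – my illustration using the textbook definitions, not anything from Google:

```python
def precision_recall(predicted_spam, actual_spam):
    """Compute precision and recall for a set of pages flagged as spam.

    predicted_spam: set of page IDs the algorithm flagged
    actual_spam: set of page IDs that really are spam
    """
    true_positives = len(predicted_spam & actual_spam)
    precision = true_positives / len(predicted_spam) if predicted_spam else 0.0
    recall = true_positives / len(actual_spam) if actual_spam else 0.0
    return precision, recall

# Hypothetical example: 4 pages flagged, 5 truly spam, one false alarm.
flagged = {"page1", "page2", "page3", "page9"}  # page9 is a false positive
truly_spam = {"page1", "page2", "page3", "page4", "page5"}
p, r = precision_recall(flagged, truly_spam)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.75 recall=0.60
```

Tightening an algorithm usually raises precision (fewer innocent sites hit) at the cost of recall (more spam slips through), which is exactly the tension behind Cutts wondering whether they “could have done it sooner.”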

May 5 2014

Google: Links Will Become Less Important

Links are becoming less important as Google gets better at understanding the natural language of users’ queries. That’s the message we’re getting from Google’s latest Webmaster Help video. It will be a while before links become completely irrelevant, but the signal that Google’s algorithm was basically built upon is going to play less and less of a role as time goes on. Do you think Google should de-emphasize links in its algorithm? Do you think they should count as a strong signal even now? Share your thoughts.

In the video, Matt Cutts takes on this user-submitted question: “Google changed the search engine market in the 90s by evaluating a website’s backlinks instead of just the content, like others did. Updates like Panda and Penguin show a shift in importance towards content. Will backlinks lose their importance?”

“Well, I think backlinks have many, many years left in them, but inevitably, what we’re trying to do is figure out how an expert user would say this particular page matched their information needs, and sometimes backlinks matter for that,” says Cutts. “It’s helpful to find out what the reputation of a site or of a page is, but for the most part, people care about the quality of the content on that particular page – the one that they landed on. So I think over time, backlinks will become a little less important. If we could really be able to tell, you know, Danny Sullivan wrote this article or Vanessa Fox wrote this article – something like that, that would help us understand, ‘Okay, this is something where it’s an expert – an expert in this particular field,’ and then even if we don’t know who actually wrote something, Google is getting better and better at understanding actual language.”

“One of the big areas that we’re investing in for the coming few months is trying to figure out more like how to do a Star Trek computer, so conversational search – the sort of search where you can talk to a machine, and it will be able to understand you, where you’re not just using keywords,” he adds.

Cutts continues, “And in order to understand what someone is saying, like, ‘How tall is Justin Bieber?’ and then, you know, ‘When was he born?’ – to be able to know what that’s referring to, ‘he’ is referring to Justin Bieber – that’s the sort of thing where in order to do that well, we need to understand natural language more. And so I think as we get better at understanding who wrote something and what the real meaning of that content is, inevitably over time, there will be a little less emphasis on links. But I would expect that for the next few years we will continue to use links in order to assess the basic reputation of pages and of sites.”

Links have always been the backbone of the web. Before Google, they were how you got from one page to the next. One site to the next. Thanks to Google, however (or at least thanks to those trying desperately to game Google, depending on how you look at it), linking is broken. It’s broken as a signal because of said Google gaming, which the search giant continues to fight on an ongoing basis. The very concept of linking is broken as a result of all of this too. Sure, you can still link however you want to whomever you want. You don’t have to please Google if you don’t care about it, but the reality is, most sites do care, because Google is how the majority of people discover content. As a result of various algorithm changes and manual actions against some sites, many are afraid of the linking they would once have engaged in.

We’ve seen time after time that sites are worried about legitimate sites linking to them because they’re afraid Google might not like it. We’ve seen sites afraid to naturally link to other sites in the first place because they’re afraid Google might not approve. No matter how you slice it, linking isn’t what it used to be, and that’s largely because of Google.

But regardless of what Google does, the web is changing, and much of that change is going mobile. That’s a large part of why Google must adapt with this natural language search. Asking your phone a question is simply a common way of searching. Typing the kinds of queries you’ve been doing from the desktop for years is just annoying on a phone, and when your phone has that nice little microphone icon that lets you ask Google a question, it’s just the easier choice (in appropriate locations, at least).

Google is also adapting to this mobile world by indexing content within apps as it does links, so if you’re searching on your phone, you can open content right in the app rather than in the browser. Last week, Facebook made an announcement taking this concept to another level when it introduced App Links. This is an open source standard (assuming it becomes widely adopted) for apps to link to one another, enabling users to avoid the browser and traditional links altogether by jumping from app to app. It’s unclear how Google will treat App Links, but it would make sense to treat them the same as other links.

The point is that linking itself is both eroding and evolving at the same time. It’s changing, and Google has to deal with that as it comes. As Cutts said, linking will still play a significant role for years to come, but how well Google is able to adapt to the changes in linking remains to be seen. Will it be able to deliver the best content based on links if some of that content is not being linked to because others are afraid to link to it? Will it acknowledge App Links, and if so, what about the issues the standard is already having? One developer has already complained that App Links is breaking the web. What if it does become a widely adopted standard, but proves to be buggy?

Obviously, Google is trying to give you the answers to your queries on its own with the Knowledge Graph when it can. Other times it’s trying to fill in the gaps in that knowledge with similarly styled answers from websites. It’s unclear how much links figure into the selection of these answers. We’ve seen two examples in recent weeks where Google was turning to parked domains. Other times, the Knowledge Graph just provides erroneous information. As Cutts said, Google will get better and better at natural language, but it’s clear this is the type of search result it wants to provide whenever possible. The problem is it’s not always reliable, and in some cases, the better answer comes from good old-fashioned organic search results (of the link-based variety). We saw an example of this recently, which Google ended up changing after we wrote about it (not saying it was because we wrote about it).

So if backlinks will become less important over time, does that mean traditional organic results will continue to become a less significant part of the Google search experience? It’s certainly already trended in that direction over the years.

What do you think? How important should links be to Google’s ranking? Share your thoughts in the comments.
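To make the App Links idea concrete, here’s a small sketch of how a crawler or client might read a page’s App Links annotations. The property names (al:ios:url, al:android:package, and so on) come from the published applinks.org spec; the parsing code and the sample markup are my illustration, not anything Google or Facebook ships:

```python
from html.parser import HTMLParser

# Hypothetical page markup using App Links meta tags, per the applinks.org spec.
SAMPLE_PAGE = """
<html><head>
  <meta property="al:ios:url" content="example://docs/123" />
  <meta property="al:ios:app_store_id" content="12345" />
  <meta property="al:android:url" content="example://docs/123" />
  <meta property="al:android:package" content="com.example.app" />
  <meta property="al:web:url" content="http://example.com/docs/123" />
</head><body></body></html>
"""

class AppLinksParser(HTMLParser):
    """Collects al:* meta properties so a client can deep-link into an app."""
    def __init__(self):
        super().__init__()
        self.app_links = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        prop = attrs.get("property", "")
        if prop.startswith("al:"):
            self.app_links[prop] = attrs.get("content")

parser = AppLinksParser()
parser.feed(SAMPLE_PAGE)
for prop, value in parser.app_links.items():
    print(prop, "->", value)
# A mobile client would try al:ios:url / al:android:url first,
# then fall back to al:web:url in the browser.
```

The fallback chain is the whole point of the standard: the page advertises app destinations, and anything that can’t handle them still gets a plain web URL.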

Feb 26 2014

Google: You Don’t Have To Dumb Your Content Down ‘That Much’


Google’s Matt Cutts answers an interesting question in a new “Webmaster Help” video: “Should I write content that is easier to read or more scientific? Will I rank better if I write for 6th graders?” Do you think Google should give higher rankings to content that is as well-researched as possible, or to content that is easier for the layman to understand? Share your thoughts in the comments.

This is a great question as we begin year three of the post-Panda Google. “This is a really interesting question,” says Cutts. “I spent a lot more time thinking about it than I did a lot of other questions today. I really feel like the clarity of what you write matters a lot.”

He says, “I don’t know if you guys have ever had this happen, but you land on Wikipedia, and you’re searching for information – background information – about a topic, and it’s way too technical. It uses all the scientific terms or it’s talking about a muscle or whatever, and it’s really hyper-scientific, but it’s not all that understandable, and so you see this sort of revival of people who are interested in things like ‘Explain it to me like I’m a five-year-old,’ right? And you don’t have to dumb it down that much, but if you are erring on the side of clarity, and on the side of something that’s going to be understandable, you’ll be in much better shape because regular people can get it, and then if you want to, feel free to include the scientific terms or the industry jargon, the lingo…whatever it is, but if somebody lands on your page, and it’s just an opaque wall of scientific stuff, you need to find some way to pull people in to get them interested, to get them enticed in trying to pick up whatever concept it is you want to explain.”

Okay, it doesn’t sound so bad the way Cutts describes it, and perhaps I’m coming off a little sensational here, but it’s interesting that Cutts used the phrase, “You don’t have to dumb it down that much.”

This is a topic we discussed last fall, when Googler Ryan Moulton said in a conversation on Hacker News, “There’s a balance between popularity and quality that we try to be very careful with. Ranking isn’t entirely one or the other. It doesn’t help to give people a better page if they aren’t going to click on it anyways.” He then elaborated:

Suppose you search for something like [pinched nerve ibuprofen]. The top two results currently are http://www.mayoclinic.com/health/pinched-nerve/DS00879/DSECT… and http://answers.yahoo.com/question/index?qid=20071010035254AA… Almost anyone would agree that the mayoclinic result is higher quality. It’s written by professional physicians at a world renowned institution. However, getting the answer to your question requires reading a lot of text. You have to be comfortable with words like “Nonsteroidal anti-inflammatory drugs,” which a lot of people aren’t. Half of people aren’t literate enough to read their prescription drug labels: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1831578/

The answer on yahoo answers is provided by “auntcookie84.” I have no idea who she is, whether she’s qualified to provide this information, or whether the information is correct. However, I have no trouble whatsoever reading what she wrote, regardless of how literate I am. That’s the balance we have to strike. You could imagine that the most accurate and up to date information would be in the midst of a recent academic paper, but ranking that at 1 wouldn’t actually help many people.

This makes for a pretty interesting debate. Should Google bury the most well-researched and accurate information just to help people find something that they can read more easily, even if it’s not as high quality? Doesn’t this kind of go against the guidelines Google set forth after the Panda update? You know, like these specific questions Google suggested you ask about your content:

“Would you trust the information presented in this article?” (What’s more trustworthy, the scientific explanation from a reputable site or auntcookie’s take on Yahoo Answers?)

“Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?” (Uh…)

“Does the article provide original content or information, original reporting, original research, or original analysis?” (Original research and analysis, to me, suggests that someone is going to know and use the lingo.)

“Does the page provide substantial value when compared to other pages in search results?” (Couldn’t value include educating me about the terminology I might not otherwise understand?)

“Is the site a recognized authority on its topic?” (You mean the type of authority that would use the terminology associated with the topic?)

“For a health related query, would you trust information from this site?” (Again, are you really trusting auntcookie on Yahoo Answers over Mayo Clinic?)

“Does this article provide a complete or comprehensive description of the topic?” (Hmm. Complete and comprehensive. You mean as opposed to dumbed down for the layman?)

“Does this article contain insightful analysis or interesting information that is beyond obvious?” (I’m not making this up. Here’s Google’s blog post listing these right here.)

“Are the pages produced with great care and attention to detail vs. less attention to detail?” (You get the idea.)

Maybe I’m missing something, but it seems like Google has been encouraging people to make their content as thorough, detailed, and authoritative as possible. I don’t see “Is your content dumbed down for clarity’s sake?” on the list. Of course, that was nearly three years ago.

If quality is really the goal (as Google has said over and over again in the past), doesn’t the responsibility of additional research and additional clicking of links rest with the searcher? If I don’t understand what the most accurate and relevant result is saying, isn’t it my responsibility to continue to educate myself, perhaps by looking at other sources of information and looking up the things I don’t understand? But that would go against Google trying to get users answers as quickly as possible. That must be why Google is trying to give you the answers itself rather than having to send you to third-party sites. Too bad those answers aren’t always reliable.

Cutts continues in the video, “So I would argue first and foremost, you need to explain it well, and then if you can manage to do that while talking about the science or being scientific, that’s great, but the clarity of what you do, and how you explain it often matters almost as much as what you’re actually saying because if you’re saying something important, but you can’t get it across, then sometimes you never got it across in the first place, and it ends up falling on deaf ears.”

Okay, sure, but isn’t this just going to encourage site owners to dumb down content at the risk of educating users less? I don’t think that’s what Cutts is trying to say here, but people are going to do anything they can to get their sites ranked better.

At least he suggests trying to use both layman’s terms and the more scientific stuff. “It varies,” he says. “If you’re talking only to industry professionals – terminators who are talking about the scientific names of bugs, and your audience is only bugs – terminator – you know, exterminator experts, sure, then that might make sense, but in general, I would try to make things as natural sounding as possible – even to the degree that when I’m writing a blog post, I’ll sometimes read it out loud to try to catch what the snags are where things are gonna be unclear. Anything you do like that, you’ll end up with more polished writing, and that’s more likely to stand the test of time than something that’s just a few, scientific mumbo jumbo stuff that you just spit out really quickly.”

I’m not sure where the spitting stuff out really quickly thing comes into play here. The “scientific mumbo jumbo” (otherwise known as facts and the actual terminology of things) tends to appear in lengthy, texty content, like Moulton suggested, no? Google, of course, is trying to get better at natural language with updates like Hummingbird and various other acquisitions and tweaks. It should only help if you craft your content around that.

“It’s not going to make that much of a difference as far as ranking,” Cutts says. “I would think about the words that a user is going to type, which is typically going to be the layman’s terms – the regular words rather than the super scientific stuff – but you can find ways to include both of them, but I would try to err on the side of clarity if you can.”

So yeah, dumb it down. But not too much. Just enough. But also include the smart stuff. Just don’t make it too smart.

What do you think? Should Google dumb down search results to give users things that are easier to digest, or should it be the searcher’s responsibility to do further research if they don’t understand the accurate and well-researched information they’re consuming? Either way, isn’t this kind of a mixed message compared to the guidance Google has always given regarding “quality” content? Share your thoughts.

For the record, I have nothing against auntcookie. I know nothing about auntcookie, but that’s kind of the point.
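As an aside, “write for 6th graders” is a measurable target. The standard Flesch-Kincaid formula estimates the U.S. school grade needed to follow a passage; here’s a small self-contained sketch (the syllable counter is a rough heuristic, and none of this reflects how Google actually scores readability):

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59"""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (syllables / n_words) - 15.59

plain = "Take ibuprofen. It can ease the pain of a pinched nerve."
jargon = ("Nonsteroidal anti-inflammatory drugs attenuate nociceptive "
          "signaling associated with radiculopathy.")
print(flesch_kincaid_grade(plain))   # roughly grade 5: easy to read
print(flesch_kincaid_grade(jargon))  # far higher: specialist reading level
```

Run on the two hypothetical answers above, the jargon-heavy version scores well beyond any school grade, which is exactly the gap Moulton describes between the Mayo Clinic page and auntcookie’s answer.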

Aug 8 2013

Can Google Really Keep Competitors From Harming Your Business?

Some webmasters aren’t convinced by Google’s “solution” to negative SEO. Wasn’t Google’s Disavow Links tool supposed to be a major help in preventing negative SEO – competitors (or other enemies) associating your otherwise legitimate site with “bad neighborhoods” by way of links? Do you think Google’s tool does its job the way it should? Is it the answer to this problem? What more should Google be doing to help webmasters? Let us know what you think in the comments.

Perhaps Disavow Links has helped combat negative SEO for some, but it hasn’t stopped the issue from coming up repeatedly since the tool was launched. Google has a new Webmaster Help video out about the topic. Matt Cutts responds to the user-submitted question: “Recently I found two porn websites linking to my site. I disavow[ed] those links and wrote to admins asking them to remove those links but… what can I do if someone (my competition) is trying to harm me with bad backlinks?”

Notice that Google rephrased the question for the video title: “Should I be worried if a couple of sites that I don’t want to be associated with are linking to me?”

Cutts says, “So, you’ve done exactly the right thing. You got in touch with the site owners, and you said, ‘Look, please don’t link to me. I don’t want to have anything to do with your site.’ And then if those folks aren’t receptive, just go ahead and disavow those links. As long as you’ve taken those steps, you should be in good shape. But if there’s any site that you don’t want to be associated with that’s linking to you, and you want to say, ‘Hey, I got nothing to do with this site,’ you can just do a disavow, and you can even do it at a domain level.”

“At that point, you should be in good shape, and I wouldn’t worry about it after that,” Cutts concludes.

So, this has basically been Google’s advice since the Disavow tool launched, but is it really the answer? Based on the submitted question, it sounds like the webmaster did what he was supposed to do (as Cutts acknowledges). So why submit the question if the issue was resolved? Is it just a matter of time? Is the webmaster overlooking other variables? Is the solution Cutts prescribes really not the solution? Is there even a truly effective solution?

Some webmasters in the comments on YouTube aren’t convinced by Cutts’ response. “What a crock Matt,” writes user jeffostroff. “What about the scammers who have 5000 links pointing to our site from sites in China or Russia, where no one responds, not even the web hosts. Disavow has not worked. When are you going to offer ability to disavow whole countries. I’m sure many Americans don’t want any links coming from other countries if their site is targeted only to Americans.” That comment has the most YouTube likes of the bunch so far (17).

“I don’t think simply disavowing links is necessarily the solution Matt,” Chris Ainsworth comments. “Agreed it will help to disassociate a website from any rogue/malicious links but it doesn’t solve the on-going issue of competitor link spam tactics. In many cases, especially with larger brands, managing link activity can be a time intensive process. Should it be the responsibility of the business to manage their link profile or should Google have the ability to better identify malicious activity?” That one got 15 likes.

Google has been talking about the effects of the Disavow tool on negative SEO from the beginning. In the initial blog post announcing the tool, Google included an FAQ section, and one of the questions was: “Can this tool be used if I’m worried about negative SEO?” The official response from Google was:

The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building. If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.

In general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.

So really, it does sound like Google aims to shoulder the responsibility for negative SEO itself, rather than leaving webmasters to rely on the tool to battle it. Google wants to do that battling algorithmically, but is it doing a good enough job? Comments like the ones above, and countless others in various threads around the SEO industry, would suggest that it is not.

Google is probably right in that “the vast majority of webmasters don’t need to worry about negative SEO,” but what about the minority? How big is the minority? That, we don’t know, but as often as the issue comes up in discussion, it seems big enough.

Even if Google isn’t doing a good enough job combatting the issue, that doesn’t mean it’s not trying. Google makes algorithm changes on a daily basis, and many of them are certainly aimed at spam-related issues. Perhaps it will get better. Perhaps it has already gotten better to some extent. The concerns are still out there, however. Real people appear to still be dealing with negative SEO. Either that, or they’re just diagnosing their problems wrong.

What do you think? How common is negative SEO really? What would you like to see Google do to address the issue? Share your thoughts.
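For anyone following Cutts’ advice, the disavow file itself is just a plain text upload: one entry per line, # for comments, and the domain: prefix for the domain-level option he mentions. The domains below are placeholders, not real sites:

```text
# Links from these two sites could not be removed despite outreach.
domain:spammy-site-1.example
domain:spammy-site-2.example
# Individual URLs can also be listed, one per line.
http://another-site.example/paid-links-page.html
```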

Oct 4 2012

Google Continues To Tinker With Freshness In Recent Algorithm Adjustments

Is Google getting close to where it wants to be in terms of how it handles freshness of content in search results? This has been one major area of focus for Google for the past year or so. Last November, Google launched the Freshness update, and since then, it has periodically been making various tweaks to how it handles different things related to freshness. Google has been releasing regular lists of the algorithm changes it makes from month to month all year, and some of these lists have been quite heavy on the freshness factor.

On Thursday, Google released its lists for changes made in August and September. Somewhat surprisingly, “freshness” is only mentioned twice. Two changes were made (at least changes that Google is disclosing) under the “Freshness” project banner. We actually already discussed one of them in another article, as it is also related to how Google deals with domains (which Google seems to be focusing on more these days). That would be this list entry:

#83761. [project “Freshness”] This change helped you find the latest content from a given site when two or more documents from the same domain are relevant for a given search query.

That change was made in September. The other one was made in August:

Imadex. [project “Freshness”] This change updated handling of stale content and applies a more granular function based on document age.

Quite frankly, there’s not much you can do with that information other than consider the freshness of your content in cases where freshness is relevant to quality. This is actually a topic Google’s Matt Cutts discussed in a Webmaster Help video released this week. “If you’re not in an area about news – you’re not in sort of a niche or topic area that really deserves a lot of fresh stuff, then that’s probably not something you need to worry about at all,” he said in the video.

I’ve been a fairly vocal critic of how Google has handled freshness, as I’ve found the signal to get in the way of the information I’m actually seeking far too often. Plenty of readers have agreed, but this is clearly an area where Google is still tinkering.

Do you think Google is getting better at how it handles freshness? Feel free to share your thoughts in the comments.
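Google doesn’t say what its “more granular function based on document age” looks like, but freshness scoring is commonly modeled as a smooth decay over age rather than a hard cutoff. Purely as an illustration of the idea – this is a sketch of the general technique, not Google’s actual function – a query-dependent exponential decay might look like this:

```python
def freshness_score(age_days: float, half_life_days: float) -> float:
    """Decay a document's freshness smoothly with age.

    half_life_days controls how query-sensitive the decay is:
    short for news-y queries, long for evergreen topics.
    """
    return 0.5 ** (age_days / half_life_days)

# A breaking-news query decays fast; an evergreen query barely decays.
for age in (1, 7, 30, 365):
    print(f"age={age:>3}d  news={freshness_score(age, 2):.3f}  "
          f"evergreen={freshness_score(age, 365):.3f}")
```

A “more granular” version would presumably vary the curve per document type and query, rather than applying one decay to everything – which fits Cutts’ advice that sites outside news-y niches shouldn’t worry about it.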

Aug 5 2009

What Are Your Thoughts Regarding A “Work At Home” Job?

I am interested in making additional money through part-time work. I have been applying online with various employers in the area. A possibility is one of the many “work at home” jobs that are offered – in particular, using my computer. I keep seeing opportunities to make money taking surveys. What are your thoughts on […]

Jun 19 2009

Tool Time Friday | Xmarks!

It takes only a moment to get up and running with Xmarks. After you install the add-on, click on the notification to set up Xmarks and start backing up and synchronizing your bookmarks. Skip this step if you only wish to use the discovery features. Secure Password Sync is an optional Xmarks feature; if you’d like to learn more about how it works, visit this link. To learn more about Xmarks, visit the feature pages, and get the latest news at the Xmarks blog. Xmarks was formerly known as Foxmarks.

Jun 19 2009

Tool Time Friday | Any Color you want

This will be short and sweet. AnyColor will change Firefox to any color you want, letting you create your own personalized theme. That’s it!

Jun 12 2009

Tool Time Friday | Smart

As in Smart Bookmarks. Not enough room on your bookmarks bar? The Smart Bookmarks Bar extension comes to the rescue by hiding bookmark names and showing only icons. Bookmark names are displayed on mouseover. Here’s a comment from the site: “Great idea to augment an already great part of Firefox. There was never enough space for all the pages I visit every day… but now there is! Top marks.”