Google’s Matt Cutts discusses stock photos as a ranking signal in today’s Webmaster Help video. Specifically, he responds to the following user-submitted question:

Does using stock photos on your pages have a negative effect on rankings? Do original photos help you in this regard?

“‘Does using stock photos on your pages have a negative effect on rankings?’ To the best of my knowledge, the answer is no,” says Cutts. “‘Do original photos help you?’ To the best of my knowledge, it doesn’t really make a difference whether it’s a stock photo versus an original photo.”

But he doesn’t leave it at that. “But you know what?” he adds. “That’s a great suggestion for a future signal we could look at in terms of search quality. Who knows? Maybe original image sites might be higher quality, where sites that just repeat the same stock photos over and over again might not be nearly as high quality.”

Interesting. “But to the best of my knowledge,” he reiterates, “we don’t use that directly in our algorithmic web ranking right now.”

Well, even if Google is not using this as a signal currently, it’s hard to imagine why Cutts would make comments like these if he weren’t serious about this actually being something Google could add in the future. They are, as you know, making changes to the algorithm every day. Here, he’s pretty much saying that original images are a signal of quality, so that’s worth paying attention to.
Read the original here:
Google: Stock Photos Don’t Hurt Your Rankings…Yet
It’s been a pretty big week for search and SEO news. There have been a lot of announcements, not only from Google, but from Google’s competitors. Let’s recap, and discuss in the comments. Which of the latest announcements do you believe will have the biggest impact on webmasters? On your SEO strategy? Let us know what you think.

On Monday, Apple had its big Worldwide Developers Conference keynote, where it unveiled the latest versions of its Mac OS X and iOS operating systems. Within these unveilings were a few pieces of noteworthy search news. For one, it’s adding more search options to Safari, which is significant given that it has made moves in recent memory to distance itself further from Google. The big piece of news here, however, was the addition of Bing (Google’s biggest search competitor) to Siri as the web search provider. We discussed the implications of this in more depth here, but suffice it to say, this could lead to a lot more people accessing your content from Bing if you’re ranking there. In other words, you now have more of a reason to optimize for Bing.

Also on Monday, Google released a video discussing mistakes webmasters commonly make when using the Disavow Links tool. The most common mistake is that people are uploading the wrong kinds of files.

Yelp, a frequent critic of Google’s (which generates its own share of criticism), is making moves to become a better local search tool. See its newly revamped “Nearby” mobile feature. Local businesses now have even more incentive to be found in Yelp.

Speaking of Yelp, Greg Sterling at Screenwerk shares an anecdote in which a plumber claimed that 95% of his leads come from the service. This caught the attention of CEO Jeremy Stoppelman:

Plumber: 95% of My Leads Come from Yelp http://t.co/69Vmg2Euri via @gsterling — Jeremy Stoppelman (@jeremys) June 13, 2013

Clearly some are finding Yelp well worth it, despite those decrying the service.
Google made a major announcement in that it is readying ranking changes for mobile content. Basically, if you’re not providing smartphone users with the relevant content you’re providing them on the desktop, you’re going to be in trouble.

“Some websites use separate URLs to serve desktop and smartphone users,” explain Google’s Yoshikiyo Kato and Pierre Far. “A faulty redirect is when a desktop page redirects smartphone users to an irrelevant page on the smartphone-optimized website. A typical example is when all pages on the desktop site redirect smartphone users to the homepage of the smartphone-optimized site.”

“This kind of redirect disrupts a user’s workflow and may lead them to stop using the site and go elsewhere,” they add. “Even if the user doesn’t abandon the site, irrelevant redirects add more work for them to handle, which is particularly troublesome when they’re on slow mobile networks. These faulty redirects frustrate users whether they’re looking for a webpage, video, or something else, and our ranking changes will affect many types of searches.” More on all of this here.

In addition to that, Google’s Matt Cutts hinted at SMX Advanced that mobile site speed could soon become a ranking factor. Google made site speed a signal several years ago, and it looks like they’ll be taking that a step further with mobile in mind.

Cutts revealed quite a few things at SMX Advanced, actually. Here’s the whole discussion he had with interviewer Danny Sullivan:

One thing he mentioned at the conference was that Google started rolling out a new ranking update to clean up more spammy queries. It’s been unofficially referred to as the “payday loans” update. Google had previously warned about forthcoming efforts in this area, and these efforts are now taking effect.

In other algorithm update news, Cutts also indicated that Google hasn’t rolled out a Panda data refresh for a month and a half.
Panda is apparently being run about once a month, rolling out slowly over the course of roughly ten days.

He mentioned a new structured data tool Google is beta testing, which allows webmasters to report structured data errors. Giving webmasters as much control as possible over structured data is going to be increasingly important, as Google is turning to this kind of data more and more for its search results. Optimizing structured data could be considered a vital part of your SEO strategy these days, for better or worse. At least Google is providing more and more tools in this area.

Finally, Cutts announced that Google is now including example links in its messages to webmasters regarding manual penalties. Those who have to deal with these penalties should find the addition very welcome. Cutts put out a video discussing this:

Facebook, as I’m sure you’ve heard, has launched hashtags, which pretty much turn the giant social network into a real-time search engine, for all intents and purposes. That has some pretty big marketing implications. The hashtags, by the way, can be searched via Facebook’s Graph Search. On a separate note, Facebook is killing its sponsored search results.

So those are some of the biggest stories in a very busy week for search. The mere fact that all of this just happened over the past week really illustrates how rapidly the search game is evolving, and that doesn’t even take into account that Google makes changes to its algorithm every day. Out of all that was announced this week, which item are you most concerned about? Which are you most excited about? Let us know in the comments.
See original here:
Wow, A Lot Of Stuff Just Happened In SEO
Google announced that it is now including examples of problems in its messaging to webmasters who have been hit with manual webspam penalties. Google’s Matt Cutts actually mentioned it in a Q&A session at SMX Advanced on Tuesday night, but now he has put out a Webmaster Help video discussing it further.

“I’m very happy to say that just recently, we’ve rolled out the ability to, when we send a message, include more examples,” he says. “We’re not going to be able to show you every single thing that we think is wrong, for a couple reasons. Number one, it might help the spammers, and number two, if there’s a lot of bad pages, we’d be sending out emails that are, you know, like fifty megabytes long. But we do think it’s helpful if we can include a small number of example URLs that will help you, as a webmaster, know where to look whenever you’re trying to fix things and clean the site back up.”

He adds, “It’s much better than it was even just a few months ago, and we’ll keep looking for ways to provide even more guidance and a little more transparency so that webmasters get [an] even better idea of where to look, but we’re just really happy that now we have the ability, when we send messages, to give you a few concrete examples.”

Cutts notes that it’s going to take some time to roll out, test, ramp up, etc. There might still be some cases where people aren’t getting examples.
See the original post:
Google Updates Messaging For Manual Webspam Actions
Google has clearly had it with sites that have lackluster mobile experiences. This week, the company took to its Webmaster Central blog to discuss “several ranking changes” it’s preparing for sites not configured for smartphone users.

But that’s not all. Google’s Matt Cutts spoke at SMX Advanced on Tuesday evening, and implied that Google might roll out a version of its site speed ranking factor for mobile sites. Google officially revealed that site speed was a ranking factor for search over three years ago, after placing a great deal of emphasis on speed for quite some time before that. Fast forward to 2013, and mobile has grown a lot. Google is making it so you have no excuse to treat your mobile content with less regard than your desktop content.

Frankly, sites should be optimizing for mobile anyway, simply for the benefit of their users, but if ignoring the mobile experience is going to cost sites search rankings, perhaps this will light a fire under their butts to do something about poor mobile site performance.

Here’s the relevant section of Search Engine Land’s liveblogged account of what Cutts said about mobile page speed: “At Google I/O, there was a session on instant mobile websites – there were page speed recommendations. We’ve said that before about desktop sites; we might start doing the same thing about mobile websites.”

By the way, Cutts said at the event that the smartphone-related changes discussed in the blog post have been approved, but that he’s not sure when they’ll roll out.
See the original post:
Mobile Site Speed To Be A Google Ranking Factor?
Google’s Matt Cutts revealed in a Q&A at SMX Advanced on Tuesday night that Google is rolling out a test of a “structured data dashboard,” according to Search Engine Land. Barry Schwartz writes that he announced “a new beta application is testing within Google Webmaster Tools named the Structured Data Dashboard”. Google actually announced the launch of the Structured Data Dashboard in Webmaster Tools last July. Our coverage of that is here.

Search Engine Land provides a link to an application for those who want to test the new tool. It appears that the test is just for “structured data error reporting”. Presumably, this is part of the dashboard announced last year.

A couple of weeks ago, Google launched some new tools for webmasters to provide it with structured data from their sites. They added support for new types of data with the Data Highlighter and launched the Structured Data Markup Helper.
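For context, the data these tools deal in is schema.org markup embedded in a page’s HTML, which is what the dashboard’s error reporting would be checking. A minimal, hypothetical microdata snippet of the kind the Structured Data Markup Helper can produce (all property values here are placeholders, not from any real page) might look like this:

```html
<!-- Hypothetical schema.org Article markup; headline, author, and date
     are placeholder values for illustration only. -->
<div itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Example Headline</h1>
  <span itemprop="author">Jane Doe</span>
  <time itemprop="datePublished" datetime="2013-06-14">June 14, 2013</time>
</div>
```

If Google’s crawler finds an `itemprop` it can’t parse, or a required property missing, that is presumably the sort of error the new reporting tool would surface.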
The rest is here:
Google Tests Structured Data Error Reporting In Webmaster Tools
Back in March, Google launched a Panda refresh. This is something they’ve done numerous times since first launching the update back in early 2011. There was something special about this particular refresh, however, because it marked the beginning of a new era of Panda in which Google will keep the update going regularly, without announcing all the refreshes.

“Rather than having some huge change that happens on a given day, you are more likely in the future to see Panda deployed gradually as we rebuild the index. So you are less likely to see these large-scale sorts of changes,” Google’s Matt Cutts was quoted as saying.

Matt Cutts appeared in a discussion at SMX Advanced Tuesday evening in which he spoke a bit about Panda, among many other things. Interviewer Danny Sullivan asked Cutts how many Panda updates there have been since Google stopped confirming them. His response was that they had one about a month and a half ago, but hadn’t updated it since then because they’re looking at pulling in a new signal that might help some people out of the gray zone.

This brings to mind recent words from Cutts in an industry-famous video in which he discussed numerous upcoming changes. In that, Cutts talked about Google changing its update strategy for Panda: “We’ve also been looking at Panda, and seeing if we can find some additional signals (and we think we’ve got some) to help refine things for the sites that are kind of in the border zone – in the gray area a little bit. And so if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, then that will help sites that have previously been affected (to some degree) by Panda.”

Panda will apparently be updated about once a month, and roll out slowly throughout the month. “What happens is Google will run the update on a particular day, let’s say on the 4th of the month,” explains Barry Schwartz from SMX sister site Search Engine Land.
“Then Google will slowly push out that impact over 10 days or so through the month. Google will typically repeat this cycle monthly.”

Hat tip to Matt McGee for liveblogging the discussion.
Read the original:
Google: We Haven’t Updated Panda For A Month And A Half
Matt Cutts participated in a Q&A session with Danny Sullivan at the SMX Advanced conference. SMX has now made the video available to all via its YouTube channel, so if you couldn’t make the conference, here you go:

Cutts discusses a variety of things during the session, but he also announced that Google is now rolling out a new algorithm update focusing on spammy queries like “payday loans”. More on that here.
Here’s The Matt Cutts Discussion From SMX Advanced
Google’s Matt Cutts announced that Google has “started” a new ranking update to help clean up some spammy queries. It’s one of the changes that Cutts warned us about in that big video a while back.

We just started a new ranking update today for some spammy queries. See 2:30-3:10 of this video: goo.gl/ufCiH #smx — Matt Cutts (@mattcutts) June 11, 2013

In the video, Cutts talked about working harder on types of queries that tend to draw a lot of spam. “We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’ on Google.co.uk,” he said. “So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including, for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”

Cutts discussed the update a little at SMX Advanced. Barry Schwartz from the conference’s sister site, Search Engine Land, reports: “Matt Cutts explained this goes after unique link schemes, many of which are illegal. He also added this is a world-wide update and is not just being rolled out in the U.S. but being rolled out globally. This update impacted roughly 0.3% of the U.S. queries but Matt said it went as high as 4% for Turkish queries, where web spam is typically higher.”

They’re calling it the “payday loan algorithm,” by the way (not sure if that’s official). In another tweet, Cutts said that the update will be rolling out over the course of the next one to two months.

@rjbeech12 @atwheeler @patrickaltoft it’s a multifaceted rollout that will be happening over the next 1-2 months.
— Matt Cutts (@mattcutts) June 12, 2013 Image: Hersheys.com
Originally posted here:
Google Goes After ‘Payday Loans’ And Other Spam With New Algorithm Update
Google’s Matt Cutts recently talked about Google’s Disavow Links tool in the comments of a blog post, in which he suggested using it more like a machete than like a fine-toothed comb. Today, Google released a new Webmaster Help video discussing the mistakes that people most often make when using the tool.

Cutts says, “The file that you upload is just supposed to be a regular text file, so expect either a comment on its own line, a URL on its own line, or a domain that starts with ‘domain:’. Anything else is weird syntax, and in theory, could cause the parser to reject the file. What we see is people sometimes uploading Word files, so .doc, Excel spreadsheets, you know, and that’s the sort of thing that our parser is not built to handle. It’s expecting just a text file. So if you upload something really strange, that can cause the parser to throw that file out, and then the reconsideration request would not go through.”

Once again, Cutts advises machete-like use of the tool. He says, “The other thing that we see is, a lot of the times, the first attempt at a reconsideration request, you see people really trying to take a scalpel, and pick out really individual bad links in a very granular way, and for better or worse, sometimes when you’ve got a really bad link profile, rather than a scalpel, you might be thinking more of a machete sort of thing. You need to go a little bit deeper in terms of getting rid of the really bad links.”

“So, for example, if you’ve got links from some very spammy forum or something like that, rather than trying to identify the individual pages, that might be the opportunity to do a ‘domain:’,” he adds. “So if you’ve got a lot of links that you think are bad from a particular site, just go ahead and do ‘domain:’ and the name of that domain. Don’t maybe try to pick out the individual links because you might be missing a lot more links.”

“The other thing that we see is, the ‘domain:’ needs to have the right syntax,” he says. “So, ‘domain:’ and then a domain name. Don’t do ‘domain:’ and then ‘http’ or ‘www.’ or something like that. An actual domain like ‘example.com’ or ‘mattcutts.com’ is what we’re looking for there.”

It’s a little surprising that Google’s system can’t tell when somebody’s talking about a domain when they use “http” or “www,” but it is what it is. Good to know.

Cutts continues, “A bunch of people, we sometimes see them putting context, or the story, or the documentation for the reconsideration request in the Disavow Links text file that they try to upload. That’s really not the right place for it. The right place to give us the context, or to describe what’s going on, is in the reconsideration request, not in the Disavow Links text file… You probably don’t need a lot of comments. If they’re there, I’d keep ‘em short. I wouldn’t make a lot of multiple lines and all that sort of stuff, because it increases the likelihood that you might make a copy-and-paste error, and then we would not trust that particular file.”

“The other thing that we see is that some people think that Disavow is the be-all and end-all… the panacea that’s going to cure all their ills, and yet, if you’ve been doing some bad SEO and you’re trying to cure it, in an ideal world, you would actually clean up as many links as you can off the actual web,” says Cutts. “That’s just a really helpful way for us to see, when you’re doing a reconsideration request, that you’re putting in the effort to try and make sure things have been corrected and cleaned up, and it’s not going to happen again.”
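Pulling those rules together, here’s a minimal sketch of a check you could run on each line of a disavow file before uploading it. To be clear, this is not Google’s actual parser, just an illustration of the syntax Cutts describes: comments on their own lines, full URLs for individual links, and “domain:” entries naming a bare domain with no “http” or “www.” prefix.

```python
def check_disavow_line(line: str) -> bool:
    """Return True if a single disavow-file line looks well-formed
    per the rules Cutts describes (illustrative, not authoritative)."""
    line = line.strip()
    # Blank lines and comments (on their own lines) are fine.
    if not line or line.startswith("#"):
        return True
    # A "domain:" entry must name a bare domain, e.g. "example.com" --
    # not "domain:http://example.com" and not "domain:www.example.com".
    if line.startswith("domain:"):
        target = line[len("domain:"):]
        if target.startswith(("http://", "https://", "www.")):
            return False
        return "." in target and "/" not in target
    # Anything else should be a full URL of an individual link to disavow.
    return line.startswith(("http://", "https://"))
```

For example, `check_disavow_line("domain:example.com")` passes, while `check_disavow_line("domain:www.example.com")` fails, matching the mistake Cutts calls out.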
Read the original post:
Matt Cutts: Here’s What You’re Doing Wrong With ‘Disavow Links’ Tool
Over the past couple of years, it has become abundantly clear that authorship will continue to play an increasingly important role in how Google determines when and how to rank some types of content in search results. Nothing is changing there, and you can expect Google to continue to look for ways to improve how it uses this signal.

Google’s Matt Cutts put out a new Webmaster Help video today discussing this. Specifically, he responds to the user-submitted question:

Will Google be evaluating the use of rel=”author” moving forward as more sites use the feature on generic, non-article/news pages, such as the home page or an about page?

“My brief answer is yes,” begins Cutts. “I’m pretty excited about the ideas behind rel=’author’. Basically, if you can move from an anonymous web to a web where you have some notion of identity and maybe even reputation of individual authors, then webspam, you kind of get a lot of benefits for free. It’s harder for the spammers to hide over here in some anonymous corner.”

“Now, I continue to support anonymous speech and anonymity, but at the same time, if Danny Sullivan writes something on a forum or something like that, I’d like to know about that, even if the forum itself doesn’t have that much PageRank or something along those lines,” he continues. “It’s definitely the case that it was a lot of fun to see the initial launch of rel=’author’. I think we probably will take another look at what else do we need to do to turn the crank and iterate and improve how we handle rel=’author’. Are there other ways that we can use that signal?”

Cutts concludes, “I do expect us to continue exploring that because if we can move to a richer, more annotated web, where we really know… the philosophy of Google has been moving away from keywords, ‘from strings towards things,’ so we’ve had this Knowledge Graph where we start to learn about the real world entities and the real world relationships between those entities. In the same way, if you know who the real world people are who are actually writing content, that could be really useful as well, and might be able to help you improve search quality. So it’s definitely something that I’m personally interested in, and I think several people in the Search Quality group continue to work on, and I think we’ll continue to look at it, as far as seeing how to use rel=’author’ in ways that can improve the search experience.”

Cutts discussed authorship in a hangout about social search back in the fall. In that, he indicated that authorship could become a weightier signal in the future. In fact, he dubbed it a “long term trend”.

The moral of the story is: if you haven’t started building reputation and credibility yet, you should probably do so. You’ll also want to implement authorship markup.
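For reference, the basic form of that markup is a byline link from the content page to the author’s Google+ profile using the rel=”author” attribute. A minimal, hypothetical example (the profile URL and author name are placeholders, so substitute your own):

```html
<!-- Byline linking this page to the author's Google+ profile.
     Replace the placeholder profile URL with your actual one. -->
<a href="https://plus.google.com/[your-profile-id]" rel="author">by Jane Doe</a>
```

The Google+ profile should also link back to the site in its “Contributor to” section, so the association is confirmed from both ends.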
Go here to read the rest:
Google Will Continue To Improve How It Handles Authorship, Look For Other Ways To Use It