Mar 27 2013

Will Google Offer A ‘Brain Interface’ Within The Next Ten Years?

Google put out a new Webmaster Help video today, though this one doesn’t really do much to help webmasters. It’s simply Matt Cutts responding to the question: Where do you see Google search 10 years down the road?

An interesting topic, for sure. As Cutts notes, that’s a long time; in Internet years, it’s a really, really long time. I can’t imagine how even Cutts could know what Google will be like that far into the future. It sounds, however, like a lot of current Google projects, like Google Glass and Google Now, are involved.

One interesting (if not scary) concept Cutts mentions is a brain interface. “In theory there could be a brain interface so you could be having a dialogue where some of it is audible and some of it is not,” he contemplates.

I think that hovers somewhere around that “creepy line” Microsoft likes to keep talking about (and even illustrating). Former Google CEO (and current Executive Chairman) Eric Schmidt said a few years ago that brain implants would cross the creepy line. The part about the creepy line was used as a sound bite in Microsoft’s “Scroogled” campaign about Gmail, even though Schmidt was talking about brain implants. I smell a Scroogled resurrection.

Mar 26 2013

Google’s Cutts Talks About Blocking JavaScript

For the second day in a row, Matt Cutts answers a question from a fellow Googler in a Webmaster Help video. This one comes from Webmaster Trends analyst John Mueller: “Googlebot keeps crawling my JavaScript and thinking that text in my scripts refers to URLs. Can I just disallow crawling of my JavaScript files to fix that?”

Long story short, he wouldn’t advise it. If there’s one individual JavaScript file that’s the source of the problem, you could disallow that, he says. Also, don’t block CSS.

He says, “It turns out, as we’re executing JavaScript, we do look at the attributes, so you can actually use JavaScript, and put like a nofollow attribute on individual URLs, and so it is the sort of thing where you can do link-level granularity there. And you can block, for example, an individual JavaScript file, but in general, I would not recommend blocking all of your JavaScript files.”

Cutts has talked in the past about blocking Google from crawling JavaScript, and how it can hurt your rankings. Watch that video here.
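To make that advice concrete, here’s a minimal sketch of both options Cutts describes; the file paths and URLs are hypothetical. First, a robots.txt rule that disallows only the one problematic script rather than all JavaScript:

```
# Hypothetical robots.txt: block only the script causing bogus URL
# discovery, leaving the rest of your JavaScript (and CSS) crawlable.
User-agent: Googlebot
Disallow: /js/problem-script.js
```

And since Google looks at attributes while executing JavaScript, a script-inserted link can carry nofollow for the link-level granularity he mentions:

```javascript
// Hypothetical sketch: a link added via JavaScript with rel="nofollow".
const link = document.createElement("a");
link.href = "/partner-offer";   // placeholder URL
link.rel = "nofollow";          // attribute Google reads when executing JS
link.textContent = "Partner offer";
document.body.appendChild(link);
```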

Mar 26 2013

Google Announces Opt-Out Tool To Keep Content Out Of Its Specialized Search Engines

Google has launched a new way for sites to opt out of having their content show up in Google Shopping, Advisor, Flights, Hotels, and Google+ Local search. Matt Cutts announced the feature in a very brief post on the Google Webmaster Central blog, saying, “Webmasters can now choose this option through our Webmaster Tools, and crawled content currently being displayed on Shopping, Advisor, Flights, Hotels, or Google+ Local search pages will be removed within 30 days.”

This is obviously not a feature Google would want a ton of people to use, because the less content that appears in these services, the less useful they are. Perhaps that’s why Cutts hasn’t tweeted about the tool (maybe not, but perhaps). At least with the short announcement, they have something they can point to.

The feature is a direct response to an investigation by the Federal Trade Commission. When Google settled with the FTC, one of the voluntary concessions Google made was a feature that would let sites opt out of Google’s specialized search engines.

As Danny Sullivan notes, the feature doesn’t let you choose which search engines you wish to opt out of. If you use it, you’re opting out of all of those mentioned. On a help page, Google says, “This opt-out option currently applies only to services hosted on google.com and won’t apply to other Google domains.”

Mar 25 2013

Getting Google Rankings Back After Site Downtime

The latest Google Webmaster Help video deals with getting your site’s rankings back after experiencing some downtime. Matt Cutts often fields questions from fellow Googlers in these videos, and this time the question comes from Pierre Far, a Webmaster Trends analyst at Google UK: “My website was down for a couple of days and now has lost all of its Google rankings. What can I do to get them back?”

Basically, Cutts’ answer is to put the site back up, make sure it’s reliable, and make sure the pages that were there before are still there.

“There’s a tension at Google where if a page goes away (maybe there’s a 404), we don’t know if it’s really gone away or whether that page will be back,” he says. “Sometimes there’s a server time out, you know, the server is kind of slow. And so, on one hand, you don’t want to keep showing a page that would be a bad user experience, like it’s really gone away. On the other hand, it’s very common for websites to go down for an hour, two hours, a day, two days…and so you also want to give the benefit of the doubt, so you can revisit those pages and see whether they’ve gone up.”

“So we do have different varying time periods where we basically allow, if a domain looks like it’s gone away, but it comes back – you know, it’s back online, then we just sort of say, ‘Okay, it was a transient error,’ so the short and simple advice is just make sure you put the website back up the way it was, and hopefully things should recover relatively quickly,” says Cutts.

This may not be quite the answer you were looking for, but it’s the one Google is giving. It would certainly be interesting to know more about those “varying time periods”.
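Since Cutts’ advice boils down to restoring the pages that were there before, a quick sanity check after an outage is to confirm those URLs respond with 200 again. A minimal sketch (assumes Node 18+ for the built-in fetch; the URLs are placeholders):

```javascript
// Hypothetical post-downtime check: confirm key pages are back and
// returning 200, so Google can treat the outage as a transient error.
const pages = [
  "https://example.com/",
  "https://example.com/products",
  "https://example.com/contact",
];

async function checkPages(urls) {
  for (const url of urls) {
    try {
      const res = await fetch(url, { method: "HEAD" });
      console.log(`${res.status} ${url}`);
    } catch (err) {
      console.log(`unreachable ${url}: ${err.message}`);
    }
  }
}

checkPages(pages);
```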

Mar 20 2013

Google: We Still Need Text To Index Your Content

Google’s latest Webmaster Help video discusses Google’s need for text when indexing content. Matt Cutts responds to a question about how important text is in getting Google to understand a site. The person asking has a site that is mostly made up of images, and says that users like it better, bounce rate has declined, and conversions are up.

“Google does still want text,” he says. “So there’s a couple options. One is: if you have an image that you’ve made of some text that’s really beautiful, you can include some textual content there. You can sort of say, ‘alt,’ you know, or the title – that sort of thing. So you can say, for an image, here’s some text to go along with that, and that can help.”

He goes on to say that one reason a site might be having more user interaction, time on site, conversions, etc., is because it’s prettier. “And we see that,” he says. “Better design can help people enjoy and use your site more.” He also suggests considering Google Web Fonts.
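In markup terms, the alt/title suggestion Cutts makes looks something like this (the image file and wording are hypothetical):

```html
<!-- Hypothetical example: the image carries the visual design, while
     alt and title give Google text to go along with it. -->
<img src="spring-sale-banner.png"
     alt="Spring sale: 20% off all hiking gear through April"
     title="Spring sale – 20% off hiking gear">
```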

Mar 19 2013

Google: If We Mistakenly Penalize You For Paid Links, There Would Be A ‘Ton Of Collateral Damage’

There has been a lot of talk about Google and paid links in the news lately, so it’s only fitting that they’re the topic of the latest Webmaster Help video from the company. In this one, Matt Cutts responds to this question: “On our travel site, we recommend and link out to hotels and B&Bs in our niche. Our readers find it useful. They’re not paid links, so we don’t add the nofollow attribute. What stops Google from suspecting these are paid links and penalizing us?”

“The short answer is: if you’re linking to high quality sites, and you editorially think that they’re good sites, that’s how most of the web works,” says Cutts. “We get into this tiny little area of search and SEO, and we’re convinced all links are nofollowed, and if a link looks good, it must be paid or something like that, and the fact is that for the most part, whenever you’re looking at links, people are linking to stuff that they like. They’re linking to stuff that they enjoy.”

“So, if we mistakenly thought that those were paid links, and as a result, penalized you, there would be a ton of collateral damage,” he says. “There would be a ton of algorithmic damage to our search rankings. So it’s in our enlightened, best self interest, as well as in the interest of our users, to make sure that we don’t accidentally classify links as paid and penalize the site. And normally, even if we would classify links as paid, we might not trust the links from your site, but we wouldn’t have things where your site would necessarily stop ranking as well. It can happen if somebody is selling a lot of links, they’ve been selling them for a long time, and those sorts of things, so we do take strong action in some situations, but a lot of the times if we think that a link might be sold, or if we have very good reason to suspect, we might just not trust that site’s links nearly as much, or maybe zero.”

Concluding the video, Cutts reiterates that it’s in the company’s best interest to get paid-link classification right.
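For reference, the distinction at issue is just an attribute on the link. A hypothetical pair of examples (the URLs are placeholders):

```html
<!-- An editorial recommendation can be a normal, followed link: -->
<a href="https://example-bnb.com">Our favorite B&B in the valley</a>

<!-- A link that actually was paid for should carry nofollow: -->
<a href="https://example-sponsor.com" rel="nofollow">Sponsored listing</a>
```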

Mar 18 2013

Matt Cutts On Google’s Handling Of Single-Page Sites

Google has released its latest Webmaster Help video. This time, Matt Cutts discusses single-page sites and how Google handles them. Specifically, he responds to the following user-submitted question: “What does Google think of single-page websites? They are becoming more complex and there are some great websites using only a single page (+lots of CSS and JavaScript) bringing the same user experience as a regular website with many subpages.”

“Google has gotten better at handling JavaScript, and a lot of the time, if you’re doing some strange or unusual JavaScript interaction, or pinning some part of the page, or something like that, or having things fold in or fold out, we’re pretty good at being able to process that,” says Cutts. “In general, I would run a test first. I wouldn’t bet your SEO legacy on this one single page working well if the JavaScript or CSS is really obscure, or maybe you’ve forgotten and blocked that out in robots.txt. But if you run the test, and you’re pretty happy with it, I don’t necessarily see a problem with that.”

“It’s a different convention,” he continues. “Sometimes it works. Maybe you get better conversions, maybe you don’t. It’s going to depend on what your particular area is, what the topic is, what kind of layout you come out with…but if it works for you, and for users to have that all on one page, for the most part, it should work for Google as well.”

On a semi-related topic, here’s what Cutts had to say about a year ago about blocking Google from JavaScript and CSS. Here, he talks about Google getting better at handling JavaScript.
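Cutts’ warning about having “forgotten and blocked that out in robots.txt” is easy to check. A hypothetical sketch of the mistake and the safer alternative (the paths are placeholders):

```
# Risky for a single-page site: if Googlebot can't fetch the JS/CSS
# that builds the page, it may see a nearly empty document.
#   Disallow: /js/
#   Disallow: /css/
#
# Safer: leave rendering assets crawlable and block only what you must.
User-agent: *
Disallow: /admin/
```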

Mar 12 2013

Matt Cutts: Panda Update Coming Friday, ‘Big’ Penguin Update Later This Year

According to Google webspam head Matt Cutts, we can expect the next Panda refresh to occur within the next few days. Speaking at the SMX conference, Cutts said that the next Panda update will take place this Friday, March 15th, or by Monday, March 18th at the latest.

The last Panda update rolled out on January 22nd, and Google said that it affected 1.2% of queries. Even if a Panda update launches this Friday, it will have been the longest stretch between updates in recent memory. Google previously released a Panda update a few days before Christmas, and two back in November.

Although the Panda refresh is coming sooner, a Penguin update is also on the horizon – and Cutts said that it’ll be a big one, one of the most talked-about updates of the year. They are “working on the next generation of Penguin,” said Cutts.

More algorithm changes were discussed at SXSW last week. There, Cutts announced a possible crackdown on bad online merchants. “We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don’t want low quality experience merchants to be ranking in the search results,” he said.

Check here for more on the future of Panda and Penguin in 2013.