If you are working hard on SEO for your hosting website, you will be interested in a recent change to Google’s algorithm. I came across this Matt Cutts interview from the latest SMX Advanced conference in Seattle.
You can check the complete interview at http://searchengineland.com/live-blog-you-a-with-matt-cutts-at-smx-advanced-123513
Danny Sullivan (DS): What’s the deal with Penguin? Is it a penalty?
Matt Cutts (MC): We look at it as something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam, and Penguin was designed to tackle that. It’s an algorithmic change, but when we use a word like “penalty,” we’re talking about a manual action taken by the web spam team, and this wasn’t that.
We don’t think of it as a penalty. We think of it as, “We have over 200 signals, and this is one of the signals.”
DS: So from now on, does “penalty” mean it’s a human thing?
MC: That’s pretty much how we look at it. In fact, we don’t use the word “penalty” much; we refer to things as a “manual action.” Part of the reason we make that distinction is the question of how transparent we can be. We do monthly updates where we talk about changes, and in the past year, we’ve been more transparent about times when we take manual action. We send out alerts via Google Webmaster Tools.
DS: Did you just do another Penguin update?
MC: No.
(Danny references the WPMU story and Matt says that the site recovered due to the data refreshes and algorithmic tweaks.)
DS: Now we hear a lot of people talking about “negative SEO.”
MC: The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines. People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.
Some have suggested that Google could offer a way to disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about it that we’re looking at enabling that, maybe in a month or two or three.
DS: (asks about different types of links)
MC: We’ve done a good job of ignoring boilerplate, sitewide links. In the last few months, we’ve been trying to make the point that not only is that kind of link buying not doing any good, but we’re also turning the dial up to let people know that certain link spam techniques are a waste of money.
DS: (asks about messaging)
MC: If you roll out a new algorithm, it can affect millions of sites. It’s not practical to notify website owners when you have 500 algo changes every year, but we can notify when there’s been manual action against a specific site.
One thing I’d like to clear up — the news earlier this year about 700,000 warnings. The vast majority of those were because we started sending out messages even for cases of very obvious black hat techniques. So now we’re completely transparent with the warnings we send. Typically your website ranking will drop if you don’t take action after you get one of those warnings.
DS: Anything new related to paid links?
MC: We’re always working on improving our tools. Some of the tools that we built, for example, to spot blog networks, can also be used to spot link buying. People sometimes think they can buy links without leaving a footprint, but you don’t know about the person on the other side. People need to realize that, as we build up new tools, buying links becomes a higher-risk endeavor. We’ve said it for years, but we’re starting to enforce it more.
I believe that if you ask any SEO whether SEO is harder now than it was 5–6 years ago, they’d say it’s a little more challenging. You can expect that to increase. Google is getting more serious about buying and selling links. Penguin showed that some stuff that may work in the short term won’t work in the long term.
DS: Affiliate links. Do people need to run around and nofollow all that?
MC: If it’s a large enough affiliate network, we know about it and recognize it. But yes, I would recommend nofollowing affiliate links. (That’s a paraphrase, not an exact quote – sorry.)
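For anyone wondering what that looks like in practice, nofollowing a link simply means adding the rel="nofollow" attribute to the anchor tag. Here is a minimal sketch, using a made-up affiliate URL as a placeholder:

<a href="http://www.example.com/hosting-plan?aff=1234" rel="nofollow">Web hosting plan</a>

The link still works normally for visitors, but the attribute tells Google not to pass ranking credit through it.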
DS: Do links still work, or are social signals gonna replace them?
MC: Douglas Adams wrote “Space is big. You have no idea how big space is.” The web is like that. Library of Congress, the biggest library in the world, has 235 terabytes of data. That’s not very big compared to the way the web grows.
The actual percentage of nofollow links on the web is a single-digit percentage, and it’s a pretty small percentage. To say that links are a dead signal is wrong. I wouldn’t write the epitaph for links just yet.
DS: You do these 30-day challenges, like “I’m gonna use Bing for 30 days.”
MC: I have not done that one, and I’m afraid to try! (huge laughter from audience – Matt then says he’s joking and compliments Bing team)
Danny challenges Matt and Google to do something to see the web through an SEO’s eyes, and says that SEOs should try to see things from Matt’s perspective, too.
DS: What’s up with your war on SEOs? (laughter) Or is it a war on spam?
MC: It’s a war on spam. If you go on the black hat forums, there are a lot of people asking, “How do I fake sincerity? How do I fake being awesome?” Why not just be sincere and be awesome? We’re trying to stop spam so people can compete on a level playing field. I think our philosophy has been relatively consistent.
DS: What about tweets earlier today about using bounce rate? You don’t look at how quickly someone bounces from a search result and back to Google?
MC: Webspam doesn’t use Google Analytics. I asked again before this conference and was told, No, Google does not use analytics in its rankings.
And now we’re going to audience questions.
DS: What percent of organic queries are now secure?
MC: The launch was a little backwards, because we didn’t want to talk about being able to search over different corpora. It was a single-digit percentage of traffic in the US, and then we rolled it out internationally.
I think it’s still a minority of the traffic now, but there are things like Firefox adding SSL search in the browser. There are a lot of things aimed at helping users with privacy. I recognize that’s not good for marketers, but we have to put users first. We feel like moving toward SSL, moving toward encrypted search, is the right long-term plan.
DS: (reading audience question) How come WordPress didn’t get penalized with all the blogs that have WordPress links in their footer?
MC: If you look at the volume of those links, most of them are from quality sites. WPMU had a pretty good number of links from lower quality sites.
DS: How come keyword referral data isn’t being blocked for AdWords clicks?
MC: If we did that, every advertiser would do an exact match for every phrase, and then the ad database would grow exponentially. (He adds that he wishes Google had reconsidered that decision, though.)