10 Takeaways From Google’s “What To Expect From SEO This Summer” Video
The head of Google’s webspam team, Matt Cutts, released a video yesterday giving SEOs and webmasters a rare look at what to expect from Google in the coming months. The Google God reveals some key changes that will make webmasters’ lives easier, provide better search results, improve the user experience, and really, REALLY scare the bejeebers out of any remaining black hatters out there.
To summarize the video below in two sentences:
“If you are doing high quality content whenever you’re doing SEO…you shouldn’t have to worry about a lot of different changes. If you’ve been hanging out on a lot of black hat forums and trading different types of spamming package tips…then it might be a more eventful summer for you.”
Matt Cutts’ Disclaimer:
“Take this (info) with a grain of salt… As of today, these are the things that look like they have gotten some approval or look pretty promising.”
1. The next generation of Penguin: Penguin 2.0 (technically Penguin 4) – From the looks of it, this is going to be BIG, and quite possibly one of the most impactful algorithm updates seen to date. From what is known, Penguin 2.0 will be released in the next couple of weeks, and Matt says the previous updates will be minor in comparison… I’m both excited and anxious to see how this is going to affect the SERPs. I’m anxious because, as Danny Sullivan pointed out, even if SEOs and webmasters did manage to clean up sites after the initial release of Penguin, is it possible they triggered something else? Is it possible we may see substantial gains in rankings? Only time will tell.
“We’re relatively close to deploying the next generation of Penguin. Internally, we call it “Penguin 2.0″. And again, Penguin is a webspam change that’s dedicated to try to find black hat webspam and try to target and address that. So this one is a little more comprehensive than Penguin 1.0 and we expect it to go a little bit deeper and have a little bit more of an impact than the original version of Penguin.”
2. Advertorials – Eliminating the flow of PageRank through advertorials, and requiring clear disclosure when something is paid.
“We’ve also been looking at advertorials that is sort of native advertising and those sorts of things that violate our quality guidelines. So again, if someone pays for coverage or pays for an ad or something like that, those ads should not flow PageRank. We’ve seen a few sites in the US and around the world that take money and then do link to websites and pass PageRank. So we’ll be looking at some efforts to be a little bit stronger on our enforcement as far as advertorials that violate our quality guidelines. Now there’s nothing wrong inherently with advertorials or native advertising, but they should not flow PageRank and there should be clear and conspicuous disclosure so that users realize that something is paid, not organic or editorial.”
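For webmasters wondering what “should not flow PageRank” means in practice, the standard mechanism Google documents for paid links is the rel=”nofollow” attribute. A minimal sketch (the URL, link text, and disclosure wording here are made up for illustration):

```html
<!-- Conspicuous disclosure so users know the placement is paid, not editorial -->
<p><em>Sponsored content: this post was paid for by the advertiser.</em></p>

<!-- rel="nofollow" tells search engines not to pass PageRank through this paid link -->
<a href="https://example.com/product" rel="nofollow">Visit the advertiser’s site</a>
```

Marking paid links this way satisfies both halves of Cutts’ point: the link carries no PageRank, and the disclosure makes the paid relationship obvious to users.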
3. Cleaning up specific queries – Specifically ones that are traditionally spammy like payday loans.
“It’s kind of interesting. We get a lot of great feedback from outside of Google. For example, there were people complaining about searches like “payday loans” on Google.co.uk. So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more cleaned to going to some of these areas that have traditionally been a little more spammy including, for example, some more pornographic queries. And some of these changes might have a little bit more of an impact in those kinds of areas that are a little more contested by various spammers and that sort of thing.”
4. Denying the value of link spammers
“We’re also looking at some ways to go upstream to deny the value to link spammers – some people who spam links in various ways. We’ve got some nice ideas on trying to make sure that that becomes less effective and so we expect that that will roll out over the next few months as well.”
5. More sophisticated link analysis – The details are vague here… but this has the potential to be a game changer.
“And in fact, we’re working on a completely different system that does more sophisticated link analysis. We’re still in the early days for that, but it’s pretty exciting. We’ve got some data now that we’re ready to start munging and see how good it looks and we’ll see whether that bears fruit or not.”
6. Better detection of hacked sites – This includes better communication and information for webmasters.
“We also continue to work on hacked sites in a couple different ways, number one trying to detect them better. We hope in the next few months to roll out a next generation of hacked sites detection that is even more comprehensive, and also try to communicate better to webmasters, because sometimes they/we see confusion between hacked sites and sites that serve up malware. And ideally you have a one-stop shop where, once someone realizes that they have been hacked, they can go to webmaster tools and have some single spot they can go where they get a lot more info to sort of point them in the right way to hopefully clean up those hacked sites… We’re also going to be looking for ways we can provide more concrete details, more example URLs that webmasters can use to figure out where to go diagnose their site.”
7. Better detection of authority – Could this refer to author rank?
“We’re doing a better job of detecting when someone is sort of an authority in a specific space — could be medical or could be travel or whatever — and trying to make sure that those rank a little more highly if you’re some sort of authority or a site, that according to the algorithms, we think might be a little more appropriate for users.”
8. Softening Panda’s blow – Specifically for previously affected sites, by examining additional quality signals.
“We’ve also been looking at Panda and seeing if we can find some additional signals and we think we’ve got some to help refine things for the sites that are kinda in the border zone/in the grey area a little bit. So if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, that will help sites that might have previously been affected to some degree by Panda.”
9. Refining domain clusters – To improve the diversity of search results.
“We’ve also heard a lot of feedback from some people about that if I go down three pages deep I’ll see a cluster of several results all from one domain. We’ve actually made things better that you’re less likely to see that on the first page and more likely to see that on the following pages. And we’re looking at a change which might deploy which would basically say that once you’ve seen a cluster of results from one site then you’d be less likely to see more results from that site as you go deeper into the next pages of Google search results.”
10. Exciting changes that will help SMBs – In the works.
“I think it’s going to be a lot of fun. I’m really excited about a lot of these changes because we do see really good improvements in terms of people who are link spamming or doing various black hat spam would be less likely to show up I think by the end of the summer. And at the same time we’ve got a lot of nice changes queued up that hopefully will help small/medium businesses and regular webmasters as well.”
I get the feeling this is the calm before the storm, and the SEO world may be in for another wild ride. The implications of some of the above changes could fundamentally change the way in which we strategically approach SEO for years to come.
What do you all think? Please share!