The Google 2011 Algorithm Update

You may have noticed some wild fluctuations recently in the Google SERPs. Sites that were ranking well are nowhere to be seen or have shifted rank significantly. As SEOs are keenly aware, every year about this time Google tends to make large changes and updates to its search algorithm. This tends to skew most search results for a time but eventually seems to straighten out. This year's update looks to impact many sites on the web and will force us to re-evaluate how we optimize our websites and handle content.

Google has always maintained that "content is king". This latest update is a new attempt to reduce the quality score of websites with low-quality, scraped, or copied content. Google has taken a hard-line approach over the last few years against content spamming and duplicate content by enforcing penalties against offending websites. This latest update takes that approach to a whole new level.

If your site contains a large amount of reproduced content, make quick and decisive changes immediately by adding your own unique content. Google will be hammering down on websites that have little original content.

"We’ve also radically improved our ability to detect hacked sites, which were a major source of spam in 2010. And we’re evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others content and sites with low levels of original content. We’ll continue to explore ways to reduce spam, including new ways for users to give more explicit feedback about spammy and low-quality sites."
http://googleblog.blogspot.com/2011/01/google-search-and-search-engine-spam.html

Many SEOs rely on article syndication to provide backlinks to their client sites. How will this change affect the article sites and the quality of the backlinks they can provide? If I cite a reference, as I have in this post, will I be penalized for it? I can only assume that Google has a way to determine the context of duplicate content and distinguish a quotation from copied, spammed, or spun content.

Google says only that it is targeting sites with a "high percentage" of duplicate content. But of course that leaves the rest of us scratching our heads, wondering how much is too much and what exactly qualifies as a "high percentage." If you have a lot of repeated content, or if your keyword densities are too high, Google may now regard your site as "spammy" and your site may suffer severe penalties.
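
For what it's worth, keyword density is easy enough to measure yourself. Below is a minimal Python sketch that counts how often a target phrase appears per 100 words of page copy. The file name, the target phrase, and the 5% cutoff are purely illustrative assumptions on my part; Google has never published a density threshold.

    import re

    def keyword_density(text, phrase):
        # Rough illustration: occurrences of `phrase` per 100 words of `text`.
        # Google publishes no density threshold, so treat any cutoff as a guess.
        words = re.findall(r"[a-z0-9']+", text.lower())
        if not words:
            return 0.0
        target = phrase.lower().split()
        n = len(target)
        hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == target)
        return 100.0 * hits / len(words)

    # Hypothetical usage: "page.txt" and the 5% cutoff are assumptions for illustration.
    page_text = open("page.txt").read()
    density = keyword_density(page_text, "cheap widgets")
    if density > 5.0:
        print("Possible over-optimization: %.1f%% density" % density)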

Matt Cutts dished:

"…we recently launched a redesigned document-level classifier that makes it harder for spammy on-page content to rank highly. The new classifier is better at detecting spam on individual web pages, e.g., repeated spammy words—the sort of phrases you tend to see in junky, automated, self-promoting blog comments."

http://www.stepforth.com/blog/2011/latest-google-algorithm-update-affect/

If you have a site with solid, unique content and your rankings have slipped, it may be that the Google bots simply need more time to re-crawl your site. Given all the fluctuation among other sites caused by the algorithm's rollout, extra time and repeat crawls may be needed before your site can be accurately ranked again.

If your site did have good rankings but also has a lot of repetitious keyword usage or thin content, and you are now watching it spiral down the SERPs, you may be hesitant to make broad, sweeping changes. In this case, a "minimalist" approach may be best for reducing the "spammy" content on your site. Try removing the offending content in small chunks and wait to see how Google reacts before deciding whether a more global approach is required; one rough way to find those chunks is sketched below.
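
If you are not sure which chunks are the problem, a quick way to spot heavily duplicated passages is to compare each paragraph of a page against the suspected source text and measure how many of its short word sequences ("shingles") also appear there. This is only a sketch under my own assumptions: both documents are saved as plain text, and the file names and the 50% cutoff are hypothetical.

    import re

    def shingles(text, k=5):
        # Overlapping k-word sequences, used as a crude fingerprint of the text.
        words = re.findall(r"[a-z0-9']+", text.lower())
        return set(tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 0)))

    def containment(part, whole, k=5):
        # Fraction of `part`'s shingles that also appear in `whole` (0.0 to 1.0).
        p, w = shingles(part, k), shingles(whole, k)
        if not p:
            return 0.0
        return float(len(p & w)) / len(p)

    # Hypothetical usage: file names and the 0.5 cutoff are assumptions for illustration.
    page = open("my_page.txt").read()
    source = open("syndicated_article.txt").read()
    for i, para in enumerate(page.split("\n\n")):
        score = containment(para, source)
        if score > 0.5:
            print("Paragraph %d looks heavily duplicated (%.0f%% overlap)" % (i, score * 100))

Rewriting or removing the paragraphs this flags, a few at a time, fits the minimalist approach described above.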

The other possibility is that you feel Google has repositioned your site unfairly. If so, you can also submit a reconsideration request to Google:
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35843

For further information on how to avoid or deal with duplicate content, visit:
http://googlewebmastercentral.blogspot.com/2008/09/demystifying-duplicate-content-penalty.html

As is par for the course, Google rolls out another major update and leaves the rest of the world (and, more importantly, SEOs) trying to figure out what the broader implications are, how it will affect our clients' rankings, and how to begin implementing such far-reaching changes. Further bulletins as events warrant.