If your site has been penalized by Penguin or Panda, or if you have received a manual penalty, your first instinct is probably to try to help the site recover its rankings (provided you weren’t already expecting the penalty and hadn’t given up on the site in advance). Even if your site was penalized for having a multitude of harmful links pointing to it, recovering it may still be an option, but the attempt may also be a resource and time sink that ultimately ends in failure. Knowing whether a recovery attempt is feasible can be quite difficult, especially since people get emotionally attached to their sites, but giving up on a site can sometimes be the only reasonable option you have left.
January 2011 was still a blissful time for farmers. Despite the inferior seed they were planting, and the fact that they had often simply been “appropriating” crops from their neighbors, the harvests were plentiful. But that January was marked by an ominous rumbling coming from the nearby bamboo patch. Something huge was approaching, something that threatened to put an end to their farming practices and force them to find other disciplines to exploit. The black and white fury descended upon them with the wrath of betrayed user expectations and the vengefulness of a deceived community. Ever since that day, people have paid close attention to the beast’s mood and its constantly changing habits, lest they too end up being mauled by the Panda.
In its constant effort to bring only the most relevant and useful content to its users, Google announced an improvement to its search algorithm in January 2011. At the time, a number of sites that offered less than interesting, or in some cases less than original, content were still able to achieve quite decent, or even excellent, rankings on SERPs. The tendency of such sites to clutter up the higher positions on the search results page and attract visitors only to let their expectations down came to be called “webspam”, and Google would suffer it no longer.
Sites that only offered superficial and unengaging content, seemingly mass produced and lacking any real substance, became known as “content farms”. The content published on such websites served merely as a carrier for keywords and was devoid of any real value for the user who might stumble upon it. So-called “scrapers” were another prominent type of webspam. Scrapers are websites that have little to no original content and instead republish content from other websites. While some sites are meant to do just that, gathering information from various sources through, for instance, RSS feeds, others simply steal other people’s content and pass it off as their own. It was the latter type that Google wanted to find a way to penalize.
Panda is in the game
Panda was a perfect tool for the job. Well, not perfect; the algorithm is, naturally, still being updated quite frequently. Since its inception it has been through a lot of changes, but its purpose remains the same: weeding out websites that are ranked higher than they objectively deserve. As it is mostly an automated process, one can’t really expect it to be infallible, but constant updates and improvements to the algorithm are at least keeping some people on their toes, ensuring that they will think twice before stealing someone else’s content or posting a nonsensical wall of text instead of a well thought out and written article. Another advantage of the new algorithm was that it took the entire website into consideration. Before Panda, the pages of a website were assessed individually, allowing a terribly constructed website with just one well-made page to be displayed on the first page of search results. Today, sites are judged as a whole: if you have a lot of duplicated content, an unfavorable content-to-ads ratio or anything else the algorithm deems undesirable, that fact will be reflected in your rankings.
As I have already mentioned, the first installment of the algorithm was announced in January 2011 and activated on February 24th of the same year. Unofficially targeted at scraper websites and content farms, it made quite an impact, affecting 11.8% of search results. This first version was limited to US websites, but just a month later, in April, Panda got its next update, which applied to all English-language queries worldwide. The new update also allowed users to block certain sites, and by doing so send a clear signal to Google’s crawlers. Panda 2.0 affected 2% of queries. The next three updates, released in May, June and July, didn’t have a pronounced effect on queries, but the update released in August, known as Panda 2.4, did: it expanded Panda to non-English languages and affected 6-9% of queries. Another update followed in September, though Google was not as forthcoming with information about it as it had been with some of the others. In October, Panda flux was announced, preparing site owners for fluctuations in their rankings. It turned out that the fluctuations were connected with the new update, Panda 3.0, which affected somewhere around 2% of queries. There have been a number of smaller and larger updates since then. Most of them influenced less than 1% of queries and rarely resulted in significant ranking changes, but the 20th update of the algorithm, released in September 2012, affected 2.4% of English queries.
- Panda 1.0 – February 24th
- Panda 2.0 – April 11th
- Panda 2.1 – May 9th
- Panda 2.2 – June 18th
- Panda 2.3 – July 22nd
- Panda 2.4 – August 12th
- Panda 2.5 – September 28th
- Panda 2.5.1 – October 9th
- Panda 2.5.2 – October 13th
- Panda 2.5.3 – October 19/20th
- Panda 3.1 – November 18th
- Panda 3.2 – January 15th
- Panda 3.3 – February 26th
- Panda 3.4 – March 23rd
- Panda 3.5 – April 19th
- Panda 3.6 – April 27th
- Panda 3.7 – June 8th
- Panda 3.8 – June 25th
- Panda 3.9 – July 24th
- Panda 3.9.1 – August 20th
- Panda 3.9.2 – September 18th
- Panda #20 – September 27th
- Panda #21 – November 6th
Different kinds of websites were penalized during that period. A number of shopping comparison and ecommerce sites lost their rankings when the algorithm found their content to be lacking, while some websites were not affected by the updates even though their content was not exactly perfect. If you are reading this, there is probably no need to explain the damage a drop in rankings can do to you, which is why site owners and SEOs have devoted a lot of time and effort to tracking what demands Panda makes of a website. If you know what the algorithm is targeting, you can either avoid getting penalized or try to recover if the penalty has already been incurred.
Panda recovery
First of all, you need to determine whether the decline in your rankings was caused by Panda. One way to do this is to check whether the decline coincided with an algorithm update. Naturally, even if it does, it is not necessarily Panda’s fault, but chances are that it is. Don’t forget to check whether visits from Google search have decreased, since those are the visits that would be affected by the update. If you do think that you have fallen prey to the beast, there are still some things you can do to get out of its claws. Naturally, all of this also applies to sites that haven’t been affected by the algorithm and would very much like to stay in its good graces.
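As a rough illustration of that first check, here is a minimal sketch that compares average daily organic visits in the two weeks before and after a suspected update date. It assumes you have exported your Google organic traffic to a hypothetical CSV file named organic_visits.csv with date and visits columns; the file name, column names and window size are illustrative assumptions, not the format of any particular analytics tool.

```python
import csv
from datetime import date, timedelta

# Hypothetical export from your analytics tool: one row per day,
# with columns "date" (YYYY-MM-DD) and "visits" (Google organic visits only).
EXPORT_FILE = "organic_visits.csv"

# The update date you suspect (e.g. the 20th Panda update on 2012-09-27).
UPDATE_DATE = date(2012, 9, 27)
WINDOW = 14  # days to average on each side of the update

def average_visits(rows, start, end):
    """Average daily visits between start and end (inclusive)."""
    values = [int(r["visits"]) for r in rows
              if start <= date.fromisoformat(r["date"]) <= end]
    return sum(values) / len(values) if values else 0

with open(EXPORT_FILE, newline="") as f:
    rows = list(csv.DictReader(f))

before = average_visits(rows, UPDATE_DATE - timedelta(days=WINDOW),
                        UPDATE_DATE - timedelta(days=1))
after = average_visits(rows, UPDATE_DATE,
                       UPDATE_DATE + timedelta(days=WINDOW - 1))

change = (after - before) / before * 100 if before else 0
print(f"Average daily organic visits before: {before:.0f}, after: {after:.0f} "
      f"({change:+.1f}%)")
# A sharp drop that lines up with a known update date is a hint worth
# investigating, not proof that the update is responsible.
```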
If you had to summarize the ways to redeem your site in one phrase, that phrase would probably be “improve user experience”. After all, the goal of Panda and similar algorithms is to help users find what they are looking for. This, however, covers a number of different things. First of all, you’ll need to check your page loading times. If your pages load slowly, or if they fail to display certain pieces of content, you will not exactly become popular with users, and the algorithm will take note of it. There might be a number of reasons for this, and whatever they are, you have to make sure to eliminate them.
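If you want a quick first pass before reaching for a full auditing tool, a small script can at least flag pages whose HTML takes suspiciously long to arrive. This is a minimal sketch assuming the third-party requests library and a hypothetical list of your own URLs; it only times the HTML download, not images, scripts or rendering, so treat it as a smoke test rather than a real performance audit.

```python
import time
import requests  # third-party: pip install requests

# Hypothetical list of your own pages to spot-check.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in URLS:
    start = time.perf_counter()
    response = requests.get(url, timeout=30)
    elapsed = time.perf_counter() - start
    size_kb = len(response.content) / 1024
    # Status code, time to download the HTML, and document size.
    print(f"{url}: {response.status_code}, {elapsed:.2f}s, {size_kb:.0f} KB")
```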
Be sure to check your bounce rate: are people leaving your site immediately after seeing the homepage? If they are, there might be a number of reasons for that. It might have something to do with aesthetics, but it might also be caused by bad page organization, such as confusing navigation or too many ads. You need to make the user want to stay on your site and click through to the other pages after seeing the homepage. Speaking of ads, they can relate to both of the above issues. First of all, if you have ads from too many networks, that can have a serious negative impact on your page loading times. Additionally, even if all the ads come from the same network, you should be careful about how many of them you display on any page, not just the homepage. Content should dominate your pages, not ads.
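If you want to sanity-check the figure your analytics dashboard reports, bounce rate is simple to compute yourself: it is the share of sessions that viewed only a single page. The sketch below assumes a hypothetical per-session export named sessions.csv with landing_page and pageviews columns; the file and column names are placeholders, not the format of any specific analytics product.

```python
import csv
from collections import defaultdict

# Hypothetical export: one row per session, with columns
# "landing_page" and "pageviews" (pages viewed during that session).
EXPORT_FILE = "sessions.csv"

sessions = defaultdict(int)
bounces = defaultdict(int)

with open(EXPORT_FILE, newline="") as f:
    for row in csv.DictReader(f):
        page = row["landing_page"]
        sessions[page] += 1
        if int(row["pageviews"]) <= 1:  # a bounce: the visitor left after one page
            bounces[page] += 1

# Bounce rate per landing page = bounced sessions / total sessions.
for page, total in sorted(sessions.items(), key=lambda item: -item[1]):
    rate = bounces[page] / total * 100
    print(f"{page}: {total} sessions, bounce rate {rate:.1f}%")
```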
Content & the Panda Update
Which brings us to the most important factor when it comes to your rankings and Panda: content. Sure, you need to limit the number of outgoing links on your pages to a reasonable number, you need to pay attention to your internal linking structure, and you have to make sure that your code is in order, but it is content that makes or breaks most websites. It is content that users are looking for, and content that will keep them on your site. It is also content that will make them, or another website, want to recommend you to someone else. There are several things to look out for. First of all, and this should go without saying, you need to make sure that your content is grammatically correct and properly spelled. It should also be informative, and actually relevant to the keywords that are likely to bring people to that page. When I say “informative” I don’t mean that it should just state the very basics of the topic, as is often the case; instead, try to really involve potential readers, offer unique insights or little-known facts, and make it an entertaining or truly useful read. You’ll find that if you offer something people can actually use, it will promote itself. There should also be enough of it: you might write a 200-word article on a topic and decide that it is enough, but you may want to think about making it longer, especially if the page contains something that distracts people from the content, such as ads.
One of the things you must watch out for is duplicated content. If Panda determines that you have too much of it on your website, you are very likely to get penalized. This is not an absolute rule; some websites dodged the algorithm despite having a lot of duplicated content, but those were usually quite strong brands, and that helped them pull through. It is important to note that when I say “duplicated content”, that doesn’t only apply to content you have copied from another site, but also to separate iterations of the same content across your own pages. This often happens with dynamic pages, so make sure you pay special attention to them. If, for some reason, you do need to have multiple pages with the same content, use canonical URLs, or the noindex and nofollow directives, to avoid being penalized.
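As a rough way to audit those intentional duplicates, the sketch below fetches a handful of URLs and checks whether each one declares a canonical URL or a robots noindex directive. It assumes the third-party requests library and uses hypothetical example.com URLs; the regex-based tag extraction is deliberately crude, so treat it as a spot check rather than a substitute for a proper crawler.

```python
import re
import requests  # third-party: pip install requests

# Hypothetical URL variants that intentionally share the same content
# (e.g. the same product page reached through different query strings).
DUPLICATE_URLS = [
    "https://www.example.com/product?id=42",
    "https://www.example.com/product?id=42&sort=price",
]

def find_tag_attribute(html, tag, marker, attribute):
    """Crude check: find a <tag ...> matching `marker` and return `attribute`."""
    for candidate in re.findall(r"<%s[^>]+>" % tag, html, re.I):
        if re.search(marker, candidate, re.I):
            value = re.search(r'%s=["\']([^"\']+)["\']' % attribute, candidate, re.I)
            return value.group(1) if value else None
    return None

for url in DUPLICATE_URLS:
    html = requests.get(url, timeout=30).text
    canonical = find_tag_attribute(html, "link", r'rel=["\']canonical["\']', "href")
    robots = find_tag_attribute(html, "meta", r'name=["\']robots["\']', "content")
    print(url)
    print("  canonical:", canonical or "MISSING")
    print("  robots:   ", robots or "not set")
    # Each duplicate variant should either point its canonical URL at the
    # preferred version or carry a "noindex" robots directive.
```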

If you find that some of your pages have suffered a significantly larger drop in rankings than others, analyze them and try to determine the cause of the discrepancy. If you do find the cause, it might help you improve your other pages as well. Based on your findings, you might decide to improve your worst page(s) or remove them completely. As long as it doesn’t disturb your site’s functionality too much, it is better to redirect or 404 a page than to let it drag your whole site down.
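Once you have redirected or removed such pages, it is worth verifying that they actually return the status codes you intend, since a “removed” page that still serves a 200 helps nobody. Here is a minimal sketch assuming the third-party requests library, with hypothetical example.com URLs standing in for the pages you cleaned up.

```python
import requests  # third-party: pip install requests

# Hypothetical pages you have removed or redirected after the audit.
CLEANED_UP_URLS = [
    "https://www.example.com/old-thin-page.html",
    "https://www.example.com/duplicate-category/",
]

for url in CLEANED_UP_URLS:
    # allow_redirects=False shows the raw status code a crawler would see.
    response = requests.get(url, allow_redirects=False, timeout=30)
    status = response.status_code
    if status in (301, 302):
        print(f"{url} -> {status} redirect to {response.headers.get('Location')}")
    elif status in (404, 410):
        print(f"{url} -> {status} (removed)")
    else:
        print(f"{url} -> {status} (still serving content?)")
```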
Also, be sure to check your incoming links. You might be getting links from less than reputable sources, and enough links of that type can hurt you quite badly even if you weren’t responsible for them. Try arranging for those links to be removed when it is in your power, or, if all else fails, consider removing the pages that such sites are linking to. It sounds extreme, but it might help you in the long run. Likewise, try to get inbound links from reputable sources, keeping in mind that quality is much more important than quantity.
Once you have made sure that your site is working properly and offering visitors useful, engaging content that is easily accessible and well organized, all you need to do is wait and see if you recover. This might take some time, but if you have covered all of the important issues, you should eventually see improvement. If all else fails, you might try writing a report to Google explaining your situation, but make sure you provide them with all of the necessary information and that you are actually justified in your claims. In the end, it all boils down to a quite simple truth: if you want a site that ranks well, you need a well-made site. There, I hope it’s all clear now.
There has been a lot of stir surrounding Google’s latest Panda update. It has severely affected the rankings of some websites, leaving them reeling from Google’s latest punches. But even with all these changes, two strategies remain the pillars of any effective SEO campaign.
Writing high quality content and creating relevant backlinks are two of the most crucial SEO strategies that impact a website’s ranking in major search engines such as Google, Yahoo! and Bing. However, despite having a similar purpose, these two SEO strategies differ from each other and are worlds apart when it comes to execution.
Writing high quality content is a skill. It requires the ability to effectively communicate a particular idea or topic through the written word. It also necessitates a solid command of grammar and proper syntax.
On the other hand, creating relevant backlinks involves deep research and analysis of the websites, directories and forums where one can place backlinks. Creating backlinks for your website requires adept marketing skills.
What is the advantage of writing high quality content?
Writing high quality content offers a lot of advantages. When you write high quality content on a particular subject, you start to establish your expertise in that field. The better your explanations, analysis and research are, the more you will be perceived as an expert on the subject.
Writing high quality content also helps establish your website as an authority site, so search engines will most likely rank you higher.
Compared to creating relevant backlinks, writing high quality content has a more lasting effect as it will also improve your writing, grammar and communication skills.
What is the advantage of creating relevant backlinks?
Creating relevant backlinks that point directly to your website is actually much easier than writing high quality content. For one, creating relevant backlinks doesn’t require that much research on a topic.
The text accompanying a quality backlink usually consists of only 3-4 sentences, as opposed to high quality content, which runs at least five or six paragraphs. What’s more, some backlinks simply summarize the article or website being promoted.
Unlike high quality content, backlinks are not confined to a single domain. They can be created in article directories like Ezine, Article Base, Article Alley, Article Blasts and Article Trader.
Social media sites like Facebook and the micro-blogging platform Twitter are also great places where bloggers and webmasters can create backlinks for their blogs and websites.

High quality content versus relevant backlinks
Writing high quality content and creating relevant backlinks are part of a good SEO strategy. Both techniques, if executed properly, can generate heavy amounts of traffic for your website.
When it comes to which one is better, the answer boils down to the old Internet adage that content is king. Creating relevant backlinks is important, but high quality content carries more weight, especially in today’s fast-changing Internet landscape.
Another argument for high quality content is that it can generate backlinks of its own. Readers and fellow bloggers are far more likely to share, recommend and link to high quality content than to junk or duplicate content.
This is a guest post by Suzzane Edwards. She is a financial advisor and currently works as a consultant for small businesses. When she’s not writing for Cash for Gold, she can be found blogging about simple tips on how people can take advantage of the internet’s numerous business opportunities.