If your site has been penalized by Penguin or Panda, or if you have received a manual penalty, your first instinct is probably to try to help the site recover in rankings (provided you weren't already expecting this and hadn't given up on the site in advance). Even if your site was penalized for having a multitude of harmful links pointing to it, recovery may still be an option, but it may also be a resource and time sink that ultimately ends in failure. Knowing whether a recovery attempt is feasible can be quite difficult, especially since people get emotionally attached to their sites, but giving up can sometimes be the only reasonable option you have left.
The year 2012 was quite tumultuous when it comes to SEO. Even though Google has been picking up the pace ever since it started providing its services, the number of algorithm updates and changes made to the search process itself has been quite astonishing. For the most part, the changes were for the best, but a task as huge as the one Google has set for itself is not easily accomplished, especially in an ever-changing environment such as the internet, so naturally, not everything went as smoothly as we might have hoped. In any case, we'll provide a short overview of the most important changes, and leave you to judge whether we're better off for them, as SEOs and as internet users.
With Google continually making changes to the layout of their first page of results and rolling out updates to punish low-quality pages, it has never been more essential to use every technique at your disposal to compete for a top spot. If you're a website owner or SEO practitioner, then you know that it's important to stay up to date with the latest developments, and that's why today we're going to go over how to roll out an SEO for Video strategy that will help you rank highly and improve your traffic.
This year Google announced a number of improvements to its services and cemented its position in the technology world. The most important announcements at this year's Google I/O conference were definitely the launches of the Google Compute Engine (GCE) cloud computing platform, Android Jelly Bean and the Nexus 7. However, they didn't forget to update their search technology as well.
Obviously determined to persevere in their efforts to fight black-hat SEO and webspam only with black and white animals that are just cute and cuddly enough not to make the public's opinion of the algorithms they represent even worse than it already is, Google activated a new algorithm on April 24th, 2012, one that would later be officially named Penguin. As was the case with Panda, Google didn't give the algorithm a name until it seemed likely that the public would come up with a not-too-flattering name of their own. People were already starting to call the algorithm "Titanic" when Google decided they might be better off doing the branding themselves, and made the name Penguin official.
January 2011 was still a blissful time for farmers; despite the inferior seed they were planting and the fact that they had often simply been "appropriating" crops from their neighbors, the harvests were plentiful. But that January was marked by an ominous rumbling coming from the nearby bamboo patch. Something huge was approaching, something that threatened to put an end to their farming practices and force them to find other disciplines to exploit. The black and white fury descended upon them with the wrath of betrayed user expectations and the vengefulness of a community deceived. Ever since that day, people have paid close attention to the beast's mood and its constantly changing habits, lest they too end up being mauled by the Panda.
In its constant effort to bring only the most relevant and useful content to its users, Google announced an improvement to their search algorithm in January 2011. At the time, a number of sites that offered less than interesting, or in some cases less than original, content were still able to get quite decent, or even excellent, rankings on SERPs. This tendency of such sites to clutter up the higher positions on the search engine's results page and attract visitors only to let their expectations down came to be called "webspam", and Google would suffer it no longer.
Sites that offered only superficial and unengaging content, seemingly mass-produced and lacking any real substance, became known as "content farms". The content published on such websites served only as a carrier for keywords and was devoid of any real value for the user who might stumble upon it. So-called "scrapers" were another prominent type of webspam. Basically, scrapers are websites that have little to no original content of their own, instead publishing content from other websites. While there are sites that are meant to do just that, gathering information from various sources through, for instance, RSS feeds, there are also those that simply steal other people's content and pass it off as their own. It was the latter type that Google wanted to find a way to penalize.
Panda is in the game
Panda was a perfect tool for the job. Well, not perfect; the algorithm is, naturally, still being updated quite frequently. Since its inception it has been through a lot of changes, but its purpose remains the same: weeding out the websites that are ranked higher than they objectively deserve. As it is a mostly automated process, one can't really expect it to be infallible, but constant updates and improvements to the algorithm are at least keeping some people on their toes, and ensuring that they will think twice before stealing someone else's content or posting a nonsensical wall of text instead of a well-thought-out and well-written article. Another advantage of the new algorithm was that it took the entire website into consideration. Before Panda, pages of a website were assessed individually, allowing a terribly constructed website with just one well-made page to be displayed on the first page of search results. Today, sites are judged as a whole: if you have a bunch of duplicated content, an unfavorable content-to-ads ratio or anything else the algorithm deems undesirable, that fact will be reflected in your rankings.
As I have already mentioned, the first installment of the algorithm was announced in January 2011, and it was activated on February 24th of the same year. Unofficially targeted at scraper websites and content farms, it made quite an impact: Panda managed to affect 11.8% of search results. This first version was limited to US websites, but just a month later, in April, Panda got its next update, which applied to all English-language queries all over the world. The new update also allowed users to block certain sites, and by doing so send a clear signal to Google's crawlers. Panda 2.0 affected 2% of the queries made. The next three updates, released in May, June and July, didn't really have a pronounced effect on the queries, but the update released in August, known as Panda 2.4, did. It is with this update that Panda expanded to non-English languages, affecting queries by 6-9%. The next update was released in September, but Google was not as forthcoming with information on it as it was with some of the others. In October, the Panda flux was announced, preparing site owners for fluctuations in their rankings. It turned out that the fluctuations were connected with the new update, Panda 3.0, which affected somewhere around 2% of the queries made. There have been a number of smaller and larger updates since then, with the 20th update of the algorithm among the most recent. Most of the updates that preceded it influenced less than 1% of queries, and rarely resulted in significant ranking changes, but the update released in September of 2012 affected 2.4% of queries in English.
[one_half]
- Panda 1.0 – February 24th
- Panda 2.0 – April 11th
- Panda 2.1 – May 9th
- Panda 2.2 – June 18th
- Panda 2.3 – July 22nd
- Panda 2.4 – August 12th
- Panda 2.5 – September 28th
- Panda 2.5.1 – October 9th
- Panda 2.5.2 – October 13th
- Panda 2.5.3 – October 19/20th
- Panda 3.1 – November 18th
- Panda 3.2 – January 15th
[/one_half] [one_half_last]
- Panda 3.3 – February 26th
- Panda 3.4 – March 23rd
- Panda 3.5 – April 19th
- Panda 3.6 – April 27th
- Panda 3.7 – June 8th
- Panda 3.8 – June 25th
- Panda 3.9 – July 24th
- Panda 3.9.1 – August 20th
- Panda 3.9.2 – September 18th
- Panda #20 – September 27th
- Panda #21 – November 6th
[/one_half_last]
Different kinds of websites were penalized during that period. A number of shopping comparison and ecommerce sites lost their rankings when the algorithm found their content to be lacking. However, some websites were not affected by the updates, even though their content was not exactly perfect. If you are reading this, then there is probably no need to explain the damage a decrease in your rankings can do to you; this is why site owners and SEOs have devoted a lot of time and effort to tracking the demands Panda makes of a website. If you know what the algorithm is targeting, you can either avoid getting penalized, or try to recover if the penalty has already been incurred.
Panda recovery
First of all, you need to determine whether the decline in your rankings was caused by Panda. One way to do this is to check whether the decline coincided with an algorithm update. Naturally, even if it does, it is not necessarily Panda's fault, but chances are that it is. Don't forget to check whether visits from Google search have decreased, since those are the visits an update would affect. Now, if you do think that you have fallen prey to the beast, there are still some things you might be able to do to get out of its claws. Naturally, all of this also applies to sites that haven't been affected by the algorithm and would very much like to stay in its good graces.
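If you keep a daily export of organic search visits, the date comparison described above can be sketched in a few lines of Python. Everything here is illustrative: the update dates are the two from this article's Panda list, and the drop threshold and visit figures are made-up assumptions, not a real diagnostic rule.

```python
from datetime import date

# Known update dates (partial, taken from the list in this article)
UPDATE_DATES = [date(2012, 9, 27), date(2012, 11, 6)]

def drops_near_updates(daily_visits, window_days=3, drop_ratio=0.7):
    """Flag days where organic visits fall below `drop_ratio` of the
    previous day's visits, within `window_days` of a known update.
    `daily_visits` maps date -> visit count (e.g. an analytics export)."""
    flagged = []
    days = sorted(daily_visits)
    for prev, cur in zip(days, days[1:]):
        if daily_visits[prev] == 0:
            continue
        if daily_visits[cur] / daily_visits[prev] < drop_ratio:
            for upd in UPDATE_DATES:
                if abs((cur - upd).days) <= window_days:
                    flagged.append((cur, upd))
    return flagged

# Hypothetical traffic showing a sharp drop right after the Sept 27 update
visits = {
    date(2012, 9, 25): 1000,
    date(2012, 9, 26): 980,
    date(2012, 9, 27): 990,
    date(2012, 9, 28): 430,
}
print(drops_near_updates(visits))  # [(datetime.date(2012, 9, 28), datetime.date(2012, 9, 27))]
```

A match only tells you the timing is suspicious; as the paragraph above notes, coincidence with an update date is suggestive, not proof.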
If you needed to summarize the ways to redeem your site in one phrase, that phrase would probably be "improve user experience". After all, the goal of Panda and similar algorithms is to help users find what they are looking for. This, however, covers a number of different things. First of all, you'll need to check your page loading time. If your pages load slowly, or fail to display certain pieces of content, you will not exactly become popular with users, and the algorithm will take note of it. This can have a number of causes, and whatever they are, you have to make sure to eliminate them.
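As a first rough check of loading time, you can simply time a plain HTTP fetch of a page. This is only a sketch: it measures server response plus transfer time, not full browser rendering, and the URL is a placeholder.

```python
import time
from urllib.request import urlopen

def measure_load_time(url, attempts=3):
    """Fetch `url` a few times and return the average response time in
    seconds. A quick first check, not a substitute for real page-speed
    tools that measure rendering."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urlopen(url) as response:
            response.read()  # pull the full body, not just the headers
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

# Example (assumes the URL is reachable from where you run this):
# print(measure_load_time("http://example.com/"))
```

Run it against your slowest templates first; a consistently high average there usually points at the server or an oversized page rather than the network.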
Be sure to check your bounce rate: are people leaving your site immediately after seeing the homepage? If they are, there might be a number of reasons for it. It might have something to do with aesthetics, but it might also be caused by bad page organization, such as confusing navigation or too many ads. You need to make the user want to stay on your site and click through to the other pages after seeing the homepage. Speaking of ads, they can relate to both of the above issues. First of all, if you have ads from too many networks, that can have a serious negative impact on your page loading times. Additionally, even if all the ads are from the same network, you should be careful about how many of them you display on any page, not just the homepage. Content should dominate your pages, not ads.
Content & the Panda Update
Which brings us to the most important factor when it comes to your rankings and Panda: content. Sure, you need to limit the number of outgoing links on your pages to a reasonable number, you need to pay attention to your internal linking structure, and you have to make sure that your code is all in order, but it is content that makes or breaks most websites. It is content that users are looking for, and it is content that will keep them on your site. It is also content that will make them, or another website, want to recommend you to someone else. There are several things to look out for. First of all, and this should go without saying, you need to make sure that your content is grammatically correct and properly spelled. It should also be informative, and actually relevant to the keywords that are likely to bring people to the page. Now, when I say "informative" I don't mean that it should just state the very basics of the topic, as is often the case; instead, try to really involve the potential readers, offer unique insights or little-known facts, and make it an entertaining or truly useful read. You'll find that if you offer something people can actually use, it will promote itself. There should also be enough of it: you might write a 200-word article on a topic and decide that it is enough, but you might want to think about making it longer, especially if the page has something that distracts people from the content, such as ads.
One of the things that you must watch out for is duplicated content. If Panda determines that you have too much of it on your website, you're very likely to get penalized. Now, this is not an absolute rule; there are websites that dodged the algorithm despite having a lot of duplicated content, but those were usually quite strong brands, and that helped them pull through. It is important to note that "duplicated content" doesn't only mean content you have copied from another site, but also separate iterations of the same content across your own pages. This often happens with dynamic pages, so make sure that you pay special attention to them. If, for some reason, you do need multiple pages with the same content, use canonical URLs, or the noindex and nofollow attributes, to avoid being penalized.
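One rough way to spot the internal duplication described above is to normalize each page's text and compare fingerprints. This is a minimal sketch, not a crawler: the URLs and markup are invented, and the tag-stripping regex is deliberately crude.

```python
import hashlib
import re

def content_fingerprint(html_text):
    """Strip tags and collapse whitespace, then hash the remaining text
    so pages with identical body copy get identical fingerprints."""
    text = re.sub(r"<[^>]+>", " ", html_text)        # crude tag removal
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.md5(text.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """`pages` maps URL -> HTML. Returns groups of URLs whose text is
    identical -- candidates for rel=canonical or a noindex meta tag."""
    by_hash = {}
    for url, html in pages.items():
        by_hash.setdefault(content_fingerprint(html), []).append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Hypothetical pages: two sort variants of the same dynamic listing
pages = {
    "/widgets?sort=price": "<p>Blue widgets, best prices.</p>",
    "/widgets?sort=name":  "<p>Blue  widgets, best prices.</p>",
    "/about":              "<p>About our shop.</p>",
}
print(find_duplicates(pages))  # [['/widgets?sort=price', '/widgets?sort=name']]
```

Each group this flags is a candidate for pointing the variants at one canonical URL, exactly the dynamic-page situation the paragraph above warns about.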

If you find that some of your pages have suffered a significantly larger drop in rankings than others, analyze them and try to determine the cause of the discrepancy. If you do find the cause, it might help you improve your other pages as well. Based on your findings, you might decide to improve your worst page(s) or remove them completely. As long as it doesn't disturb your site's functionality too much, it is better to redirect or 404 a page than to let it bring your whole site down.
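The redirect-or-404 decision for removed pages can be expressed as a tiny routing table. This is a framework-agnostic sketch with hypothetical paths; in practice the same mapping would live in your server or application config.

```python
# Hypothetical mapping of removed low-quality pages to their
# replacements; anything not listed and not live gets a 404.
REDIRECTS = {
    "/old-thin-article": "/guides/complete-article",
    "/duplicate-page":   "/guides/complete-article",
}

def resolve(path, known_paths):
    """Return (status, location) for an incoming request path."""
    if path in known_paths:
        return (200, path)          # page still exists, serve it
    if path in REDIRECTS:
        return (301, REDIRECTS[path])  # permanent redirect to replacement
    return (404, None)              # removed with no replacement

print(resolve("/old-thin-article", {"/guides/complete-article"}))  # (301, '/guides/complete-article')
```

Using a 301 (permanent) rather than a 302 tells crawlers the old URL is gone for good, which is what you want when consolidating weak pages into a stronger one.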
Also, be sure to check your incoming links. You might be getting links from less than reputable sources, and enough links of that type can hurt you quite badly even if you weren't responsible for them. Try to arrange for those links to be removed when it is in your power, or, if all else fails, consider removing the pages that such sites are linking to; it sounds extreme, but it might help you in the long run. Likewise, try to get inbound links from reputable sources, keeping in mind that quality is much more important than quantity.
Once you have made sure that your site is working properly and offering visitors useful, engaging content that is easily accessible and well organized, all you need to do is wait and see if you recover. This might take some time, but if you have covered all of the important issues, you should eventually see improvement. If all else fails, you might try writing a report to Google explaining your situation, but be sure to provide them with all of the necessary information, and that your claims are actually justified. In the end, it all boils down to a quite simple truth: if you want a site that ranks well, you need a well-made site. There, I hope it's all clear now.
For a long time, using exact match domains to rank for specific keywords was one of the easiest ways to gain an edge in the competitive SEO wars. Domain names were given a ranking advantage over other websites when it came to the keyword phrase contained in the domain.
While this ranking boost made sense from a common-sense standpoint (after all, if your website is bluewidgets.com, it's likely relevant to a search for "blue widgets"), this phenomenon was often used by SEOs to easily outrank more authoritative sources for specific key phrases, especially long-tail search terms.
Whether you think it's a positive development or not, Google released an algorithm update on September 28, 2012, known as the EMD update, that has severely reduced and possibly eliminated any SEO benefit of having your targeted search phrase in your domain name. According to Google's head of webspam, Matt Cutts, this update noticeably affected 0.6% of English-US search queries. To add to the confusion, Google released a major Panda algorithm update during the same period, which affected 2.4% of English-US search queries. This caused many webmasters and SEOs to confuse the algorithm update targeting exact match domains with a Panda update designed to target low-quality content. The timing of the updates may well have been designed to cause confusion.

What To Take Out Of The EMD Update
While there is no way to know with 100% certainty at this point, it seems likely, based on the initial data, that the EMD update devalues exact match domains rather than penalizing them. Many strong, white-hat exact match domains dropped a few positions or pages in the search engine rankings but didn't suffer anything that would be consistent with a penalty.
For exact match domains that were severely demoted, this is likely because they were receiving significant boosts from their domains which, once devalued, severely hurt their rankings. Some of these significant drops also look a lot like a Panda penalty, and could be a result of the Panda update that occurred at the same time, and may actually be unrelated to the EMD update. After all, Panda is designed to target what Google deems to be low-quality websites, and the reality is that many exact match domains built around long-tail search terms are created by SEOs solely for the purpose of generating AdSense or affiliate revenue.
With reports that many EMDs with highly optimized keyword density and header tags experienced significant drops, one other possibility is that the EMD update not only removes exact match SEO benefits, but also goes further and penalizes domains that are over-optimized for their exact match key phrase. Because the EMD update was released at the same time as a major Panda update (which could also target over-optimized sites), it may be a while before we sort out whether the EMD update has any actual penalty effect, or whether it merely removes the extra boost given to exact match domains.

How The EMD Update Will Affect Your SEO Strategy Going Forward
One thing is clear going forward after the EMD update: relying on the exact match domain boost for easier search engine rankings is no longer a viable strategy. With reports of heavily keyword-optimized sites being penalized, it also seems clear that Panda, Penguin and the latest EMD update are working together to force SEOs to develop ranking strategies that look as natural as possible.
While there will always be exceptions to the rule, the general trend in SEO will be a focus on quality and diversity. Including a targeted keyword in your title, URL, h1, h2 and h3 tags, and image alt tags, and bolding it throughout your content, may no longer be the best idea. While you should ensure that your targeted keyword appears in your title and at least one header tag, focus your on-site efforts on creating a quality user experience. Whether it's due to the latest EMD update or an addition to the Panda update, it looks like Google is targeting websites that over-optimize their on-page SEO for specific keywords, especially if the website is new or has less authority.
Continue to build links from quality sources with a wide variety of anchor text. Avoid links from link farms, spam, or any source that links out to bad neighborhoods such as low-quality directories and bookmarking sites. The more natural your link profile and on-site SEO, the less likely you are to be targeted for penalties.
Is SEO Dead?
The cries that SEO is dead are, as always, totally misguided. This latest algorithm update didn't change the number of organic positions in Google, and many of those positions are still occupied by smaller niche sites. There is definitely a trend at Google towards favoring larger authority websites, established brands, and Google's own content, but there is still plenty of room for other sites to rank.
Author Bio:
Nat is a full-time SEO and a part-time blogger for Whoishostingthis.com – an Alexa top 10k web property. You can find additional information about their hosting reviews through their company website.
A cold chill has hit the world of search engine optimization consulting and online business over the past few months. An arctic chill, to be precise, as Google rolled out its search algorithm update, Penguin 2.0. Online business and SEO forums are aflame with debate about the effects of this latest Google "housekeeping" update on site portfolios, and of course, the main question being asked is "how do we get around it?"
Why has Google released Penguin 2.0?
To understand how to "get around" Google Penguin 2.0[1], you first need to understand its point. For the past ten years, Google has been the search engine powerhouse of the internet, and no other company has managed to oust them from the top spot. Google's goal, apart from making money, is to make sure the results it returns to searchers are valuable to them. The more relevant and high quality the results, the more money Google makes and the better the user experience for everyone. Google is investing heavily in algorithmic updates to weed out the junk, the spam and the downright illegal, and that's where Penguin and its predecessor Panda[2] come in.
What is Penguin?
Penguin 2.0 is Google’s attempt to weed out sites that use keyword stuffing, duplicate content and keyword cloaking. Penguin 1.0 was first released in April 2012.

Image by geekdashboard.com
What links are affected?
Panda was all about thin content and a poor-quality user experience; Penguin is all about manipulation, particularly manipulative linking. In July of 2012, Google sent out unnatural link warnings to around 1.5 million Webmaster Tools users. These are links that are solicited, black hat, bought or obtained through link directories, and which help a mediocre website work its way quickly and unnaturally up the search engine rankings. In combination with Penguin, Google is taking direct action on sites that use these unnatural linking methods, manually bringing sites down from their high-ranking positions where abuse, or blissful but spammy ignorance, is found.
Which links to get post-Penguin?
If you want to recover your search engine rankings after taking a beating from Penguin, or you just want to make sure you only acquire clean, white hat links, then follow the guide below.
- Anchor Texts: Pre-Penguin, anchor texts were rewarded for being keyword-rich. Now the tables have turned, and your inbound linking strategy should appear as natural and diversified as possible, with no over-optimisation.
- Quality Content: This is, and always will be, one of the best ways to get good quality links and avoid the wrath of algorithm changes. Good quality content is what the internet is about and what Google aspires to reward. Keep keyword-heavy anchor texts to a minimum even in good content, write at least 500 words per piece and keep your keyword density at around 1% for an almost perfect, Penguin-proof website.
- Image Links: These are loved by Google post-Penguin and will do wonders for ranking your site, provided the alt tags are correctly optimised and describe the image to users who cannot view it; this prevents you from inadvertently showing Google something users are not seeing.
- News: Google also loves news for link building, so think about press releases, company news and comment on recent events through your site or blog. It provides rich quality content whilst offering you the opportunity to build some great quality, Penguin friendly links.
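The word-count and keyword-density guidelines in the "Quality Content" point above are easy to check automatically. A minimal sketch, assuming a single-word keyword and the roughly-1% density target stated above; the sample text is invented and deliberately over-optimized.

```python
import re

def keyword_density(text, keyword):
    """Return (word_count, density), where density is the share of
    words taken up by occurrences of a single-word `keyword`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return (0, 0.0)
    hits = words.count(keyword.lower())
    return (len(words), hits / len(words))

# A deliberately keyword-stuffed sample article
article = "Widgets are great. " * 50 + "Buy widgets today."
count, density = keyword_density(article, "widgets")
print(count, round(density, 3))  # 153 0.333
```

Here the sample comes out at about 33% density, wildly above the ~1% guideline, so it would be flagged for rewriting; the 500-word minimum can be checked against the same word count.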
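The alt-tag advice in the "Image Links" point above can also be audited with the standard library's HTML parser. A small sketch; the markup is invented for illustration.

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collects <img> tags that are missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt") or ""   # attribute may be absent or valueless
            if not alt.strip():
                self.missing.append(attr_map.get("src", "?"))

html = '<img src="logo.png" alt="Company logo"><img src="banner.png">'
checker = AltChecker()
checker.feed(html)
print(checker.missing)  # ['banner.png'] -- images needing descriptive alt text
```

Running something like this over your templates catches the images that are invisible to Google (and to users with screen readers) before an algorithm update does.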
Which links to AVOID post-Penguin?
In general, Google rewards sites which show an overall tendency toward organic, high-quality SEO and high-quality, user-friendly content. Any site, network or other vehicle for artificially building rank for "thin" sites is out post-Penguin. Here are the main links to stay well away from, at least for now:

- Blog Networks: These networks have been hit hard by Google, who have gone as far as de-indexing altogether any blogs found within them. The popular link-building blog network Buildmyrank.com was all but wiped out in April by the first rollout of Penguin.
- Link Directories: Except for respectable, long-standing directories such as Yahoo and DMOZ, no other link directories are safe post-Penguin. And you will find it difficult to be listed in DMOZ unless your site is top quality.
- Article Marketing: Thanks to the very poor quality of many articles on the web and their use solely to rank spammy, poor-quality websites, article directories have taken a huge hit in Penguin 2.0. Many of the article repositories are taking measures to clean up their submission processes, so this formerly fantastic way of getting links is, hopefully only temporarily, a method to avoid.
Oliver Ortiz works for Expert Market, a division of MVF Global. Expert Market is a B2B UK based provider of a wide variety of business related equipment. Our services include helping your business acquire trackers for vans. Follow us on Twitter and connect with us on Facebook.
Source:
1. Google Penguin – 2.0 or 1.2 | State of Search
http://www.stateofsearch.com/new-google-penguin/
2. Google Panda | Wikipedia
http://en.wikipedia.org/wiki/Google_Panda
SEO’s place in Internet marketing is firmly entrenched. No Internet marketing campaign begins without a discussion of the SEO strategy. A few years back, SEO work comprised just a small percentage of an Internet marketer’s job. Today, the situation is much different. This isn’t surprising considering that there are more than 3 billion online searches every day[1], almost double the figure for 2011. People use search engines to find out more about the products and services they need, and research shows that they more or less view only the first page of a search engine’s results. This has led to a high ranking being the holy grail of Internet marketing.

Research has shown that most users review only the top 3-5 results returned by a search term. Search engine technology has advanced to the point that, in most cases, the top 3 results offer exactly what the user is looking for. This means that good results that show up on the 2nd page of results are usually ignored and never get much web traffic. Research has also shown that people are now used to filtering out the paid search results[2] because of the distinctive colored box they appear in. Thus it is the challenge of every Internet marketer to reach the top of the rankings organically, preferably by using white hat optimization techniques.
In the early days of the widespread use of the Internet, it was easy for businesses to perform SEO for their websites on their own. Now SEO is a field all unto itself. The changes that happen in this industry are rapid and are driven by the search providers' advances in combating spam in results. The work required is phenomenal and in most cases calls for a team of people with specialized skill sets to achieve the targets set.

Further, these days SEO efforts are heavily affected by social signals. Social signals include retweets, Facebook likes and shares, reposts and other social networking site actions. Sending out the right image online is often a full-time task in itself. Businesses cannot afford to ignore this aspect, as both positive and negative attention impact SEO efforts significantly. The prevalence of buttons on every site that let you 'Like', 'Share', 'Tweet', 'StumbleUpon' or '+1' the content lets search engines know what an actual human thinks about the content, rather than relying only on the opinion of a bot or crawler.
A few years back, search engine rankings did not take location into consideration. The first result for a keyword search would remain the same irrespective of the location the search originated from. Today, all search providers have in effect walled off search results to pertain to a particular country by introducing country-specific domains. Further, users who have location tracking services enabled get results that are localized and tailored to their needs. In this scenario, SEO efforts become a balancing act. Internet marketers need to optimize websites after getting a clear understanding of the business's local and international reach. This will have an impact on the relevant traffic the website receives. There are definitely more things on an Internet marketer's plate to take care of today, but SEO still constitutes the lion's share of their efforts.
Article written by Andrea Walters, a freelance writer for www.globalx.net – The Nation’s Lowest All-Digital Price Provider. Click here for more info about Globalx.
References:
1. The State of SEO and Internet Marketing in 2012 – Aug 20, 2012
http://www.slideshare.net/HubSpot/the-state-of-seo-and-internet-marketing-in-2012
2. Banner blindness – Wikipedia
http://en.wikipedia.org/wiki/Banner_blindness
A common misconception amongst businesses is the belief that anyone can perform SEO (search engine optimisation). Of course, injecting popular keywords into your content and linking your pages to appropriate pages will help to raise your rankings; however, there is more to SEO than managing the visibility of your content.
There are other elements to take into consideration, such as competition (the more websites competing for one keyword, the harder it will be for you to rank for it) and reputation (top SEO services post relevant articles across directories and appropriate sites to increase your links and, more importantly, build up your reputation).
For this reason, we strongly recommend employing the services of an SEO team.
They can help you ascertain the best words to target on your website, manage your content and, more importantly, help to increase traffic and conversions to your website.
But where do you start?
You want your website to do well, so it only makes sense that you want to employ the best SEO team. The easiest way we have found to narrow down the list is to apply the following tactics.
- Where do they rank? It makes sense that, as an SEO company, they should rank well.
Yes, you could argue that there is a lot of competition in this field, meaning a team on page 1 of Google may only be slightly better than one on page 4, and yes, a quality SEO team that is part of a digital design agency may not rank as highly as others because their site is targeting other design-related phrases. However, the higher they rank for phrases such as 'SEO services' or 'digital design agency', the more confident you can be that they can offer your website similar results.
So do a search, see how well they rank on search engines and then compare their fees, so you can pick the best company for you.
- Have they got certificates? A good thing to look for when comparing SEO companies is whether they hold an 'SEO certification'. These certifications are designed to prove the ability of an SEO team, as their skills have been evaluated.
So when you are looking at different SEO teams, check whether they have been granted these certifications. Find one on their website and you can feel confident that they meet the standard.
Now, these are just two of the many tricks you can use to help you find the best SEO companies in the field. And we think you'll agree that they will certainly make a difference when it comes to narrowing down your list.

However, if you are struggling we suggest doing the following:
- What is their portfolio of work? What other websites are they working on? How are these websites ranking? Ask for data/references and check to see if they have managed a website in your industry niche before.
- Ask for references. If possible try speaking to previous clients to see what they are like to work with; the results they have produced and how their websites are doing now.
- What services do they offer? An SEO team that forms part of a digital design agency may not rank as highly in Google as independent SEO companies (as they will be targeting a larger range of keywords); however, they may be able to offer you a range of other services, i.e. web design, email marketing, content etc., that can prove beneficial to your business in other ways.
- What tactics do they plan to use on your website? Alongside their quote, get them to put together a proposal containing their research, statistics and predicted projections. Essentially get them to outline exactly what they believe they can do for your business.
Finding a quality SEO team comes down to research, so try implementing the tips above and make your search even easier.
This article was written by the web design agency, Soula Ltd