All SEO Changes in 2012

The year 2012 was quite tumultuous when it comes to SEO. Even though Google has been picking up the pace ever since it launched its services, the number of algorithm updates and changes made to the search process itself has been quite astonishing. For the most part, the changes were for the best, but a task as huge as the one Google has set for itself is not easily accomplished, especially in an ever-changing environment like the internet, so naturally, not everything went as smoothly as we might have hoped. Anyway, we’ll provide a short overview of the most important changes and leave you to judge whether we’re better off for them, both as SEOs and as internet users.

Changes in search

The first major change to the way search operates in 2012 came in January, when Google introduced Search Plus[1], a feature that used information from your Google+ profile to recommend certain sites to you when you typed in a query. The reaction of most people was quite negative, as it seemed that Google was using this to further promote its own social network and bury the competition, which was perceived as an abuse of power and a manipulation of search for its own convenience. An additional grievance people had with this change was that it seemed to be active even when they weren’t signed in and used search in ‘incognito’ mode, which further blurred the distinction between being online and being in ‘Google’s domain’.

The next major change came in February, with the Venice update[2]. Reception of this innovation was mixed. Since the point of the update was to help promote local businesses in search, some people profited from it while others were not as lucky. Basically, Google began including your location as a ranking factor, so if you typed a keyword that wasn’t location-specific, for instance ‘groceries’, Google would offer more results for grocery stores in your area than it did before. The idea was that people might type in just the keyword without specifying the location, and that this way they would be presented with more relevant results. Naturally, including more local results pushed some of the formerly high-ranking sites further down the SERP, which, as you might imagine, wasn’t met with much joy by the webmasters of those sites. However, it did help level the field between small local businesses and large corporations to some extent.

In May, Google added the Knowledge Graph[3], a feature meant to help users find a relevant answer to their query. Apart from the usual results, people were able to see a list of alternative interpretations of their query in the right-hand sidebar. Like some of the other items on this list, this one didn’t have a great impact on SEO, but it still merits a mention. A couple of months later, in August, SERPs started displaying only seven results[4] instead of the usual ten for some queries. This made it more difficult for sites to get onto the coveted ‘first page’, but otherwise it had few repercussions for SEOs. At one point the left sidebar with search options was removed, and there was a perceptible increase in the number of ads, along with the exclusion of non-paid product suggestions.

Finally, in September, Google tried to address an issue that many had been complaining about for quite a while, namely a single domain dominating the SERP. This was not the first time Google had tried to rein this in, but it might have been its most successful attempt; some would probably rather call it the ‘least unsuccessful’, as the results were not exactly stunning. Still, a change for the better was noticeable, which is more than one could honestly say about the previous fixes.

Reaching out to webmasters and SEOs

Google intensified its efforts to inform the interested parties about what it finds acceptable and commendable, and what it objects to. Apart from releasing a video of one of its search quality meetings[5], which provided webmasters and SEOs alike with valuable insights into the way Google operates, it has also been quite vocal with its disapproval. June and July were marked by a torrent of emails sent to webmasters warning them that suspicious-looking links were pointing[6] to their sites. Some panicked, others ignored the email, while Google towered confusingly over them, telling them to neither panic nor ignore the emails. That is about when people decided to either ignore their panic or panic about their ignoring the emails, which left everyone even more confused than before and, in turn, led most people to ignore the whole event and pretend that nothing ever happened. Google seems to be OK with that.

The warnings issued in July and October about infographic links and guest blogging links[7], respectively, were only slightly more helpful. Apparently, because of the abuse of these two link building strategies, Google intended to scrutinize such links more carefully in order to determine which were spam and which justified their existence by actually providing good content. Just as with the email warnings, a lot was left unclear, for instance how in the world Google would assess them and what criteria it would use, even if it worked on a case-by-case basis. In the end, this seems to have boiled down to the basic ‘produce good content’ suggestion.

Some tangible help from Google came in the form of automatic URL canonicalization. This doesn’t mean that you are excused from thinking about duplicate content and taking action in order to prevent it from harming you, but Google will be carrying a part of the weight.
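For the sake of concreteness, the manual side of that work usually means declaring your preferred URL yourself, most commonly with a rel="canonical" link element in the page’s head (the URLs below are placeholders, not taken from any real site):

    <!-- On the duplicate page, e.g. http://www.example.com/product?ref=affiliate -->
    <link rel="canonical" href="http://www.example.com/product" />

With the tag in place, ranking signals from the duplicate URL are consolidated onto the canonical one. Google’s automatic canonicalization now tries to make a sensible guess even when the tag is missing, but declaring it explicitly is still the safer bet.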

Additionally, Google addressed concerns about negative SEO by offering the disavow links tool[8]. This tool allows you to ask Google to ignore certain links pointing to your website, and according to Google, you should only use it once you have tried to get the links removed yourself by contacting the administrators of the sites that host them. This made negative SEO less of a threat, and there were already many who didn’t consider it a serious threat in the first place. Now, some will claim that this only helps patch a problem that Google itself created with the Penguin update, but the fact that some people decided to use bad links to damage other sites is not exactly Google’s fault, although something like this might have been expected.
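For those who haven’t used it yet, the disavow file itself is nothing more than a plain text list uploaded through Webmaster Tools: one URL or domain per line, with lines starting with ‘#’ treated as comments. The domains below are invented purely for illustration:

    # Contacted the site owner twice, no response
    domain:spammy-link-network.example
    http://another-dubious-site.example/paid-links.html

A ‘domain:’ entry tells Google to ignore every link from that site, while a bare URL only covers links from that single page, so the broader form is what most people reach for when a whole network looks suspicious.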

Search quality

Naturally, improving search quality has remained one of Google’s prime objectives. The Panda algorithm, which is intended to fight low-quality content farms and scraper sites, saw 14 updates in 2012. In January, Google released an update meant to ‘modify’ the rankings of websites with too many ads above the fold. The update affected less than 1% of queries, so even if it was a step in the right direction, it didn’t make a huge difference.

March and April were marked by talk about webspam and over-optimization[9]. Some of the issues mentioned found their way into the Penguin update[10], some were already addressed by Panda[11] to some extent, and others were left out. Basically, Google advised against using black-hat techniques, such as publishing spun content, stuffing your pages with too many keywords, and the like. Naturally, it didn’t give much information on what exactly it meant by ‘too many’, but a great number of webmasters were to find out soon enough.

April 24th was when the Penguin update went live, and a number of people will associate that date with the image of a graph showing a steep decline in Google Analytics; a plummet, perhaps, would be more accurate. Penguin mostly took off-site factors into consideration, namely suspicious links leading to your site, low link diversity, too many exact-match anchors, and a number of other signals. Naturally, most people felt that they didn’t deserve to be hit by this update, and some of them actually didn’t. Those who wanted to complain were later given the opportunity to submit a reconsideration request[12], but this didn’t help too many people. Finally, as already mentioned, Google provided us with the disavow tool, which webmasters have used with varying levels of success (slim to not very much at all).

In August, Google released what came to be known as the Pirate update[13]. It was aimed at websites distributing copyrighted content, but in practice it affected any site that had received enough DMCA (Digital Millennium Copyright Act)[14] takedown requests. Submitting a request is a lengthy process, but you don’t have to offer proof in order to file one, which leaves the update open to abuse. Until the Pirate update, such requests were page-specific, but with the update they came to jeopardize the rankings of entire websites. However, Google remains confident that it will be able to tell which sites deserve to be penalized and which are victims of malicious intent.

Finally, in September, the Exact Match Domain update[15] was released. As its name implies, it was supposed to prevent websites from ranking well for certain keywords purely because those keywords appeared in the domain name. Naturally, if you had a good site, your rankings should have remained unaltered, or even improved thanks to your competition getting penalized, but if your site’s rankings were based only on the domain match, you probably experienced a drop.

All in all, it was an exciting year. Google has constantly been making us adapt and grow, which is part of the fun 🙂 While the changes to search quality might not seem huge, an improvement was noticed by many, even as others were fretting about how to recover from this or that update, and better search and a better internet should be the final goal for all of us.

References:
[1] http://www.google.com/insidesearch/features/plus/index.html
[2] http://www.seomoz.org/blog/understand-and-rock-the-google-venice-update
[3] http://www.google.com/insidesearch/features/search/knowledge.html
[4] http://searchengineland.com/7-new-10-google-showing-fewer-results-131006
[5] http://insidesearch.blogspot.com/2012/03/video-search-quality-meeting-uncut.html
[6] http://googlewebmastercentral.blogspot.com/2012/07/new-notifications-about-inbound-links.html
[7] http://www.seroundtable.com/google-spam-blogging-15944.html
[8] http://googlewebmastercentral.blogspot.com/2012/10/a-new-tool-to-disavow-links.html
[9] http://googlewebmastercentral.blogspot.com/2012/04/another-step-to-reward-high-quality.html
[10] http://en.wikipedia.org/wiki/Google_Penguin
[11] http://en.wikipedia.org/wiki/Google_Panda
[12] http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35843
[13] http://searchengineland.com/library/google/google-pirate-update
[14] http://en.wikipedia.org/wiki/Digital_Millennium_Copyright_Act
[15] http://searchenginewatch.com/article/2214115/Google-Warns-of-Upcoming-Exact-Match-Domain-Algorithm-Change
