
January 2011 was still a blissful time for farmers. Despite the inferior seed they were planting, and the fact that they had often simply been “appropriating” crops from their neighbors, the harvests were plentiful. But that January was marked by an ominous rumbling from the nearby bamboo patch. Something huge was approaching, something that threatened to put an end to their farming practices and force them to find other disciplines to exploit. The black and white fury descended upon them with the wrath of betrayed user expectations and the vengefulness of a deceived community. Ever since that day, people have paid close attention to the beast’s mood and its constantly changing habits, lest they too end up mauled by the Panda.

Google Panda

In its constant effort to bring only the most relevant and useful content to its users, Google announced an improvement to its search algorithm in January 2011. At the time, a number of sites that offered less than interesting, or in some cases less than original, content were still able to achieve quite decent, or even excellent, rankings on SERPs. This tendency of such sites to clutter up the higher positions of the results page and attract visitors only to let their expectations down came to be called “webspam”, and Google would suffer it no longer.

Sites that only offered superficial and unengaging content that seemed to be mass produced and lacking in any real substance became known as “content farms”. The content published on such websites served only as a carrier for keywords and was devoid of any real value for the user who might stumble upon it. So-called “scrapers” were another prominent type of webspam. Basically, scrapers are websites that have little to no original content of their own; instead they republish content from other websites. While some sites are legitimately built to gather information from various sources, through RSS feeds for instance, others simply steal other people’s content and pass it off as their own. It was the latter type that Google wanted to find a way to penalize.

Panda is in the game

Panda was a perfect tool for the job. Well, not quite perfect: the algorithm is, naturally, still updated quite frequently. It has been through a lot of changes since its inception, but its purpose remains the same: weeding out websites that are ranked higher than they objectively deserve. As a mostly automated process it can’t really be expected to be infallible, but constant updates and improvements are at least keeping some people on their toes, and ensuring that they will think twice before stealing someone else’s content or posting a nonsensical wall of text instead of a well thought out and written article. Another advantage of the new algorithm was that it took the entire website into consideration. Before Panda, the pages of a website were assessed individually, allowing a terribly constructed website with just one well-made page to be displayed on the first page of search results. Today, sites are judged as a whole: if you have a bunch of duplicated content, an unfavorable content-to-ads ratio or anything else the algorithm deems undesirable, that fact will be reflected in your rankings.

Panda Traffic Drop

As I have already mentioned, the first installment of the algorithm was announced in January 2011, and it was activated on February 24th of the same year. Unofficially targeted at scraper websites and content farms, it made quite an impact, affecting 11.8% of search results. This first version was limited to US websites, but just a month later, in April, Panda got its next update, which applied to all English-language queries worldwide. The new update also allowed users to block certain sites, and by doing so send a clear signal to Google’s crawlers. Panda 2.0 affected 2% of queries.

The next three updates, released in May, June and July, didn’t have a pronounced effect on queries, but the one that followed in August, known as Panda 2.4, did: with this update Panda expanded to non-English languages and affected 6-9% of queries. Another update was released in September, but Google was not as forthcoming with information about it as it had been with some of the others. In October, “Panda flux” was announced, preparing site owners for fluctuations in their rankings. It turned out that the fluctuations were connected with the new update, Panda 3.0, which affected somewhere around 2% of queries. There have been a number of smaller and larger updates since then, the latest being the 20th update of the algorithm. Most of the updates that preceded it influenced less than 1% of queries and rarely resulted in significant ranking changes, but that last update, released in September 2012, affected 2.4% of English queries.

[one_half]

  • Panda 1.0 – February 24th
  • Panda 2.0 – April 11th
  • Panda 2.1 – May 9th
  • Panda 2.2 – June 18th
  • Panda 2.3 – July 22nd
  • Panda 2.4 – August 12th
  • Panda 2.5 – September 28th
  • Panda 2.5.1 – October 9th
  • Panda 2.5.2 – October 13th
  • Panda 2.5.3 – October 19/20th
  • Panda 3.1 – November 18th
  • Panda 3.2 – January 15th

[/one_half] [one_half_last]

  • Panda 3.3 – February 26th
  • Panda 3.4 – March 23rd
  • Panda 3.5 – April 19th
  • Panda 3.6 – April 27th
  • Panda 3.7 – June 8th
  • Panda 3.8 – June 25th
  • Panda 3.9 – July 24th
  • Panda 3.9.1 – August 20th
  • Panda 3.9.2 – September 18th
  • Panda #20 – September 27th
  • Panda #21 – November 6th

[/one_half_last]

Panda Update timeline

Different kinds of websites were penalized during that period. A number of shopping comparison and ecommerce sites lost their rankings when the algorithm found their content lacking. However, some websites were not affected by the updates, even though their content was not exactly perfect. If you are reading this, there is probably no need to explain what damage a decrease in rankings can do to you; that is why site owners and SEOs have devoted a lot of time and effort to tracking the demands Panda makes of a website. If you know what the algorithm is targeting, you can either avoid getting penalized, or try to recover if the penalty has already been incurred.

Panda recovery

First of all, you need to determine whether the decline in your rankings is caused by Panda. One way to do this is to check whether the decline coincided with an algorithm update. Naturally, even if it does, it is not necessarily Panda’s fault, but chances are that it is. Don’t forget to check whether visits from Google search have decreased, since those are the visits an update would affect. Now, if you do think that you have fallen prey to the beast, there are still some things you might be able to do to get out of its claws. Naturally, all of this also applies to sites that haven’t been affected by the algorithm and would very much like to stay in its good graces.
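As a rough illustration of that first check, here is a small Python sketch. The function name, thresholds and data shape are my own invention, not an official tool: it flags Panda update dates that fall within a few days of a sharp day-over-day drop in your organic visits.

```python
from datetime import date

# A few well-documented Panda rollout dates, for illustration.
PANDA_UPDATES = [
    date(2011, 2, 24),   # Panda 1.0
    date(2011, 4, 11),   # Panda 2.0
    date(2012, 9, 27),   # Panda #20
]

def drops_near_updates(daily_visits, updates=PANDA_UPDATES,
                       drop_ratio=0.7, window_days=3):
    """Return update dates that coincide with a sharp traffic drop.

    daily_visits maps date -> visits from Google organic search; a
    "drop" is any day whose visits fall below drop_ratio of the
    previous day's visits.
    """
    suspects = []
    days = sorted(daily_visits)
    for prev, cur in zip(days, days[1:]):
        if daily_visits[cur] < drop_ratio * daily_visits[prev]:
            for upd in updates:
                if abs((cur - upd).days) <= window_days:
                    suspects.append(upd)
    return suspects
```

A match only tells you the timing is suspicious; as the text says, correlation with an update date is a hint, not proof.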

If you had to summarize the ways to redeem your site in one phrase, that phrase would probably be “improve user experience”. After all, the goal of Panda and similar algorithms is to help users find what they are looking for. This, however, includes a number of different things. First of all, you’ll need to check your page loading time. If your pages load slowly, or fail to display certain pieces of content, you will not become too popular with users, and the algorithm will take note. Slow loading can have a number of causes, and whatever they are, you have to make sure to eliminate them.

Be sure to check your bounce rate: are people leaving your site immediately after seeing the homepage? If they are, there might be a number of reasons for it. It might have something to do with aesthetics, but it might also be caused by bad page organization, such as confusing navigation or too many ads. You need to make users want to stay on your site and click through to the other pages after seeing the homepage. Speaking of ads, they can relate to both of the above issues. First of all, if you run ads from too many networks, that can have a serious negative impact on your page loading times. Additionally, even if all the ads come from the same network, you should be careful about how many of them you display on any page, not just the homepage. Content should dominate your pages, not ads.

Content & the Panda Update

Which brings us to the most important factor when it comes to your rankings and Panda: content. Sure, you need to limit the number of outgoing links on your pages to a reasonable number, pay attention to your internal linking structure, and make sure that your code is all in order, but it is content that makes or breaks most websites. It is content that users are looking for, and it is content that will keep them on your site. It is also content that will make them, or another website, want to recommend you to someone else.

There are several things to look out for. First of all, and this should go without mentioning, you need to make sure that your content is grammatically correct and properly spelled. It should also be informative, and actually relevant to the keywords that are likely to bring people to that page. When I say “informative” I don’t mean that it should just state the very basics of the topic, as is often the case; instead, try to really involve potential readers, offer unique insights or little-known facts, and make it an entertaining or truly useful read. You’ll find that if you offer something people can actually use, it will promote itself. There should also be enough of it: you might write a 200-word article on a topic and decide that it is enough, but consider making it longer, especially if the page has something that distracts people from the content, such as ads.

One of the things that you must watch out for is duplicated content. If Panda determines that you have too much of it on your website, you’re very likely to get penalized. This is not an absolute rule; some websites dodged the algorithm despite having a lot of duplicated content, but those were usually quite strong brands, and that helped them pull through. It is important to note that “duplicated content” doesn’t only mean content copied from another site, but also separate iterations of the same content across your own pages. This often happens with dynamic pages, so pay special attention to them. If, for some reason, you do need multiple pages with the same content, use canonical URLs, or noindex and nofollow directives, to avoid being penalized.
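To get a feel for how internal duplication can be audited, here is a minimal Python sketch. It is an illustrative helper of my own, not anything Google provides: it groups page URLs whose normalized body text is identical, so trivially reformatted copies still collide.

```python
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Group page URLs that share essentially the same body text.

    pages maps URL -> extracted page text. Text is normalized
    (lowercased, whitespace collapsed) before hashing, then URLs
    with identical digests are reported together.
    """
    buckets = defaultdict(list)
    for url, text in pages.items():
        normalized = " ".join(text.lower().split())
        digest = hashlib.sha1(normalized.encode("utf-8")).hexdigest()
        buckets[digest].append(url)
    return [sorted(urls) for urls in buckets.values() if len(urls) > 1]
```

Every group this returns is a candidate for a canonical URL or a noindex directive.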

Our favorite evildoers: Matt Cutts and the panda 🙂

If you find that some of your pages have suffered a significantly larger drop in rankings than others, analyze them and try to determine the cause of the discrepancy. Finding it might help you improve your other pages as well. Based on your findings, you might even decide to improve your worst page(s) or remove them completely. As long as it doesn’t disturb your site’s functionality too much, it is better to redirect or 404 a page than to let it drag your whole site down.
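That triage can be sketched in a few lines of Python. The function below is a hypothetical helper that ranks pages by relative traffic loss between two periods, so you know which pages to analyze first:

```python
def worst_performers(before, after, top_n=3):
    """Rank pages by relative organic-traffic loss between two periods.

    before/after map URL -> visits in the period before and after a
    suspected update. Returns (url, loss_fraction) pairs, worst first;
    a page missing from `after` counts as a total loss.
    """
    losses = []
    for url, old in before.items():
        if old == 0:
            continue  # no baseline traffic, nothing to compare
        new = after.get(url, 0)
        losses.append((url, (old - new) / old))
    losses.sort(key=lambda pair: pair[1], reverse=True)
    return losses[:top_n]
```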

Also, be sure to check your incoming links. You might be getting links from less than reputable sources, and enough links of that type can hurt you quite badly even if you weren’t responsible for them. Try to arrange for those links to be removed when it is in your power, or, if all else fails, consider removing the pages such sites are linking to; it sounds extreme, but it might help you in the long run. Likewise, try to get inbound links from reputable sources, keeping in mind that quality is much more important than quantity.

Once you have made sure that your site is working properly and offering visitors useful and engaging content that is easily accessible and well organized, all you need to do is wait to see if you recover. This might take some time, but if you have covered all of the important issues, you should eventually see improvement. If all else fails, you might try writing a report to Google explaining your situation, but be sure that you provide them with all of the necessary information and that your claims are actually justified. In the end, it all boils down to a quite simple truth: if you want a site that ranks well, you need a well-made site. There, I hope it’s all clear now.

The presentation was created by Rand from SEOmoz and Dharmesh from HubSpot, and these guys kicked some educational asses.

There is some really great information for the SEO industry in this slideshow, like the fact that we reached 3 billion searches per day in 2012, double the 2011 figure.
Also, 75% of clicks go to organic results, not paid ads, and more than 50% of search queries show no paid ads at all (great info for PPC guys).

Check out the rest of the slideshow and soak up the knowledge.

Check out all 39 slides and learn what was and what will be in the world of Internet marketing.

So many website owners believe Search Engine Optimization is magical; at least, that is how they act when the topic of SEO comes up. I almost always hear about black hat Search Engine Optimization techniques when people talk about how they get traffic to their website. When I tell someone that their advertising strategy is black hat, they look at me like I am speaking another language.

What is a black hat?

I always explain that the term black hat originates from old Westerns, where the good guys wear white hats and the bad guys wear black hats. When the term is used in reference to Search Engine Optimization, it means you are doing something very stupid: while it might be helping your site right now, it will destroy your site in the long run.

Two very popular black hat SEO techniques that we will be covering, and whose dangers we will explain, are keyword stuffing and HTML code abuse.

What is Keyword stuffing?

Keyword stuffing can be done in several ways, all of them very bad, and all of them likely to get your website removed from search engines, making your site effectively invisible. One popular way to stuff keywords is to write the same word, or variations of it, over and over on a page to boost your search traffic. You may see a spike in visits, but how long do those people stay on your site? Most likely less than three seconds, just enough time to hit the back button. Another popular technique is to do the same thing, but make the words invisible so people won’t see them. People may not see them, but search engines will! If you view the source of the page, you will see the so-called “hidden” text. These techniques are surefire ways to get banned from search engines like Google.
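As a toy illustration of how such hidden text shows up in the source, here is a Python sketch using the standard-library HTML parser. It only catches inline `display:none` / `visibility:hidden` styles; real pages hide text in many other ways (external CSS, matching colors, off-screen positioning), so treat this as a starting point, not a detector search engines actually use.

```python
from html.parser import HTMLParser

class HiddenTextFinder(HTMLParser):
    """Collect text inside elements hidden via inline styles."""

    def __init__(self):
        super().__init__()
        self.depth = 0          # nesting level inside hidden elements
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        style = (dict(attrs).get("style") or "").replace(" ", "").lower()
        # Once inside a hidden element, every child deepens the nesting.
        if self.depth or "display:none" in style or "visibility:hidden" in style:
            self.depth += 1

    def handle_endtag(self, tag):
        if self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth and data.strip():
            self.hidden_text.append(data.strip())

def find_hidden_text(html):
    finder = HiddenTextFinder()
    finder.feed(html)
    return finder.hidden_text
```

Note the nesting counter assumes well-paired tags; void tags like `<br>` would throw it off, which is fine for a quick audit.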

What is HTML code?

HTML is the most popular language for writing websites. It is very easy to learn, but also very easy to abuse. People desperate for traffic will hear about meta description tags (already outdated as an on-site ranking factor) and that search engines use them to help rank a website. Some will fill the description tag to the brim with words that are irrelevant to their site. People will not want to visit a site whose description makes no sense, and search engines will remove you for this, or at least lower your ranking. Another form of abuse is stuffing the description full of words that are relevant to your site but make no sense put together. Sure, you may have a site about electronics and technology, but having a description that is just a pile of electronics- and technology-related words will get you banned. A well-written description uses terms that are frequently searched for and that reflect what your site is about.

Black Hat SEO
Image by Searchenginepeople.com

Another abused tag is H1.
H1 stands for Heading 1, the biggest font HTML generates on its own. This tag is meant for the title of a blog or page, but some people decide to use H1 for their entire site. Yup, this will really hurt your site’s ranking when the search engines find out, and trust me, they will find out. The last two abused tags are the strong tag and the b tag. They both do exactly the same thing, making text bold, but too much bold text can be bad for a site. Bold text is used to say something is important. I know your entire site is important, but you have to decide which words or sentences should be bold. This matters, because if bold is used properly it can help your ranking with the search engines, but if used improperly, the search engines may treat it as spam.
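A quick audit for this kind of tag abuse can be sketched with a simple regex in Python. This is an illustrative check of my own, not how search engines evaluate pages, and a regex is only adequate for a rough count:

```python
import re

def count_tag(html, tag):
    """Count occurrences of an opening tag, case-insensitively."""
    return len(re.findall(rf"<{tag}[\s>]", html, flags=re.IGNORECASE))

def audit_headings(html):
    """Flag a page that uses more than one H1, a common abuse pattern."""
    h1s = count_tag(html, "h1")
    return {"h1_count": h1s, "suspicious": h1s > 1}
```

The same `count_tag` helper works for `strong` and `b` if you want to eyeball how much of a page is bolded.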

In conclusion, if you need your site to show up well on a search engine, do a lot of research on SEO and watch videos by Matt Cutts, Google’s lead anti-spam engineer. If you can’t learn SEO, hire a firm that does it for a living. Be careful when hiring a firm, because a lot of firms out there still practice black hat SEO, so at the very least tell them not to do anything listed here as black hat.

This article has been written by Ryan Satterfield from Planetzuda.com, a game reviews, tech news and reviews website.

There is a big debate as to whether good website design and layout can really help search engine optimisation, with many branding it as a trick to convince businesses to get a new website. However, I am going to present the argument that a well thought-out, error-free and well coded website is easier to rank than a site that is outdated, slow and poorly structured.

My first piece of evidence comes from Google AdWords. Quality Score in AdWords is worked out partly through click-through rate and keyword relevance, but also through landing page relevance. This shows that Google does look at landing pages to determine the quality of a listing in its search network. So why would Google not take a look at landing pages in its natural search listings, where competition to appear is far higher?

Next, the Google Webmaster Central blog continually hands out advice on optimising your site for search – including canonical URLs, duplicate content and redirects. Why would Google give these tips out if it wasn’t looking for sites to be updated with the information?

Web design
As competition gets higher and higher to appear on the coveted first page of Google for selected search terms, Google must always be looking to improve the quality of the sites shown. Once all the sites on page one have their on-page SEO sorted, and a similar number of links built with appropriate anchor text, Google will need something else to determine which site should be shown where. Quality websites will almost certainly have a lower bounce rate and more page views per visit. With all that access to Google Analytics data, is it too far-fetched to think that Google would factor these elements into its rankings?

At the end of the day, it is in your interest to have a well-designed site. It will attract more visitors, and most likely convert more of them into sales or enquiries. As more and more people visit your site, they are more likely to share it on social media if they have enjoyed using it. People might write reviews, blog posts or simply social-bookmark your site. All of these actions lead to a ‘natural link portfolio’ that hasn’t been built up through spam methods, and is therefore likely to help your site rank higher and stay higher.

If you’ve had any experience in ranking high quality sites over poorly designed ones, feel free to let us know by commenting below.  
This article has been written on behalf of Bigfork Ltd, a Norwich web design agency.

Image by primolution.com

Seth Godin famously said: “Attention is a bit like real estate in that they’re not making any more of it. Unlike real estate, though, it keeps going up in value.”
The key to building loyalty and a brand around your website or blog is to create attention-grabbing content that can mesmerize readers and compel them to come back for more. By creating such awesome content, you are trying to grab the visitor’s attention, which is so important in the contemporary online marketing world. For effective SEO promotion, your website needs regular visitors and shoppers: loyal customers.

Below are some key pointers that can help any content creator develop attention-grabbing posts:

Does the content solve a particular problem?

Any content that solves a particular problem for your visitors is good. In fact, as a webmaster and blogger, you should try to understand the common problems your visitors face, and produce content that solves them. That type of content will definitely grab the attention of your target audience and help your SEO promotion activities.

Does the content answer a question?

In any niche, there are always questions posted by searchers. You can find such common questions on well-known Q&A portals such as Yahoo Answers, Quora and others. You can formulate your content around those questions, making it an attention grabber.

Does your content present a unique angle?

The primary objective of creating unique content is not just unique words, but presenting a viewpoint in a different light. The content produced for your website should present a unique angle on an existing theory, grabbing attention all around. Such content can also become a hit through social sharing and bookmarking.

Targeted content for a targeted audience:

Before initiating any SEO-related operation, webmasters should always be clear about their target audience. Unless your content is targeted and directly relevant to your niche audience, it cannot grab attention. Creating targeted content for a targeted audience is the key.

Does your content initiate discussions?

Initiating discussion and debate is the most effective way to create buzz and make news. Your content should prompt readers to start debating and discussing; content that can do that is a true attention grabber.

Author:
Go4Promotion, India’s top-rated SEO promotion company, specializes in creating attention-grabbing content for effective SEO strategies. Get in touch with our SEO experts and give your online marketing campaign a boost. Log on to Go4Promotion.com now or just buy YouTube views and become a rock star.

You don’t have to rely on statistics to see that the online business industry is booming. Millions of domains are registered every day. Today, an ordinary person can build multiple websites on their own. There are definitely large profits to be made from owning websites, but it all depends on the traffic. The big question is, do all websites get decent traffic? No! Some are actually struggling to gain visitors (and profit).

The first thing to consider is the source of the traffic. Organic searches can drive huge amounts of traffic, depending on how you optimize your webpage. The simplest and easiest approach is to focus on keyword-rich content. And when we say keyword-rich, we mean that the article has a central idea and quality, well-written content. Following that basic rule of quality content helps you create SEO-friendly content.

Good SEO-friendly content is what brings visitors to your website, and each of those one-time visitors might become a long-term reader or user of your services, bringing you increased revenue. It’s important to remember that these visitors didn’t just stumble upon your site accidentally; they searched for keywords related to your business.

How is search engine optimization related to your keywords?

Keywords
The main ingredient in generating targeted web traffic through SEO is usually the keywords. They are optimized in the site title, meta tags, categories and description, but most of all in the content. When individuals search online, they rely on keywords. Some keywords are certainly used more often than others. If you know which keywords are used more often, you can incorporate them into your web content without sacrificing quality. Remember that your goal is not to link bait but to provide helpful information to your readers. While link baiting can work, it has drawbacks for your site’s performance and authority.

Once you identify the keywords to work on, start publishing articles. Distribute your keywords and keyword phrases evenly and reasonably through your content. Ensure your keyword density doesn’t go beyond 3-5% of the total number of words. If this is done properly, search engine spiders or bots will crawl your webpage and index the new SEO-friendly content for a better website ranking.
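The 3-5% guideline is easy to check programmatically. Here is a small Python sketch using one common convention (density formulas vary, so treat the exact numbers as illustrative) that computes a keyword's share of the total word count:

```python
import re

def keyword_density(text, keyword):
    """Return the keyword's share of total words, as a percentage.

    Counts whole-word, case-insensitive matches of a single keyword
    and divides by the total number of words in the text.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = len(re.findall(rf"\b{re.escape(keyword.lower())}\b", text.lower()))
    return 100.0 * hits / len(words)
```

Anything well above the guideline is a hint to rewrite before the spiders get there.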

The good thing about being indexed right away is that your new content can appear in search snippets. Whenever people browse the web and enter a keyword found in your content, a link to the source and part of that content are shown among the other search results. When online users click through to your site from those results, you gain a unique visitor who might be interested in your site or the products and services you offer. Keywords can therefore really improve the overall visibility of your webpage.

Try to play around with your keywords. Keyword phrases (long-tail keywords) can be derived from keywords. They improve site rankings and thus attract more visitors. The higher the traffic, the higher your sales conversions will be, provided the bounce rate is not too high. After all, with quality content you can count on longer page views.

You can use Google AdWords to acquire keyword-related information and analyze keyword competition. Google is the top search engine, so you might as well work with them. Plus, Google Panda receives regular updates that may hit your site if you are not up to its standards. There are also a lot of online SEO tools that you can take advantage of; they can help you with important keyword research and analysis.

Guest Post by:
Liz D. has been blogging since 2005 and started her personal SEO campaign just recently with positive results. Her Make More Money Online website is currently on the first page of Google search for the keyword “make more money” and she is currently optimizing more sites. Top Keywords for SEO is just one of them and it is doing pretty well.

This is the perfect topic for starting a young and possibly successful SEO blog. I will go over all of the steps of the process you need to pay attention to if you want a successful SEO campaign.

1. Keyword research

This is probably the most important step of the process, since if you mess it up, you’ll have a hard time fixing it later in the game.
Basically, if your website has been active for some time, and if you have created your Google Webmaster Tools and Google Analytics accounts, you’ll have some data to work with.

First, check GWT (Google Webmaster Tools) for all the keywords and phrases for which you show up in search queries, and then cross-reference them with the GA data. Find the crossover between the easiest and the most rewarding keywords, and analyze them by checking the competition, both manually and with the Google Keyword Tool (it lacks precision but it will do the job). Test both options there: your newly found keywords and your website URL (for all the suggestions Google might give). Just make sure to set your geo-location and language correctly. When you’ve picked your targeted keywords, it’s time for the next phase.
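The cross-referencing step can be sketched as follows. The data shapes and the impressions-times-visits score are my own simplification of what you would export from GWT and GA, not an official formula:

```python
def shortlist_keywords(gwt_queries, ga_keywords):
    """Cross-reference two keyword datasets.

    gwt_queries maps keyword -> impressions (from Webmaster Tools);
    ga_keywords maps keyword -> visits (from Analytics).
    Returns keywords present in both sources, scored by
    impressions * visits, highest first.
    """
    shared = set(gwt_queries) & set(ga_keywords)
    scored = [(kw, gwt_queries[kw] * ga_keywords[kw]) for kw in shared]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored
```

The top of this list is your shortlist for the manual competition check.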

2. On-site Optimization

On-Site Optimization
Image by web-marketing-toronto.com

When you have your keywords, the most important thing (as far as I’m concerned at least) is to properly optimize your website for them (and for the best user experience as well).

Simply put, you need to do everything that Google and the other search engines suggest:
make your website faster (a good starting point is checking it in GTmetrix) and make it highly visible for the chosen keywords. Forget meta keywords, and don’t stuff your meta description with keywords; that will just dissuade visitors from clicking on your listing in the SERP. Pay attention to the use of your page title and H tags. Optimize content for your keywords and pick inner pages for specific keywords, but avoid keyword stuffing (you can use a neat tool on WeSEOAnalytic).
Build your internal link structure and let the link juice flow through your website. Use the canonical tag to avoid content duplication, and the “noindex, follow” directive for finer tuning of your content.
Keep checking your website in the SERP for any URLs that might have been mistakenly indexed by Google, using the simple command site:yourURL in the Google search box. You can fix stray URLs with a new addition to Google Webmaster Tools called “Remove URL”, under the “Crawler access” tab.
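A sketch of how you might verify those canonical and robots settings across your own pages, using Python's standard-library HTML parser. This is an illustrative audit helper, not a complete crawler:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Pull the canonical URL and robots meta directive from a page,
    so duplicate pages can be checked for a canonical pointer or
    a noindex directive."""

    def __init__(self):
        super().__init__()
        self.canonical = None
        self.robots = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and (a.get("rel") or "").lower() == "canonical":
            self.canonical = a.get("href")
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.robots = a.get("content")

def audit_head(html):
    audit = HeadAudit()
    audit.feed(html)
    return {"canonical": audit.canonical, "robots": audit.robots}
```

Running this over each page template quickly shows which duplicates are missing both a canonical tag and a robots directive.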

That’s the way you wanna roll. On-site optimization is far too complex to be fully explained in one article, so I will get back to it in some of the next ones. Also, make sure you avoid duplicate page titles and descriptions. We are all creative enough to make something up for every page.

3. Off-site Optimisation

Link building tips and tricks
Image by rylanclayne.com

Off-site optimization or, as many like to call it, LINK BUILDING, is highly important for SEO as a whole. Unfortunately, it’s so much more than plain (and it’s not even that plain) link building.
The word itself (optimization) tells us that you already have something to optimize or work with. Let’s be optimistic and suppose that you already have a website that has been up and running for some time. Let’s also suppose that you’ve spent that time acquiring some links, either organically or by building them yourself. Now you have a new set of phrases, and your old links point to your website with the wrong anchors… you’re getting the point. If it was you who built them, use your previous contacts and ask them to change the anchors to your new targets. Use your old accounts to fix the links you created manually, and use Google Alerts to check for mentions of your website’s name or brand without a link, so you can contact the people mentioning you and ask them to add one.

There are numerous ways of building quality links and this article would become boring quite fast if I started to mention them all, so I’ll list just a few of them:

  • guest blogging
  • sponsoring
  • quality directory entries
  • quality profile links
  • plain old asking for a link
  • etc.

Try to keep the link income steady to avoid drawing any kind of bad attention from Google’s Webspam Team. Build some nofollow links, just to give your backlink portfolio a bit of that natural look. Don’t be insane and buy 10,000 bad links for a bit of cash and then just lie back and enjoy your rankings. They will last about three weeks, and then your website will be doomed to Google Hell.

4. Social Media Promotion

Social Media Promotion
Image by talkofthetownworkshop.com

For quicker indexation or reindexation you can use another addition to Google Webmaster Tools called “Fetch as Googlebot”, but I prefer the old way of social sharing (tweeting, liking, +1-ing) to draw the crawlers’ attention. Search engines also like content that is loved and shared, so they tend to give better rankings to websites with that kind of content. Social media channels are at your disposal; you just need to tame them.

Enough for the first article but stay tuned, there will be some fresh articles from my friends and colleagues and maybe some guest posts soon.