The Panda and Penguin algorithms hit webmasters hard, and ongoing updates continue to downrank bad SEO practices.
Google is adamant that its search engine is designed to create a level playing field, and it does not take kindly to marketers attempting to manipulate search results. And when the search engine giant hits, it hits hard.
So what can you do to ensure you do not get caught up in algorithm snares?
Don’t pay for guest posts
The biggest killer for link builders is paying third party sites to host content that embeds inbound links back to their own site. Some SEO agencies are still trying to get around this by including inorganic links in their content and sneaking them past editors. And sometimes it works.
The thing is, many web hosts will accept guest posts because they recognise that loading their site with content is good for their SEO ranking. The problem is that too many money-grabbing site owners charge marketers to host content that includes outgoing links.
The rule is simple: do not pay for links. Webmasters who do charge for links typically build sites for exactly this purpose, and sooner or later Google's Penguin crawlers will sniff them out. When they do, you will be stripped of your page rank, no questions asked.
Monitor inbound links
Google is ramping up its philosophy of Linking Domain Relevancy (LDR) to ensure that any links passing from one site to another are organic. It tests this by assessing whether there is a genuine crossover of interest between the two sites' niches.
However, some blog owners may link to your site to highlight a specific point to their readers or to back up an argument they are making. Sometimes the referring site is not in the same niche as yours, and to Google this can look inorganic.
It is therefore worth running through the list of sites linking to you to determine whether each incoming link is an advantage or a disadvantage. Links from sites that do not meet the LDR classification should be removed.
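A first pass over that list can be automated. The sketch below is purely illustrative: the domain names, keyword sets, and the 0.2 relevance threshold are all made-up assumptions, not values Google publishes, but it shows one simple way to flag referring sites that share no topical overlap with your niche.

```python
# Hypothetical sketch: flag inbound links whose referring page shares
# little or no topical vocabulary with your own niche. All domains,
# keyword lists and the threshold below are invented for illustration.

MY_NICHE_KEYWORDS = {"seo", "marketing", "content", "search", "links"}

backlinks = [
    {"domain": "example-marketing-blog.com",
     "page_keywords": {"seo", "content", "strategy"}},
    {"domain": "example-recipe-site.com",
     "page_keywords": {"baking", "flour", "oven"}},
]

def relevance_score(page_keywords, niche_keywords):
    """Fraction of the referring page's keywords that overlap your niche."""
    if not page_keywords:
        return 0.0
    return len(page_keywords & niche_keywords) / len(page_keywords)

def flag_irrelevant(backlinks, niche_keywords, threshold=0.2):
    """Return domains whose pages look topically unrelated to your niche."""
    return [b["domain"] for b in backlinks
            if relevance_score(b["page_keywords"], niche_keywords) < threshold]

print(flag_irrelevant(backlinks, MY_NICHE_KEYWORDS))
# → ['example-recipe-site.com']
```

Anything this kind of filter flags still deserves a manual look before you ask for the link's removal: a low keyword overlap is a hint, not proof of an inorganic link.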
Don't beef up social signals
Although Google does not currently use social signals to determine rankings, a recent deal struck with Twitter for tweets to appear in search results suggests it is only a matter of time before it does. A good number of site owners inflate the tweet and share counts on their pages to make readers think they are a popular and trustworthy source.
This malpractice is akin to manipulating search results and is deemed an attempt to fool users in order to gain an advantage. Continue it at your peril, especially if you are inflating Twitter figures: now that the social network is sharing data with Google, the search engine can use it against you.
Don’t duplicate content
Duplicate content was effectively outlawed in 2011, but it is still a trap newcomers fall into. If you run a large site hosting numerous products, make sure you do not feature the same content on two or more pages, even if it is only a placeholder until you write suitable copy.
Even worse is posting content that has been plagiarised from a third party site. If you are paying a content writer to submit 'original' work on your behalf, make sure it is original and not blatantly copied from somewhere else. Use software such as Copyscape to check for duplicate content.
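For a quick in-house check across your own pages, before reaching for a paid tool, a rough similarity comparison can catch accidental copy-paste. This is a minimal sketch using Python's standard-library difflib; the 0.8 threshold is an assumption for illustration, not a figure Google or Copyscape uses.

```python
# Minimal sketch: compare two page texts for near-duplication using the
# standard library. The sample texts and the 0.8 cut-off are assumptions.
from difflib import SequenceMatcher

def similarity(text_a, text_b):
    """Similarity ratio in [0, 1]; 1.0 means the texts are identical."""
    return SequenceMatcher(None, text_a, text_b).ratio()

page_a = "Our widget is durable, lightweight and ships worldwide."
page_b = "Our widget is durable, lightweight and ships worldwide."
page_c = "A hand-crafted gadget made from recycled aluminium."

print(similarity(page_a, page_b))  # identical placeholder copy scores 1.0
print(similarity(page_a, page_c))  # genuinely distinct copy scores much lower
```

Pairs of pages scoring near 1.0 are the ones to rewrite first; a check like this only covers your own site, so externally plagiarised copy still needs a dedicated service to detect.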
It can take a year or more to recover from Google penalties, and many web owners fall foul of Penguin and Panda. Provided you do not engage in any bad practices, whether your own or those of third-party contractors, you will be okay.