Google is the dominant search engine, and a penalty from it can significantly hurt any website’s search engine optimization efforts. A penalty can cause a drop in rankings on search engine results pages, which in turn can cause a drop in revenue. This is why it is so important to understand what Google penalties are, how to diagnose them and how to fix them.
Understanding and Diagnosing Google Penalties
If you have noticed a significant drop in your site’s ranking, it is important to determine if this is the result of a penalty. Google penalties can either be manual or algorithmic.
A manual penalty, or manual action, is an instance where the team at Google manually flags your website for violating Google’s guidelines. The web spam team can either demote your website’s ranking or remove it from the index altogether. To see whether there is a manual action against your website, claim your site in Google Webmaster Tools (now Google Search Console), navigate to Search Traffic and select Manual Actions. A notification on this menu will tell you whether or not there is any web spam action against your site.
An algorithmic penalty is more difficult to diagnose, and you may need help from a professional. Many factors can cause a drop in Google rankings, and a professional can help you determine whether a penalty caused it. Furthermore, a professional who understands Google’s algorithms can help you determine which one caused your website’s drop in rankings.
The most common algorithmic penalties are Google Panda and Google Penguin. However, other Google updates such as Pirate and Mobilegeddon can also affect a website’s rankings.
- Google Panda: Also known as the quality penalty, Panda affects websites with thin or spammy content.
- Google Penguin: Penguin affects websites with low-quality links, paid links or links acquired through link-building software.
- Mobilegeddon: This update demotes websites that are not mobile-friendly.
- The Pirate: This update hits websites that repeatedly violate copyright laws, such as sites hosting content that infringes digital media or software copyrights.
Fixing a Google Penalty
To fix a manual penalty, you first need to identify the problem. Issues that can trigger a manual action against your website include thin content, unnatural links, user-generated spam, cloaking and sneaky redirects, keyword stuffing and hidden text. Your website can also be flagged if the web spam team at Google determines that someone has hacked into it.
Clean up your site by getting rid of all spam, including user-generated spam in the comment sections. Ensure the content is relevant to your site. Check the site’s CSS and source code to make sure there is no keyword stuffing or hidden text. Then go to Webmaster Tools and use the Fetch as Google tool, which lets you check whether Google’s bot and your users see the same content.
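Part of the hidden-text check can be automated. The sketch below is a minimal, illustrative Python heuristic that flags inline styles commonly used to hide text; the patterns, function name and sample markup are our own illustration, and a real audit would also need to inspect external stylesheets and computed styles.

```python
import re

# Illustrative heuristic only: inline-style tricks commonly used to hide text.
HIDDEN_PATTERNS = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",  # pushed far off-screen
]

def find_hidden_text(html):
    """Return the inline-style snippets that match a hiding pattern."""
    hits = []
    for style in re.findall(r'style\s*=\s*"([^"]*)"', html, re.IGNORECASE):
        for pattern in HIDDEN_PATTERNS:
            if re.search(pattern, style, re.IGNORECASE):
                hits.append(style)
                break
    return hits

sample = '<p style="display:none">cheap widgets best widgets</p><p>Real copy.</p>'
print(find_hidden_text(sample))  # ['display:none']
```

A script like this only narrows down where to look; a human still has to judge whether a flagged element is deceptive or a legitimate part of the page (for example, a dropdown menu hidden until clicked).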
Get rid of anything that may be seen as manipulative. Make sure your content meets Google’s guidelines and contains no spam or misleading material. After cleaning up your site, submit a reconsideration request so that the web spam team can review the site and confirm that you have fixed the issues.
To fix an algorithmic penalty, go to Google Analytics and find the date when you lost traffic. Then use resources such as Moz’s Google algorithm update history to see whether Google released an update to any of its algorithms around that time. These update histories can help you determine which algorithm is responsible for the drop in rankings.
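The date-matching step is simple enough to script. The sketch below compares a traffic-drop date against the widely reported launch dates of a few updates mentioned above; the function name and the 14-day window are our own assumptions, and a real check should use a full update history such as Moz’s.

```python
from datetime import date

# Launch dates of a few well-known Google updates (widely reported;
# consult a maintained update history for the complete list).
UPDATES = {
    "Panda": date(2011, 2, 23),
    "Penguin": date(2012, 4, 24),
    "Mobilegeddon": date(2015, 4, 21),
}

def nearest_update(traffic_drop, window_days=14):
    """Return updates that rolled out within `window_days` before the drop."""
    return [
        name for name, rolled_out in UPDATES.items()
        if 0 <= (traffic_drop - rolled_out).days <= window_days
    ]

# Example: analytics shows the drop starting in late April 2012.
print(nearest_update(date(2012, 4, 30)))  # ['Penguin']
```

A match here is only circumstantial evidence: updates roll out gradually, so the drop may lag the announced date, and an empty result does not rule out an unannounced algorithm change.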
Algorithmic penalties require heavy detective work, and you may have to hire a professional SEO company to thoroughly audit your website. They can use tools to find any spammy links and plagiarized content on your website. Some of the issues that a thorough audit looks for include:
- Thin or plagiarized content
- Duplicate content on multiple pages
- Excess or unnecessary CSS or HTML in your code
- Spammy URL slugs
- Too many advertisements on your site
- Badly crafted header tags
- A poorly designed website that does not allow Google’s bots to easily crawl the web pages
- Robots.txt files that block important website resources
- Non-indexable content or pages
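On the robots.txt point, Python’s standard-library `urllib.robotparser` offers a quick way to test whether a rule blocks Googlebot from resources it needs to render your pages. The rules and paths below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: a blanket Disallow on an asset directory can
# stop Googlebot from fetching the CSS it needs to render the page.
rules = """
User-agent: *
Disallow: /assets/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "/assets/style.css"))  # False
print(parser.can_fetch("Googlebot", "/blog/post"))         # True
```

Running this against your live robots.txt (via `RobotFileParser.set_url` and `read`) for each CSS, JavaScript and image path quickly reveals whether any rendering-critical resource is blocked.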
Fixing these issues requires both time and patience. After thoroughly auditing and cleaning up the site, you may have to wait a while for Google’s bots to re-crawl the web pages and links. Once they do, and the underlying issues are truly fixed, your Google rankings should eventually recover.