
Don’t Lose Rank: Take These Steps To Fix Google Manual Actions

Hit with a manual action penalty? If the Google Search Quality Team flags one or more of your pages, you need to act fast to protect your Google ranking. Here’s what you need to know about manual actions and how to fix them.

Nov 7, 2022

9 m read

Seeing the phrase “manual action” pop up can evoke a sense of dread for digital marketers. But manual actions aren’t the end of the world — and they don’t have to spell permanent ranking disaster if appropriately fixed.

To lift a manual action, it’s essential to understand exactly what you’re up against and why Google issues these penalties in the first place. In this article, I’ll cover the basics of Google penalties, tips on how to fix them, and how to avoid running afoul of quality guidelines to prevent future problems (and future ranking drops).

What Is a Google Penalty?

When people in the digital marketing industry talk about Google penalties, they’re referring to one of two types of “punishment”: algorithmic penalties and manual actions.

Both of these harm search rankings, and both are meant to keep potential spam or low-quality sites out of Google search results — but manual actions are unique because a member of the Google team must lift them before Google restores lost rankings.

Let’s explore how these penalties work and how to resolve them.

What Is an Algorithmic Penalty?

Google is constantly updating and improving its search algorithm. They announce major algorithm changes, called “core updates,” ahead of time, but push out minor updates almost daily with no fanfare.

An algorithmic penalty is an industry term for when one of these updates results in a drop in rankings. As discussed in our article about algorithm updates, Google uses multiple smaller algorithms and hundreds of metrics to determine rankings, so even a slight update can significantly impact a site. It’s next to impossible to know how a page will be affected by an algorithm change until after the change has already happened.

An algorithmic penalty isn’t a true penalty — really, it’s a consequence of Google improving search capabilities.

What Is a Manual Action?

A manual action means a member of Google’s team looked at your site and found it violated their webmaster quality guidelines in some way. Depending on the violation, a reviewer may issue a manual action against a single page or an entire website.

Manual actions require immediate attention because affected pages or sites automatically lose rankings and may be removed from search results entirely until flagged issues are fixed.

There are many reasons why Google may issue a manual action. Fortunately, Google Search Console will tell you the exact problems they found that triggered a manual action and on which pages those problems appear.

How To See If Your Site Has a Google Penalty

Google will send you an email if your site has any kind of manual action against it. However, it’s always possible to miss an email, so I recommend regularly reviewing your Google Search Console account to check for problems.

After logging into Google Search Console, the main Overview page will alert you if you have any actions against your site. Click on ‘Open Report’ to see a full list of these actions, along with descriptions of each, recommended general solutions, and which pages are affected. You can also navigate to the manual actions report on the left-hand menu.

If your site suddenly loses rankings, but there are no manual actions listed, you may be suffering drops due to algorithm changes. Check the Google Search Central Blog or the Moz.com Google Algorithm Update History for news about an update that might coincide with the drops you’re noticing.

How To Fix Your Google Penalty

To lift a manual action, you’ll need to fix the issues identified in your manual action report and submit a reconsideration request to have the action removed. Here are the steps to follow:

  1. Identify the problem(s) you’re facing. Your Google Search Console dashboard will list the exact problem that led to a manual action and which pages reviewers believe violate Google’s guidelines. You can quickly fix some issues (like a single instance of hidden text), while more significant issues (like site-wide penalties for spam) may require a longer, more meaningful readjustment of your digital strategy. Plan your next steps carefully.
  2. Fix on-site issues. Do whatever it takes to clean up your site and bring it in line with quality guidelines. Be prepared to tackle each instance of a problem on every page it appears because Google won’t remove actions unless they’re 100% resolved. 
  3. Contact web admins of toxic link sites. If you received an action for unnatural backlinks pointing to your site, contact the offending site’s webmaster to request they remove the offending links. If they don’t fix the issue, disavow the toxic backlinks and document your attempts to communicate with the offending site owners. (I’ll talk more about links later in this article.)
  4. Submit a reconsideration request. Once you’re certain you’ve resolved the problem, submit a reconsideration request to lift the manual action by clicking ‘Request Review’ in Google Search Console. You’ll receive a review confirmation message to let you know your review is in progress. Don’t resubmit a request before receiving a final decision. Since a real person on the Google team will review your site to see if your fixes are sufficient, reconsideration reviews can take some time. 

You’ll know Google has lifted the penalty when you check your manual action report in Search Console and see ‘No issues detected.’

Tips for Submitting a Reconsideration Request

The process of appealing your manual action isn’t as simple as fixing the problem and clicking a button. Reconsideration requests will ask you to describe what you’ve done in detail.

Here are some tips for submitting a successful request:

  • Be sure the changed pages are accessible to reviewers and search crawlers, meaning they shouldn’t be locked behind a password, blocked by robots.txt, or marked ‘noindex.’ Google can’t review a page they can’t see.
  • Describe what you’ve done to address the issue and what you’ve done to change your site. Provide clear, specific examples.
  • Explain the steps you’re taking to ensure the issue doesn’t happen again. That might be working with a new SEO agency, changing your company’s internal web policies, or developing a new content strategy that doesn’t incorporate any shady tactics.
  • Include links to additional documentation if necessary. If you have a document or spreadsheet that supports your case, add a link to it in your request. For example, you might want to include a Google Doc containing screenshots of emails you’ve sent to site owners to ask them to remove toxic backlinks.
  • Don’t submit multiple requests. However, if your initial appeal is rejected, you’ll be able to continue making changes to your site and submit a second request for another review.
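The first tip above is easy to verify before you submit. As a minimal sketch (the function and class names here are my own, not part of any Search Console API), the following script flags pages that carry a noindex robots meta tag; a complete check would also cover robots.txt rules and the X-Robots-Tag HTTP header:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names for us.
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

def is_blocked_from_index(html: str) -> bool:
    """Return True if the page's robots meta tag includes a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)
```

Running this against each page you fixed is a quick way to confirm Google’s reviewers will actually be able to see your changes.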

How Long Does It Take to Resolve Google Manual Actions?

Google says it takes “several days or weeks” to evaluate a reconsideration request and even longer to handle link-related requests. Technical SEO problems require bots to re-crawl your site, while content issues need a human reviewer to sift through each page.

After Google lifts a manual action, lost rankings won’t typically return to normal right away — especially if your competition has had time to gain ground in your absence from search results. Some sites bounce back immediately, while others take months or longer to recover.

6 Ways to Avoid Google Penalties and Manual Actions

Remember that Google issues manual actions when a site doesn’t comply with its webmaster quality guidelines — which means that the best way to avoid manual action penalties is to follow those guidelines as closely as possible.

While it’s impossible to predict what changes to the algorithm could affect you, following these same quality guidelines and focusing on producing useful, valuable, and evergreen content is the best way to keep your rankings strong.

Let’s look at a few specific ways you can avoid manual actions and reduce the impact of algorithmic penalties.

1. No Black-Hat Tactics

Publishing honest, original, and natural-sounding content is the best way to survive algorithm changes and manual action reviews alike. Don’t try to “game” the system for short-term gains.

Google is always looking for ways to clamp down on black-hat SEO tactics, which aim to deliver fast results by gaming search algorithms. These tactics tend to sacrifice user experience in an attempt to over-optimize for search engine bots; in other words, they try to trick Google to get on page one.

Unfortunately, certain tactics are still commonly used by inexperienced digital marketers or those looking for quick results, such as:

  • Keyword stuffing. Overusing certain keywords on a page is explicitly prohibited under Google guidelines. Integrate keywords naturally throughout a page. Don’t compile them in a list or repeat them too many times.
  • Link schemes. A “link scheme” is an umbrella term for practices like purchasing backlinks, operating blog publisher networks, link exchanges, and any method of artificially manipulating links to or from your site. Steer clear of them!
  • Hidden links and text. Don’t try to inflate rankings by hiding links and text in a way that human visitors can’t see, but bots can. Examples of this include publishing keywords in white text on a white background, turning a single comma or apostrophe into a link, or hiding content behind an image. Don’t be sneaky, and don’t hide things.
  • Copying and republishing content from other sites. Using plagiarized, duplicated, or “scraped content” from trustworthy sites is a known spam tactic. It’s OK to quote an existing page on another site or reuse an image or video published elsewhere, but pages with republished content should also have plenty of original content.

If you’re not sure whether a tactic is shady, take a step back and ask yourself a few questions. Does following this tactic provide direct value to your audience in some way? Does this tactic require a level of deception, even if it’s only meant to deceive bots? Does it promise results that seem too good to be true? When in doubt, keep it above board.

2. Clean Up Toxic Backlinks

Make an effort to remove toxic links from other sites, and submit disavowal requests when necessary. Remember to keep track of your link cleanup process by taking screenshots and keeping emails.

Just as a link from a trustworthy site can boost your site’s reputation, a link from a spam site can damage it, too. These toxic backlinks generated by third parties may trigger a penalty even if your site adheres to all other quality guidelines.

Sites that generate bad links typically contain spam and violate the Google quality guidelines in some significant way. Even if these bad backlinks aren’t directing traffic to your site, they can damage your site by association.

If you’ve received a manual action for unnatural links pointing to your site, Google recommends you contact web admins of low-quality sites to ask them to do one of the following:

  • Remove links to your site entirely
  • Tag links to your site with a rel="nofollow" attribute
  • Redirect links to your site through a URL blocked by a robots.txt file

If website owners don’t comply with your requests, use Search Console’s disavowal tool to tell search engines to ignore the link to your site. Don’t pay if a web admin asks for money to remove a link — just file a disavowal request and keep moving.
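The disavow tool accepts a plain text file with one URL or domain per line; lines starting with “#” are comments. A hypothetical file (the domains below are placeholders) might look like this:

```text
# Disavow file for example.com - submitted after outreach failed.
# Contacted site owners on 2022-10-01 and 2022-10-15; no response.

# Disavow a single spammy page:
https://spam.example.net/bad-links-page.html

# Disavow every link from an entire domain:
domain:link-farm.example.org
```

Upload the file through the disavow links tool in Search Console, and keep a copy alongside your outreach records for your reconsideration request.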

By the way, it’s important to keep documentation of this link cleanup process. Providing evidence that you tried to remove toxic backlinks (even if web admins don’t respond to you) will help when you submit a request to remove the manual action.

3. Use Schema Markup Wisely

Only use schema markup when necessary and appropriate — and never try to mislead or misrepresent a page to get clicks.

Schema markup is a powerful tool for sites to structure data and provide rich search engine results. However, misusing schema markup and running afoul of Google’s structured data guidelines can cause more harm than good.

Sites may receive a manual penalty for not following technical requirements outlined in the guidelines and for inappropriately labeling data. Here are some practices that can get a page dinged with a manual action:

  • Mislabeling data as time-sensitive
  • Marking up content that isn’t visible to visitors of the page
  • Marking up fake reviews or ratings, or marking up only a select subset of reviews/ratings to make overall ratings appear more positive than they are
  • Impersonating someone or disguising your organization
  • Misleading users or search engines about the purpose of a page
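By contrast, compliant structured data describes only what visitors can actually see on the page. As a hypothetical example, a product page whose visible reviews genuinely average 4.2 stars across 87 ratings might use JSON-LD like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.2",
    "reviewCount": "87"
  }
}
</script>
```

The key is that the ratingValue and reviewCount reflect all the reviews shown on the page, not a cherry-picked subset.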

4. Focus on Publishing Quality Content

Don’t plagiarize or auto-generate content. If you don’t have the time to write new pages, articles, or product descriptions, consider working with an agency with an SEO content writing service or freelance writers.

It’s possible to receive a manual action for having low-quality thin content — but what exactly is “thin” content, and what’s considered low quality?

Officially, Google defines thin content as pages with little-to-no unique value to visitors. They give examples, including:

  • Pages with auto-generated text
  • Affiliate pages with copied and pasted product descriptions and little-to-no original text
  • Content that’s mostly scraped or republished from other sites
  • Doorway pages (pages used to funnel visitors to another page)

Contrary to what some might believe, thin content isn’t necessarily a page with a low word count or one with original content that’s poorly written. Ultimately, it’s safer to have a few original, quality pages with low word counts than to have many pages with thousands of words of stolen or auto-generated text.

5. Don’t Cloak Content

What visitors see should be what they get. Don’t try to serve two different versions of a page to search engines and humans.

Cloaking content is the practice of showing human visitors and search engines two different versions of a page. For example, a cloaked page might serve search engines an HTML version while displaying a page full of images to human users.

Sometimes, web developers who cloak content have good intentions — such as serving two different versions of a page depending on whether visitors have JavaScript enabled or recreating a Flash-based page in HTML to satisfy search engine crawlers. But, according to official guidelines, there are currently no accepted uses for cloaking, and any type of cloaking is liable to receive a manual action.

6. Wrangle User-Generated Spam

Get user-generated spam under control. Use moderation tools and software to tackle site-wide spam issues if necessary.

Websites that allow users to post original content may run into issues with user-generated spam, including bot profiles, fake reviews, scammy guest posts, and blog comments or forum posts that push unsolicited promotional content.

It’s essential to delete one-off instances of spam as quickly as possible. However, if you’re facing tons of spammy comments and fake accounts flooding your site, you’ll need to take site-wide steps to remove and prevent spam.

For more on how to address these issues, read our guide to user-generated spam and how to find, remove, and prevent it.

Uncover SEO Issues With Our Checklist

It can be overwhelming to run a campaign and optimize a site when there are constant algorithm updates and evolving best practices to contend with. That’s why we’ve put together a free resource to help.

Download the Victorious SEO Checklist for a complete, easy-to-follow framework for creating a well-rounded search engine optimization strategy.
