What Are Manual Actions and How To Deal With Them?

A soccer referee with the Google logo on their T-shirt blows a whistle and shows a red card to a website.

A manual action is a temporary penalty imposed by Google on a website: it lowers the site’s search rankings or removes it from Google Search altogether.

Google may apply manual actions to websites that:

  1. Exploit and abuse search algorithms to rank unnaturally high,
  2. Don’t follow Google’s policies and guidelines, 
  3. Have spammy or harmful content.

Google is committed to raising the quality of search results because it’s in its best interest to have more and more satisfied users. Of course, these efforts must be automated due to the enormous size of the Internet.

There will always be people bent on tricking the algorithms for their own gain. A manual action is like a red card from a soccer referee: a reminder that such behavior isn’t acceptable.

In 2019 Google discovered as many as 25 billion spam pages daily. That same year, it received nearly 230,000 reports of search spam and took action on 82% of them. In 2020, manual actions were imposed 2.9 million times. 

This large amount of online spam is the reason why Google uses all possible measures to protect the quality of its index. In 2021 Google caught 200 times more spam sites than at the start of the spam-fighting efforts two decades ago.

In the past, even large and reputable portals like BBC.com or Forbes.com engaged in unfair linking practices. The manual penalties imposed on them were widely commented on across the SEO industry. 

Today there are many types of manual actions, and they often surprise the well-intentioned web admins who occasionally go against Google’s guidelines without knowing it.

Fortunately, Google penalties are not final, and you can fix them. Even after experimenting with Black Hat SEO shortcuts, you can regain Google’s trust and recover by correcting your mistakes.

Onely’s technical SEO services don’t only entail resolving any manual actions you may suffer from, but also preparing a compliant SEO strategy that will boost your organic traffic without risking any penalties from Google.

How can you know your website received a Google penalty?

The soccer player knows they’re in trouble when the referee blows the whistle. But how can you know that your website got penalized by Google? 

Google penalties differ in how they are imposed and what consequences they bring. Some are initiated by algorithms, others (and these are the ones we call manual actions) by humans. Some may result in your website losing rich results in search; others may lead to lower rankings or complete removal from the Google index.

However, the first symptoms are usually similar and may include:

  1. A sudden drop in your site’s organic traffic,
  2. Losing traffic on platforms like Google News or Google Merchant Center,
  3. Losing your positions in the SERPs.

If you’ve noticed these symptoms, it might be time for further investigation to check whether your website was affected by a penalty.

The important thing to note is that Google penalties can apply to the whole website or just part of it.

In the latter case, some sections of your website may still rank high and bring satisfying organic traffic. However, the situation is still far from ideal, especially if the penalty was imposed on the subpages you care about.

Manual actions vs. algorithmic actions

Google can crack down on spammy websites both through manual and automated measures. There are many algorithmic solutions ensuring the quality of search results that can lower the ranking and visibility of those sites.

An example is the famous Penguin update, launched in 2012 to combat pages that buy, sell, or exchange backlinks.

However, Google still employs human workers to fight against spam on the Internet. Manual action means that a person is handling your site’s case. 

The most crucial distinction between manual actions and algorithmic penalties is that in the case of a manual action, you’ll receive a notification in Google Search Console. You can find it in the “Security & Manual Actions” section.

Thanks to this message, you’ll know the source of the problem and be able to address it right away. With algorithmic actions, gathering this information may be trickier.

In addition, after fixing the errors that caused the manual action, you can and should submit a reconsideration request and ask Google to remove the penalty. An algorithmic penalty doesn’t give you that possibility. 

It’s also worth noting that Google Search Advocate, John Mueller, said that manual actions evolve. They used to be needed to tackle the problems that Google now deals with using algorithms. You can expect that as time goes on, fewer and fewer spamming practices will require human intervention.

Google penalties vs. core updates

It often happens that the drops in pages’ ranking and organic traffic occur at the same time as Google core updates. You may wonder if algorithmic penalties and core updates are, therefore, the same thing.

If that were the case, SEO would be a bit simpler, as planned core updates are always announced on Google’s Webmaster Central Blog, and you know when to expect them.

For a few years, the aforementioned Penguin algorithm needed to be periodically refreshed, and algorithmic actions were applied to many websites during its updates. Nowadays, the situation is different: Penguin works constantly, in real time.

Since the algorithms for catching spammy pages are now in place, Google can focus on improving the quality and relevance of search results during core updates. Gary Illyes, Google Webmaster Trends Analyst, stated on Twitter:

If you’re “hit” by a core update, you shouldn’t think of it as a penalty. You might not be doing anything wrong at all, it might just be that someone is doing something better.
source: Gary Illyes

It’s also worth quoting what John Mueller said at the SEO Office Hours meeting on November 5, 2021:

Core updates are more about understanding your site’s overall quality and relevance and less about technical issues and less about spam.

Whenever you’re unsure whether a decrease in your website’s organic traffic is related to a core update or an algorithmic penalty, you can discuss it with other webmasters on forums. Try looking through the Google Search Central Help Community, where you can find answers recommended by Google.

List of manual actions and how to fix them

As I mentioned before, there are several types of manual actions that Google can apply. Let’s get to know them better and learn:

  1. What triggers manual actions?
  2. Which manual actions are site-wide, and which are partial?
  3. How to recover from manual actions?

While preparing the list below, I mainly used the manual actions list published by the Google Search Console Help Center. However, in my view, Google’s document has several structural flaws, so I re-categorized some of the penalty types to simplify the classification.

In the “How to fix the problem?” sections, I explain what to do to eliminate the source of the issue. Note, however, that for the manual action to be removed, you should send a reconsideration request to Google. For more information about submitting the request, jump to “What is a Reconsideration Request, and how to file it?”.

Third-party spam

Third-party spam penalties are usually applied to a part of a domain. Google tries to scope them precisely and avoid imposing manual action on the whole domain.

User-generated spam

A manual penalty for user-generated spam is imposed when your website has user-generated content that wasn’t properly maintained. It may mean that your portal is littered with:

  1. Spammy posts on forums or guestbooks that, for example, advertise some services,
  2. Spammy comments on forums that, for example, link to pages unrelated to the thread’s topic,
  3. Spammy user profiles like, for example, “free_mobile_apps.”

A famous example of manual action related to user-generated spam was the penalty that Google imposed on Mozilla in April 2013. 

How to fix the problem?

Identify pages on your site that contain user-generated content and look for posts, comments, or profiles with:

  1. Text that looks like advertisements,
  2. Out-of-context or off-topic links,
  3. Commercial usernames,
  4. Automatically generated text.

While searching for spammy threads, you can use the site: search operator together with commercial or adult keywords unrelated to your site’s topic. For example, type [site:yourpage.com free apps] into Google Search.

Next, remove any inappropriate content and consider preventing its creation in the future. You can try:

  1. Moderating comments manually or automatically,
  2. Using CAPTCHA systems that require users to prove they’re actually human and not a spamming script,
  3. Blocking not-yet-trusted content from being indexed by Google; one approach is sketched below, and you can find more in our article on the indexability of user-generated content.
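As a minimal sketch of that third option, assuming a Flask application and a hypothetical is_trusted moderation flag on each submission, you could keep unreviewed content out of Google’s index with an X-Robots-Tag: noindex response header:

```python
from flask import Flask, make_response, abort

app = Flask(__name__)

# Hypothetical in-memory store of user submissions; in a real app this
# would come from your database, with a moderation flag per post.
POSTS = {
    1: {"body": "Great article, thanks!", "is_trusted": True},
    2: {"body": "FREE MOBILE APPS CLICK HERE", "is_trusted": False},
}

@app.route("/forum/post/<int:post_id>")
def forum_post(post_id):
    post = POSTS.get(post_id)
    if post is None:
        abort(404)
    response = make_response(post["body"])
    if not post["is_trusted"]:
        # Ask search engines not to index unmoderated content
        # (equivalent to a <meta name="robots" content="noindex"> tag).
        response.headers["X-Robots-Tag"] = "noindex"
    return response
```

Once a moderator approves a submission, the header simply stops being sent and the page becomes indexable again.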

Spammy free host

A manual penalty for spammy free hosting may be imposed when the free web hosting service you run is heavily burdened with spammy pages.

How to fix the problem?

Remove any spammy accounts from your service. Also, consider preventing their creation in the future. You can try:

  1. Publishing a clear abuse policy,
  2. Using CAPTCHA systems that require users to prove they’re actually human and not a spamming script,
  3. Regularly monitoring your site with the site: search operator to detect problems before they grow out of control.

Hacked pages

A manual penalty for having hacked pages may be imposed when a portion of your site gets hacked and littered with out-of-context links or gibberish content.

The gibberish hack automatically creates lots of nonsensical keyword-stuffed pages on your site. Users may be redirected from those pages to unrelated sites like porn pages. 

If, as a result of a hack, your website poses a threat to users, instead of the Manual Actions report, you’ll receive the Security Issues report.

How to fix the problem?

Detect and remove any hacked pages from your site using web.dev’s instructions on fighting the gibberish hack.

Consider preventing malicious attacks in the future. You can try:

  1. Regularly scanning your devices,
  2. Regularly changing your passwords,
  3. Regularly updating your plugins and extensions,
  4. Using Two-Factor Authentication,
  5. Subscribing to a security service for your site.

Unnatural linking

Google considers links to either be “natural” or “unnatural.” Natural links correspond well with your content and are added for the benefit of your users. For example, while creating an article about soccer rules, you may link to the latest official version of the game rules published by FIFA. 

Unnatural links, on the other hand, aren’t helpful for users but serve the purpose of manipulating search engines. For example, you may use your popular soccer blog to link to a low-quality shop with children’s toys, hoping that it will improve this shop’s rankings.

Getting a lot of links from high-authority domains would make it easier for the shop to rank high in the search results. However, sooner or later, Google will find out about this scheme, and all websites involved will only lose.

Unnatural links to your site

This manual action is always site-wide and may be imposed when you bought or otherwise obtained unnatural links to your site. Pages that link to your site in this way don’t mark the links with appropriate rel attributes such as nofollow, sponsored, or ugc.

This means that when Google calculates the importance of your page and determines its ranking, unnatural links artificially inflate your page’s authority due to the properties of the PageRank algorithm. It’s no secret that such a situation creates a huge danger for the relevance of search results, and Google can’t tolerate it. 

A famous example of this penalty shows that even Google has to follow the rules it sets. In February 2009, Google Japan received a manual penalty for buying links as a part of its promotional activities.

How to fix the problem?

Perform a backlink profile audit and look for unnatural links. Use as many tools as possible, so your research is accurate. 

After finding all links violating Google’s guidelines, contact the owners of the sites containing those links. Ask them to remove the links or to mark them with a nofollow attribute.

Of course, some of these website owners won’t be reachable, and some won’t be willing to cooperate. A solution to this problem is to disavow the unwanted links. By disavowing links, you submit to Google a list of URLs and domains whose links you don’t want counted toward your site’s authority.
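For reference, the disavow file Google accepts is a plain text file with one entry per line: either a full URL or a domain: prefix that disavows an entire domain, with # marking comment lines. A minimal sketch for generating it (the domains and URLs below are made-up placeholders):

```python
# Hypothetical lists produced by your backlink audit.
spammy_domains = ["spammy-directory.example", "paid-links.example"]
spammy_urls = ["https://blog.example/low-quality-guest-post.html"]

lines = ["# Disavow file generated after a backlink audit"]
lines += [f"domain:{domain}" for domain in spammy_domains]
lines += spammy_urls

# Write the file, then upload it in Google's Disavow links tool.
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```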

Need help with disavowing spammy backlinks?

Take advantage of link risk management to reevaluate your backlink strategy.

Our experts’ experience shows that you may sometimes receive this manual action even though you didn’t know about any unnatural links pointing to your website. A third party might have created them accidentally or as a black hat SEO attempt to harm a competitor’s visibility.

In these cases, Google can be strict and not persuaded by arguments that someone generated these links without your knowledge or consent.

While you may feel treated unfairly, getting into a discussion with Google may only delay your website’s recovery.

Unnatural links from your site

This manual action can be site-wide or partial. It may be imposed when your site seems to participate in link schemes, which include:

  1. Exchanging links passing PageRank value for money, goods, or services,
  2. Excessive cross-linking,
  3. Publishing marketing articles with excessively keyword-rich link anchor texts.

Famous examples of this manual action show that earning money by selling links was once tempting even for large, reputable portals. A Google penalty for unnatural links hit Forbes’ website in February 2011 and the BBC’s website in March 2013.

How to fix the problem?

Identify any links on your website that seem to participate in a link scheme. Remove those links or mark them with a nofollow, ugc, or sponsored rel attribute, as in the sketch below.
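If some of those links are legitimate but shouldn’t pass ranking credit (paid placements, for instance), you can qualify them in bulk instead of removing them. Here’s a minimal sketch using BeautifulSoup; the SPONSORED_DOMAINS list is a hypothetical example of domains you’d flag during your audit:

```python
from urllib.parse import urlparse

from bs4 import BeautifulSoup

# Hypothetical list of domains whose links on your site are paid placements.
SPONSORED_DOMAINS = {"toy-shop.example", "partner-store.example"}

def qualify_outbound_links(html: str) -> str:
    """Add rel="sponsored nofollow" to links pointing at sponsor domains."""
    soup = BeautifulSoup(html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        host = urlparse(anchor["href"]).netloc
        if host in SPONSORED_DOMAINS:
            # Tells Google not to pass ranking credit through this link.
            anchor["rel"] = ["sponsored", "nofollow"]
    return str(soup)
```

In practice you’d run this as part of your template rendering or a one-off cleanup pass over stored HTML.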

Thin content or no added value

A manual penalty for thin content can be imposed for a wide variety of violations. It may affect only the pages where Google detected spammy content or the entire website.

Automatically generated content

Some websites use AI to create content instead of employing writers. The potential benefit of such a solution is faster and cheaper content generation. Google may see it as a problem when:

  1. Auto-generated content is rich in keywords or their synonyms but makes no sense to the reader,
  2. Auto-generated translations aren’t reviewed and edited by humans before publishing,
  3. The AI combines the content of different web pages on the topic without adding any value. 

Thin affiliate pages

Sometimes, small online stores collaborate with large eCommerce platforms and create content that refers to their partners’ products. There is no harm in that as long as the new content provides value to users.

Valuable affiliate pages may contain reviews of products or show their use in specific environments. However, Google may see them as a problem when they simply duplicate the product descriptions from eCommerce sites. 

Doorway pages

Doorway pages are one of the most infamous Black Hat SEO techniques. Typically, the technique involves deceiving users into believing that a page will answer their specific, detailed query, only to serve them very general, unhelpful content instead.

Imagine someone running an online shoe store. They have a category page dedicated to soccer boots and want this page to rank for many different keywords. Not just for general ones like “soccer boots” or “soccer shoes,” but also for “soccer boots seattle,” “soccer boots san diego,” etc.

To achieve this goal, the shop owner could prepare multiple pages titled “Soccer Boots + City Name” which, when opened, only provide a link to the main category page.

A user from San Diego may type “soccer boots san diego” into Google Search because they want to find the shop nearest to where they live. Instead, they will waste their time on a page with almost no content that merely tries to funnel them to an unfamiliar eCommerce portal.

Doorway pages are not only misleading for the search engines but also disastrous for the user experience. 

In the past, even well-known brands were tempted to rank for multiple key phrases the easy way. Among the websites punished by Google for this practice was BMW’s, which was caught using doorway pages in 2006.

How to fix these problems?

If your website has low-quality auto-generated text, thin affiliate pages, or doorway pages, you have no choice but to improve your content. You have to think about what unique value your website can offer users and invest your time in creating it.

According to Google, a high-quality site should:

  1. Contain articles written by trustworthy experts,
  2. Avoid duplicate, overlapping, or redundant articles created solely to rank for different keywords,
  3. Have original and insightful articles that investigate the topic in-depth and beyond what is obvious,
  4. Have well-edited articles with correct spelling, grammar, and punctuation,
  5. Show that the authors apply great care to each page.

Manipulative techniques

Manual actions gathered under this category can affect either your entire website or just some part of it, depending on the violation’s severity and nature. 

Structured data issues

A manual penalty for structured data issues is imposed when the structured data markup that’s supposed to help Google understand your website’s content actually contains incorrect or false information. This may be the case whenever:

  1. You mark up content that is visible to search engines but not to readers of the page,
  2. You mark up irrelevant content that has nothing to do with the page’s focus,
  3. You mark up misleading content, such as fake product reviews,
  4. You use structured data to deceive users by, for example, impersonating public benefit organizations you’re not associated with,
  5. You use structured data to misrepresent your website’s purpose.

How to fix the problem?

Review structured data implemented on your site and remove any markup violating Google’s guidelines. Sometimes a general update to your structured data may be necessary.
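One check that’s easy to script is whether the text you mark up actually appears in the page copy that visitors see. Below is a minimal sketch using requests and BeautifulSoup; it’s a rough heuristic (whitespace differences can cause false positives), and the URL is a placeholder:

```python
import json

import requests
from bs4 import BeautifulSoup

def check_jsonld_visibility(url: str) -> None:
    """Flag JSON-LD string values that don't appear in the visible page text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Collect the raw JSON-LD blocks first...
    jsonld_blocks = [
        script.string or ""
        for script in soup.find_all("script", type="application/ld+json")
    ]
    # ...then strip scripts and styles so they don't count as "visible" text.
    for tag in soup(["script", "style"]):
        tag.extract()
    visible_text = soup.get_text(" ", strip=True).lower()

    for block in jsonld_blocks:
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            print(f"Invalid JSON-LD found on {url}")
            continue
        # Walk the structured data and compare longer string fields
        # against the visible page copy.
        stack = [data]
        while stack:
            node = stack.pop()
            if isinstance(node, dict):
                stack.extend(node.values())
            elif isinstance(node, list):
                stack.extend(node)
            elif isinstance(node, str) and len(node) > 20:
                if node.lower() not in visible_text:
                    print(f"Marked-up text not found in visible copy: {node[:60]}...")

check_jsonld_visibility("https://yourpage.com/product/soccer-boots")
```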

Hidden text or keyword stuffing

A manual penalty for hidden text may be imposed when your site contains words visible to search engines but not to users. Usually, the text is hidden to fill a page with keywords without users being aware of their presence. For example:

  1. The font has the same color as the background,
  2. The text is placed behind the image or outside the screen,
  3. The font size is 0.

A Manual Action report for keyword stuffing may appear in your GSC as a result of:

  1. Repeating the same word over and over again on the same page,
  2. Filling the page with synonyms of the keyword so much that it stops making sense to the reader,
  3. Listing locations you’re trying to rank for without adding value to your content.

How to fix the problem?

Search your site for pages with hidden text. You can use the URL Inspection tool, which shows the indexed version of a page as Google sees it.

With its help, you can discover text hidden behind images or positioned off-screen. One easy way to reveal text in the same color as the background is to select all the text on the page with Ctrl+A.
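On larger sites, a script can help flag the most common inline-style tricks. Here’s a minimal sketch with BeautifulSoup; note that it only inspects inline style attributes, so text hidden through external CSS still needs a manual check:

```python
import re

from bs4 import BeautifulSoup

# Inline-style patterns commonly used to hide text from users.
SUSPICIOUS_STYLES = [
    r"display\s*:\s*none",
    r"visibility\s*:\s*hidden",
    r"font-size\s*:\s*0",
    r"text-indent\s*:\s*-\d{3,}px",  # text pushed far off-screen
]

def flag_hidden_text(html: str) -> list[str]:
    """Return snippets of text wrapped in suspicious inline styles."""
    soup = BeautifulSoup(html, "html.parser")
    findings = []
    for element in soup.find_all(style=True):
        style = element["style"].lower()
        if any(re.search(pattern, style) for pattern in SUSPICIOUS_STYLES):
            text = element.get_text(" ", strip=True)
            if text:
                findings.append(f"{element.name}: {text[:80]}")
    return findings
```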

Delete any hidden text you find, or change its styling to make it visible to users.

Then check your pages for excessively repeated keywords and out-of-context phrases. Remove them and make your texts pleasant for users to read.

Cloaking and cloaked images

A manual penalty for cloaking is imposed when your website shows different content to human users and Googlebot. Cloaking takes place when, for example:

  1. You present a page filled with keyword-rich text to a search engine while presenting a page full of images to users,
  2. You insert a group of keywords to a page only when Googlebot browses it,
  3. You show one picture to Googlebot and then obscure it with another image. 

Cloaking can lead to a terrible user experience as users see different content on your site than they expected based on the search results.

You don’t have to worry about manual action for cloaking just because your website shows users different language versions based on their geolocation or because your website adapts to the screens of mobile devices.

As long as you treat Googlebot as a standard browser and don’t behave towards it in any special way, you aren’t violating Google guidelines.

How to fix the problem?

Compare how your pages look to users with how they look to Googlebot. The URL Inspection tool can show you how Googlebot sees your website.

Next, look through your web server’s code and configuration for rules that specifically target Googlebot’s user agent or IP address. Investigate them and remove the parts of your website that serve different content to users and search engines.
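A quick way to spot user-agent-based cloaking is to fetch the same URL as a regular browser and as Googlebot and compare the responses. Below is a minimal sketch with requests; treat differences as a signal to investigate rather than proof, since some sites legitimately verify Googlebot by IP address and a plain script can’t reproduce that:

```python
import requests

BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = (
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
)

def compare_user_agents(url: str) -> None:
    """Fetch a URL with two user agents and report large content differences."""
    as_browser = requests.get(url, headers={"User-Agent": BROWSER_UA}, timeout=10)
    as_googlebot = requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}, timeout=10)

    print(f"Status codes: browser={as_browser.status_code}, "
          f"googlebot={as_googlebot.status_code}")
    print(f"Final URLs:   browser={as_browser.url}, googlebot={as_googlebot.url}")

    size_browser = len(as_browser.text)
    size_googlebot = len(as_googlebot.text)
    # A large size difference may indicate different content being served.
    if abs(size_browser - size_googlebot) > 0.2 * max(size_browser, size_googlebot, 1):
        print("Responses differ significantly; inspect the HTML for cloaking.")

compare_user_agents("https://yourpage.com/")
```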

Google may struggle to differentiate between paywalled content and cloaking, so make sure that your paywalled articles are adequately marked up with structured data.
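Google’s paywalled content markup uses isAccessibleForFree together with a hasPart section that points at the gated element via a CSS selector. Here’s a minimal sketch that generates such JSON-LD; the headline and the .paywalled-section selector are placeholders for whatever your template actually uses:

```python
import json

# Hypothetical article data; the ".paywalled-section" CSS class is an
# assumption about your own template.
article_jsonld = {
    "@context": "https://schema.org",
    "@type": "NewsArticle",
    "headline": "Example paywalled article",
    "isAccessibleForFree": False,
    "hasPart": {
        "@type": "WebPageElement",
        "isAccessibleForFree": False,
        "cssSelector": ".paywalled-section",
    },
}

# Embed the output inside a <script type="application/ld+json"> tag in the page head.
print(json.dumps(article_jsonld, indent=2))
```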

Also, watch out for the “Page indexed without content” status in Google Search Console. It indicates that your URL got indexed, but your content didn’t because of possible cloaking practices on your side. Browsing the affected pages in the Page indexing report may help you navigate your cloaking issues.

Sneaky redirects

People usually use redirects when they move their site to another domain or need to consolidate several pages into one. Unfortunately, redirects can also be used for fraudulent purposes.

Google may impose a manual penalty for sneaky redirects on your website if you use redirects to achieve the same result as cloaking and show search engines different content than you display to users.

Sometimes this type of problem arises without any intention on your part to fool the crawlers. This mainly applies to mobile versions of sites, which are vulnerable to:

  1. Poorly implemented scripts that redirect mobile users to a new page in order to display an ad,
  2. Hacking attacks aimed at redirecting mobile users to malicious sites.

You don’t have to worry about manual action for sneaky redirects if you simply adapt your website to the requirements of mobile devices or use JavaScript to redirect users to an internal page once they log into the portal.

How to fix the problem?

Look for URLs on your website that redirect users somewhere other than where they’d expect to go based on the search results. Any URLs with conditional redirects should also arouse your suspicion.

Remove parts of your code that generate those redirects. If you don’t know how to start looking for them, you can use these tips:

  1. Review redirects written in JavaScript,
  2. Review redirects located in your .htaccess file,
  3. Check redirects generated by your content management system or plugins.

If you didn’t know that your URLs redirect users to different pages, check the Security Issues report to ensure your website wasn’t hacked. Next, investigate third-party scripts and elements on your website. 
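To see where mobile visitors actually end up, you can request your URLs with a mobile user agent and print the redirect chain. The sketch below only catches server-side redirects; JavaScript redirects still need to be checked in a browser or with a headless browser. The URL is a placeholder:

```python
import requests

MOBILE_UA = (
    "Mozilla/5.0 (Linux; Android 10; Pixel 3) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/90.0.4430.91 Mobile Safari/537.36"
)

def trace_redirects(url: str) -> None:
    """Print every hop in the redirect chain for a mobile user agent."""
    response = requests.get(url, headers={"User-Agent": MOBILE_UA}, timeout=10)
    for hop in response.history:
        print(f"{hop.status_code} {hop.url}")
    print(f"Final destination: {response.url}")
    # Anything landing on an unfamiliar domain deserves a closer look.

trace_redirects("https://yourpage.com/some-article")
```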

AMP content mismatch

The acronym AMP stands for Accelerated Mobile Pages. It’s an open-source coding standard built to provide faster page loading on mobile devices.

A manual penalty for AMP content mismatch may be imposed when the AMP version of your page differs from the canonical one. When Google detects such inconsistencies, it shows mobile users the canonical version of the page instead, which may load more slowly and discourage them from engaging with your content.

How to fix the problem?

Using the URL Inspection tool, check whether Google’s view of the AMP version of your pages matches their canonical versions.

Next, check whether any differences result from parts of the content being blocked by robots.txt.

Finally, make sure that the AMP page references the correct canonical version of your page.
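A quick way to verify the pairing is to confirm that the canonical page links to its AMP version with rel="amphtml" and that the AMP page’s rel="canonical" points back. Here’s a minimal sketch with requests and BeautifulSoup (the URL is a placeholder):

```python
import requests
from bs4 import BeautifulSoup

def find_link(soup: BeautifulSoup, rel_value: str):
    """Return the first <link> tag whose rel attribute contains rel_value."""
    for link in soup.find_all("link"):
        if rel_value in (link.get("rel") or []):
            return link
    return None

def check_amp_pairing(canonical_url: str) -> None:
    """Verify that a canonical page and its AMP version reference each other."""
    canonical_soup = BeautifulSoup(
        requests.get(canonical_url, timeout=10).text, "html.parser"
    )
    amp_link = find_link(canonical_soup, "amphtml")
    if amp_link is None or not amp_link.get("href"):
        print('No rel="amphtml" link found on the canonical page.')
        return

    amp_url = amp_link["href"]
    amp_soup = BeautifulSoup(requests.get(amp_url, timeout=10).text, "html.parser")
    canonical_link = find_link(amp_soup, "canonical")

    # Note: relative URLs or trailing-slash differences may need normalizing.
    if canonical_link is None or canonical_link.get("href") != canonical_url:
        print(f"AMP page {amp_url} does not point back to {canonical_url}.")
    else:
        print("The AMP and canonical versions reference each other correctly.")

check_amp_pairing("https://yourpage.com/article")
```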

News and Discover policy violations

Google News and Google Discover are features dedicated to users interested in specific types of content. 

  1. In the Google News tab, users can find the latest news related to their query. 
  2. Google Discover is a personalized list of interesting topics displayed on the main mobile page of the search engine before the user enters their query.

Appearing in Google News or Google Discover is a good opportunity to generate organic traffic for your pages. However, if you want to be visible in these features, you must follow the Google guidelines set up for them.

Failure to follow the rules may result in a manual action that won’t disappear until you adapt your content to Google’s requirements. Until then, content from your website won’t appear in Google News or Google Discover.

Below is a list of manual actions related to the Google News and Discover policies:

Adult-themed and sexually explicit content: This manual action may be triggered if your site contains images, videos, or descriptions of nudity, sex acts, or sexually suggestive activities.

Artificial freshening: This manual action may be triggered if your site uses software to change articles’ publication dates to make them seem more relevant. Providing a new publication date without adding significant information, as well as republishing a previously published article with only slight updates, violates Google’s guidelines.

Dangerous content: This manual action may be triggered if your site contains content that, if taken seriously, could directly lead to somebody’s harm. For example, the page may encourage people to perform dangerous challenges. The website may also be penalized when its content poses a danger to animals.

Harassing content: This manual action may be triggered if your site causes harm to specific people through:
  1. Cyberbullying,
  2. Threats,
  3. Unwanted sexualization,
  4. Private information exposure.

Hateful content: This manual action may be triggered if your site participates in discrimination based on race or ethnic origin, religion, disability, age, nationality, sexual orientation, gender identity, etc.

Terrorist and violent content: This manual action may be triggered if your site contains content that:
  1. Incites or glorifies violence,
  2. Promotes extremist acts,
  3. Celebrates terrorist attacks.

Medical content: This manual action may be triggered if your site contains content that contradicts or undermines medical consensus, for example, by denying the existence of dangerous diseases.

Manipulated media: This manual action may be triggered if your site contains fake news, misrepresented events, or otherwise deceptive content.

Misleading content: This manual action may be triggered if your site uses techniques like clickbait to engage users with the content but doesn’t deliver what was promised.

Transparency violation: This manual action may be triggered if your site fails to provide information such as the date of publication, the author, and the sources of its articles.

Vulgar language and profanity: This manual action may be triggered if your site contains offensive content.

What is a reconsideration request, and how to file it?

After examining the reasons for your website’s manual penalty and fixing any discovered problems, you are only one step away from regaining visibility and traffic. You need to return to the Manual Actions report and press the “request review” button.

This opens the way to submitting a reconsideration request and informing Google that every error and oversight has been taken care of and your website is ready to get back in the game.

Remember not to request a review until you have finished the repairs. Google expects thoroughness, and rushing won’t speed up the penalty removal.

What should be included in a proper reconsideration request?  

When you submit a reconsideration request, you can try to think like your website’s attorney. Your role is to notify and convince Google that all guideline violations have been addressed. Then you need to make Google believe that you won’t go back to old habits and will avoid unwelcome practices in the future.

For this purpose, you’ll have to present appropriate evidence to ensure you win the case. Here are some examples of details that you might want to include:

  1. List of websites you contacted to clean your backlink profile,
  2. If applicable, the information that the SEO agency you hired used unethical methods, so you terminated the collaboration,
  3. If applicable, the information that the violation of the guidelines resulted from mistakes made by your employees, so you put proper SEO training programs in place,
  4. If applicable, the information that you just purchased the website and the violations resulted from the previous owner’s actions, which you don’t intend to repeat.

Best practices for reconsideration requests

Including as many details as possible in your reconsideration request is, of course, a good instinct. Regardless of how much material you have, try to include all the information in the text of the request itself. Alternatively, you can link to a Google Docs or Google Sheets file.

Links to other places may not be opened because reconsideration requests are read by human workers, and Google is concerned about their data security.

The tone you use matters. Although you may feel treated unfairly after some manual actions, it’s better to look for solutions than to argue. It doesn’t pay off for a player sent off the field to argue with the referee.

After reviewing your reconsideration request, the person handling your website’s case will want to investigate the pages affected by the manual action, so make sure they’re not blocked from crawling by robots.txt.
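You can check the affected URLs against your robots.txt with Python’s standard library before you hit the button. A minimal sketch (the URLs are placeholders for the pages listed in your Manual Actions report):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://yourpage.com/robots.txt")
parser.read()

# URLs that were listed in the Manual Actions report (placeholders).
affected_urls = [
    "https://yourpage.com/category/soccer-boots/",
    "https://yourpage.com/blog/some-article/",
]

for url in affected_urls:
    if not parser.can_fetch("Googlebot", url):
        print(f"Blocked for Googlebot by robots.txt: {url}")
```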

How long does manual action last?

Unfortunately, each day of manual action means reduced website traffic and losses for your business. Therefore, you may wonder how long a penalty can last.

There’s no time limit to manual action. The process of fixing the problems and lifting the penalty can take days, months, or sometimes more than a year. But your hard work and well-written reconsideration request will undoubtedly shorten the period. 

Much depends on how thoroughly you investigate the problem and how willing you are to admit your mistakes.

Can manual actions expire?

Surprisingly, manual actions can expire by themselves. You can find confirmation of this fact in John Mueller’s tweet:

Yes, manual actions expire after time. Often things change over the years, so what might have required manual intervention to solve/improve back then, might be handled better algorithmically nowadays.
source: John Mueller

Sometimes, when a manual action expires even though the webmaster didn’t fix the problem, it’s because an algorithmic solution has replaced the human intervention.

Manual action removal vs. traffic recovery

Having a manual action removed isn’t equivalent to returning to the state before it was imposed. The recovery takes longer because you need to show Google once more that your website is worth ranking.

There is no way to tell how long it will take your website to regain its rankings and traffic. It depends on the scope of the penalty: if it affected just a few pages, recovery might take a few weeks or a month; if it affected your whole website, it can take more time.

An increase in traffic by several percent is already a good reason to celebrate. From a positive perspective, rebuilding your visibility after manual action is an opportunity to take care of your SEO strategy with more experience and knowledge of potential risks.

It’s also worth being prepared for the fact that the penalty removal may not take place right after the first reconsideration request but after a few rounds of replies and corrections.

Wrapping up

With manual actions, Google lowers the visibility and rankings of websites that don’t follow its guidelines. You’d receive the Manual Actions report in your Google Search Console if such a penalty hit your pages.

There are many types of manual actions as they respond to various problems and violations. The role of the webmaster is to fix these errors and submit a reconsideration request to Google.

The traffic on your website will not increase dramatically right after you submit a reconsideration request. Still, thoroughness during the fixing process will speed up the recovery and pay off for your website in the long run.

Don’t let a Google penalty ruin your online presence. Take advantage of our Google penalty recovery services to get your website back on track and regain your rankings.

Hi! I’m Bartosz, founder and Head of SEO @ Onely. Thank you for trusting us with your valuable time, and I hope that you found the answers to your questions in this blog post.

In case you are still wondering how exactly to move forward with fixing your website’s technical SEO, check out our services page and schedule a free discovery call where we will do all the heavy lifting for you.

Hope to talk to you soon!