How to handle a pure spam penalty (from GWT) as a blogging platform
-
Hello,
I run a blogging platform which spammers unfortunately used a few years ago to create spam blogs. Since then, we've added spam filters, and while I can't guarantee there isn't a single spam blog left, I can say that most of the blogs are clean.
The problem is, in Google Webmaster Tools, we have a "Pure spam" message on the Manual Actions page (https://support.google.com/webmasters/answer/2604777?hl=en), with a list of 1,000 blog links.
All these blogs have been marked as spam in our system for at least a year. Technically, this means they return a 410 status code and display something like "This blog doesn't meet our quality requirements."
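Roughly, the handling looks like this (the blog slugs and lookup function below are made up for illustration, not our actual code):

```python
SPAM_BLOGS = {"spamblog1", "spamblog2"}  # hypothetical flagged slugs

def is_flagged_as_spam(slug):
    """Look up whether a blog has been marked as spam in our system."""
    return slug in SPAM_BLOGS

def render_blog(slug):
    """Placeholder for normal blog rendering."""
    return f"<html>...content of {slug}...</html>"

def respond(slug):
    """Return (status_code, body) for a blog request.

    A flagged blog gets exactly one response: a 410 with an
    explanatory body, instead of its normal content.
    """
    if is_flagged_as_spam(slug):
        return 410, "This blog doesn't meet our quality requirements."
    return 200, render_blog(slug)
```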
When I first saw the manual action message in GWT, I filed a reconsideration request. Google answered within a week saying that they had checked our website again, but when I went back to the Manual Actions page, there was still a "Pure spam" message, with a different list of blogs, all of which had already been marked as spam for at least a year.
What should I do? Keep filing reconsideration requests for as long as Google answers?
Thank you in advance,
-
Hi Imran,
Instead of adding to this thread, I think it would be better to start a new question about how to check a site for duplicate content. Thanks!
-
Hello Marie, their URLs still exist but the spam content isn't displayed.
Here is what happens when you visit a blog flagged as spam:
- it returns a 410 status code
- then a 301 redirect leads to a best-practices page which explains why the blog has been disabled
- that page is disallowed in robots.txt and carries noindex, nofollow meta tags, with no link to the original website
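One thing worth checking here: a single HTTP response can only carry one status code, so a URL cannot return a 410 and a 301 at the same time. A quick sketch to see which status the server actually sends first, without following redirects (the helper names are mine, and it's equivalent to `curl -sI -o /dev/null -w '%{http_code}' <url>`):

```python
import http.client
from urllib.parse import urlparse

def first_hop_status(url):
    """Status code of the first response, redirects not followed."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection
                if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    status = conn.getresponse().status
    conn.close()
    return status

def signals_gone(status):
    """True when the status tells crawlers the page is permanently gone."""
    return status in (404, 410)
```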
Is this good?
Thanks
-
These blogs are marked as spam, but do they still exist at all? I mean, if you type in the URL, are the pages live? If so, they're still passing PageRank. Is there a way to completely remove the pages? Somehow Google is still seeing them.
-
404s. Remove them from existence.
Why would you keep content that is pure spam on the site? If it is spam, delete it.
-
Throw that spam in the 410 can. It lets the crawlers know it's gone for good.
-
Hello Federico,
Thank you for your quick reply.
When you say "Clean ALL the blogs, remove any trace of spam":
- what is the best way to do this: a 410 or a 404?
- if a spam blog has a meta noindex tag, will Google still consider it?
I will keep you updated on future developments.
-
Steps you should follow:
- Clean ALL the blogs, remove any trace of spam (document everything in the process)
- Go back to the first point and make sure you have NO SPAM left (again, if anything comes up, document the changes you make)
- Once you are completely certain that there's no spam left, you can send another reconsideration request; make sure you show them the work you have done to clean the site.
- Wait for their response, and if you still get a negative, repeat the process, as most likely you still have spam on your site.
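The "document everything" part of the steps above can be sketched roughly like this: audit the flagged blog URLs, record the status each one actually serves, and attach the report to the reconsideration request. The function names and CSV format are just illustrative, not any official process:

```python
import csv

def audit(urls, fetch_status):
    """Rows of (url, status, serves_gone); serves_gone means 404 or 410.

    fetch_status is injected so the audit logic can be tested without
    hitting the network.
    """
    rows = []
    for url in urls:
        status = fetch_status(url)
        rows.append((url, status, status in (404, 410)))
    return rows

def write_report(rows, path="spam_audit.csv"):
    """Write the audit rows to a CSV for documentation purposes."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "status", "serves_gone"])
        writer.writerows(rows)
```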
Hope that helps!