De-indexing thin content & Panda: any advantage to immediate de-indexing?
-
We added the noindex, follow tag to several hundred URLs on our site about a week ago, and they are still in Google's index. I know de-indexing takes time, but I am wondering whether having those URLs in the index will continue to "pandalize" the site. Would it be better to use the URL removal request, or should we just wait for the noindex tags to remove the URLs from the index?
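For reference, the standard form of the tag we added looks like this:

```html
<!-- In the <head> of each thin-content page -->
<meta name="robots" content="noindex, follow">
```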
-
Whenever Matt Cutts discusses this subject in the Webmaster Tools videos and elsewhere, there is always a caveat along the lines of "while Google mostly takes notice of noindex and robots.txt, this may not always be acted upon". The primary reason given for this seems to be content that is indexed via a link from another site, or that exists in Google's cache. In those cases it seems logical that it may continue to appear in the index.
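One related caveat worth spelling out: noindex and robots.txt don't combine well during a de-indexing effort. If a URL is disallowed in robots.txt, Googlebot can't crawl it and so never sees the noindex tag, and the URL can linger in the index as a bare listing. A minimal illustration (the path is hypothetical):

```text
# robots.txt - the pattern to AVOID while waiting for noindex to work:
# disallowing the crawl means Googlebot never re-fetches the pages
# and never sees the noindex tag.
User-agent: *
Disallow: /thin-content/
```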
Your question reminded me of Dr. Pete's Catastrophic Canonicalization Experiment; it seems his method proved quite effective.
-
Hey
I don't think it would make a great deal of difference, as you are going to need to wait for a full crawl of your site anyway before you see any benefit.
Out of interest, how are you identifying the low-quality pages? One way to have a go at this is to use your analytics to identify all pages with a 100% bounce rate and noindex all of them; a rough sketch of that workflow is below. If there are lots (it sounds like there are), you can do them in chunks and see what happens.
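This sketch assumes a CSV export from your analytics tool; the file and column names are hypothetical, so adjust them to whatever your tool actually exports:

```python
import csv

# Rough sketch: list pages with a 100% bounce rate from an analytics
# export. The column names ("page", "bounce_rate") are assumptions;
# adjust them to match your own export.
def pages_to_noindex(csv_path, bounce_threshold=100.0):
    candidates = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row["bounce_rate"]) >= bounce_threshold:
                candidates.append(row["page"])
    return candidates

if __name__ == "__main__":
    # Review this list by hand before noindexing anything.
    for url in pages_to_noindex("analytics_export.csv"):
        print(url)
```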
Don't get rid of pages that are getting good search traffic or have a low bounce rate UNLESS you know they are genuinely poor pages, as sooner or later they will be picked up.
Ultimately, it sounds like a big site so you are going to have to be patient here and make incremental changes based on analytical and crawl data until you get the results you are looking for.
I have pulled a site back from the depths before: a rather unfairly punished site, in my opinion, that just had its content copied by several other sites. The same rules applied. We updated pages, moved blocks of template content onto their own pages, and just kept on watching. Like magic, it came back stronger than before, a week or so after we made all the changes.
Hope this helps!
Marcus -
You want to be a bit more patient. Depending on how popular and deep these pages are within your site, I would expect it to take several weeks to see most of them disappear. There is a good chance that, if you check, you will find a percentage of those pages disappearing each day.
The URL removal tool is for removing content which you consider harmful to your business. Of course, any damage to your SEO rankings could be considered harmful, but that is clearly not what Google means. If you use the tool, they clearly explain that it is for pages which urgently need to be removed for legal reasons, copyright issues, etc.
Related Questions
-
Only a fraction of the AMP pages are indexed
Back in June, we saw a sharp drop in traffic on our website. We initially assumed that it was due to the Core Update that was rolled out in early June. We had switched from http to https in May, but thought that should have helped rather than caused a problem; until early June the traffic was trending upwards. While investigating the issue, I noticed that only a fraction (25%) of the AMP pages have been indexed. The pages don't seem to be getting indexed even though they are valid. According to Google Analytics too, the percentage of AMP traffic has dropped from 67-70% to 40-45%, and I wonder if that is due to the indexing issue. In terms of implementation it seems fine: we are pointing canonical to the AMP page from the desktop version and to the desktop version from the AMP page. Any tips on how to fix the AMP indexing issue? Should I be concerned that only a fraction of the AMP pages are indexed? I really hope you can help in resolving this issue.
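For what it's worth, the standard AMP pairing is one-directional: only the AMP page carries rel="canonical" (pointing back to the desktop URL), while the desktop page references its AMP version with rel="amphtml". Canonical tags pointing in both directions, as described above, can confuse indexing. A minimal sketch with placeholder URLs:

```html
<!-- On the desktop page -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP page -->
<link rel="canonical" href="https://www.example.com/article/">
```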
Technical SEO | | Gautam1 -
Pages not indexed
Hey everyone. Despite doing the necessary checks, we have a problem: only part of the sitemap is indexed.
We don't understand why this indexation isn't happening. For a client we have several projects on the website, each with 5 to 6 subpages, but only a few of these subpages are indexed; they should all be indexed. Project: https://www.brody.be/nl/nieuwbouwprojecten/nieuwbouw-eeklo/te-koop-eeklo/ Mainly sub-elements of the page are indexed: https://www.google.be/search?source=hp&ei=gZT1Wv2ANouX6ASC5K-4Bw&q=site%3Abrody.be%2Fnl%2Fnieuwbouwprojecten%2Fnieuwbouw-eeklo%2F&oq=site%3Abrody.be%2Fnl%2Fnieuwbouwprojecten%2Fnieuwbouw-eeklo%2F&gs_l=psy-ab.3...30.11088.0.11726.16.13.1.0.0.0.170.1112.8j3.11.0....0...1c.1.64.psy-ab..4.6.693.0..0j0i131k1.0.p6DjqM3iJY0 Do you have any idea what is going wrong here?
Thanks for your advice! Frederik, digital marketeer at Conversal
Technical SEO | | conversal0 -
Site hit by algorithmic update in October 2014 - filters and thin content queries.
Back in October 2014, a site we are working with had a significant drop in organic traffic. This coincided with Google's algorithmic update. The site in question uses filters extensively and at the time did not have any canonical tags in place. The lion's share of these filter pages had little or no written content, just products. The website now has canonical tags throughout, and content has started to be added to the top-level categories; we will continue to add more, but there is still a large number of pages with little or no content. Webmaster Tools shows large numbers of internal links (for instance 42,000+ to the homepage), which must be due to the filtered pages. I am looking for advice on the best way to proceed. Do I edit robots.txt, start adding nofollow tags, or something else entirely?
Technical SEO | | bfinternet0 -
Duplicate page content
Hello, my site is being checked for errors by the PRO dashboard you get here, and some odd duplicate content errors have appeared. Every page has a duplicate, because you can see the page and the same page under /~username, so... www.short-hairstyles.com is the same as www.short-hairstyles.com/~wwwshor. I don't know if this is a problem or how the crawler found this (I'm sure I have never linked to it), but I'd like to know how to prevent it in case it is a problem, if anyone knows. Ian
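One common cause is an Apache-style ~username alias for the account's home directory, and one common fix is a site-wide 301 back to the canonical paths. A hedged .htaccess sketch (assumes Apache with mod_rewrite enabled; test before deploying):

```apache
# Sketch: 301-redirect the ~wwwshor mirror back to the real URLs.
RewriteEngine On
RewriteRule ^~wwwshor/?(.*)$ /$1 [R=301,L]
```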
Technical SEO | | jwdl0 -
Link Structure & Duplicate Content
I am struggling with how I should handle the link structure on my site. Right now most of my pages are like this:
Home -> Department -> Service Groups -> Content Page
For example:
Home -> IT Solutions -> IT Support & Managed Services -> IT Support
Home -> IT Solutions -> IT Support & Managed Services -> Managed Services
Home -> IT Solutions -> IT Support & Managed Services -> Help Desk Services
Home -> IT Solutions -> Virtualization & Data Center Solutions -> Virtualization
Home -> IT Solutions -> Virtualization & Data Center Solutions -> Data Center Solutions
This structure lines up with our business and makes logical sense, but I am not sure how to handle the department and service group pages. Right now, clicking them just brings you to a page with a small snippet for each of the links below; the real content is on the content pages. What I am worried about is that the snippets on those pages are just a paragraph or two of the content that's on the content page. Will this hurt me by being treated as duplicate content? What is the best practice for dealing with this? Those department/service group pages have some good content on them, but it's just parts of other pages. Am I okay doing this, given that there are no direct duplicates of other pages, just parts of a few pages? Any help on this would be great. Thanks in advance.
Technical SEO | | ZiaTG0 -
Microsite & Duplicate Content Concern
I have a client who wants to put up a micro-site. It's not really even a niche micro-site; it's his whole site less a category and a few other pages. He is a plastic surgeon who offers cosmetic surgery services for the Face, Breast, and Body at his private practice in City A. He has partnered with another surgeon in City B whose surgical services are limited to the Face only. City B is nearby, but not so close that they consider themselves competitors for Facial surgery. The doctors' agreement is that my client will perform only Breast and Body surgery at the City B location. He can market himself in City B (which he currently is not doing on his main site) but only for Breast and Body procedures, and he is not to compete for Facial surgery. Therefore, he needs this second site to exclude content about Facial surgery. My concern is duplicate content. His requested plan: the micro-site will be on a different domain and C-block, and the content, location keywords, and meta data will be completely re-written to target City B. However, he wants to use the same theme as his main site: same source code, HTML/CSS, same top-level navigation, same sub-navigation less the Face section, same images/graphics, same forms, etc. Is it okay to have the same exact site build on a different domain with rewritten copy (less a few pages) to target the same base keywords with only a different location? The site is intended for a different user group in City B, but I'm concerned the search engines won't like this and will trigger the filters. I've read a bunch of duplicate content articles, including this post-Panda piece by Dr. Pete. Great post, but it doesn't really answer this particular issue of duplicating code for a related site. Can anyone make a case for or against this? Thanks in advance!
Technical SEO | | cmosnod0 -
Over 1,000 pages de-indexed overnight
Hello, on my site (www.bridgman.co.uk) we had a lot of duplicate page issues, as reported by the SEOmoz site report tool; this was due to database-driven URL strings. As a result, I sent an Excel file with all the duplicate pages to my web developer, who put rel canonical tags on what I assumed would be all the correct pages. I am not sure if this is a coincidence or a direct result of the canonical tags, but a few days after (yesterday) the number of pages indexed by Google dropped from 1,200 to under 200. The number is still declining, and other than the canonical tags I can't work out why Google would just start de-indexing most of our pages. If you could offer any solutions, that would be greatly appreciated. Thanks, Robert.
Technical SEO | | 87ROB0 -
Duplicate content
This is just a quickie: on one of my campaigns in SEOmoz I have 151 duplicate page content issues! Ouch! On analysis, the site in question has duplicated every URL with "en", e.g. http://www.domainname.com/en/Fashion/Mulberry/SpringSummer-2010/ http://www.domainname.com/Fashion/Mulberry/SpringSummer-2010/ Personally, my thoughts are that a rel=canonical will sort this issue, but before I ask our dev team to add it (and get various excuses why they can't), I wanted to double-check that I am correct in my thinking. Thanks in advance for your time.
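On the rel=canonical point: yes, a canonical tag on each /en/ variant pointing at the non-/en/ URL is the usual fix. Using the example URLs from the question:

```html
<!-- In the <head> of http://www.domainname.com/en/Fashion/Mulberry/SpringSummer-2010/ -->
<link rel="canonical" href="http://www.domainname.com/Fashion/Mulberry/SpringSummer-2010/">
```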
Technical SEO | | Yozzer0