De-indexing thin content & Panda--any advantage to immediate de-indexing?
-
We added the noindex, follow tag to several hundred URLs on our site about a week ago, and they are still in Google's index. I know de-indexing takes time, but I am wondering if having those URLs in the index will continue to "pandalize" the site. Would it be better to use the URL removal request? Or should we just wait for the noindex tags to remove the URLs from the index?
-
Whenever Matt Cutts discusses this subject in the Webmaster Tools videos and elsewhere, there is always a caveat along the lines of "while Google mostly takes notice of noindex and robots.txt, this may not always be acted upon". The primary reason given seems to be content that is indexed via a link from another site, or that exists in Google's cache. In those cases it seems logical that it may continue to appear in the index.
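Before waiting it out, it may also be worth confirming that the tag is actually being served on every one of those URLs, and that robots.txt isn't blocking Google from crawling them (a blocked page never gets the chance to show Google the noindex). A minimal sketch of that check, assuming Python with the requests and beautifulsoup4 packages and a hypothetical urls.txt listing the pages in question:

```python
# Quick check that the noindex directive is actually being served,
# either as an X-Robots-Tag header or as a robots meta tag.
# Assumes the requests and beautifulsoup4 packages are installed and
# that urls.txt (hypothetical) lists one URL per line.
import requests
from bs4 import BeautifulSoup

def noindex_status(url):
    resp = requests.get(url, timeout=10)
    # The X-Robots-Tag HTTP header works even on non-HTML resources
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return "noindex (header)"
    # Otherwise look for <meta name="robots" content="noindex, follow">
    soup = BeautifulSoup(resp.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    if meta and "noindex" in (meta.get("content") or "").lower():
        return "noindex (meta tag)"
    return "no noindex found"

with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        print(url, "->", noindex_status(url))
```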
Your question reminded me of Dr. Pete's Catastrophic Canonicalization Experiment; it seems his method proved quite effective.
-
Hey
I don't think it would make a great deal of difference as you are going to need to wait for a full crawl of your site anyhow before you see any benefits.
Out of interest, how are you identifying the low-quality pages? One way to have a go at this is to use your analytics to identify every page with a 100% bounce rate and noindex all of them. If there are lots (it sounds like there are), you can do them in chunks and see what happens.
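If there are hundreds of them, that filtering step is easy to script. A rough sketch, assuming a hypothetical analytics_pages.csv export with "Page" and "Bounce Rate" columns:

```python
# Rough sketch: pull every page with a 100% bounce rate out of an
# analytics CSV export and split the list into batches, so each chunk
# can be noindexed and monitored before moving on to the next.
# analytics_pages.csv is hypothetical, with "Page" and "Bounce Rate" columns.
import csv

BATCH_SIZE = 100

def bounce_rate(value):
    # Exports usually format this like "100.00%"
    return float(value.strip().rstrip("%") or 0)

with open("analytics_pages.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Keep only the pages where every visit bounced
candidates = [row["Page"] for row in rows if bounce_rate(row["Bounce Rate"]) >= 100]

# Work through them in chunks so the effect of each batch can be watched
batches = [candidates[i:i + BATCH_SIZE] for i in range(0, len(candidates), BATCH_SIZE)]
for n, batch in enumerate(batches, start=1):
    print(f"Batch {n}: {len(batch)} pages to noindex")
    for page in batch:
        print("  ", page)
```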
Don't get rid of pages that are bringing in good search traffic or have a low bounce rate UNLESS you know they are genuinely poor pages, as sooner or later they will be picked up.
Ultimately, it sounds like a big site so you are going to have to be patient here and make incremental changes based on analytical and crawl data until you get the results you are looking for.
I have pulled a site back from the depths, a rather unfairly punished site in my opinion that just had its content copied by several other sites, but the same rules applied. We updated pages, moved blocks of template content onto their own pages, and just kept watching, and like magic it came back stronger than before, a week or so after we made all the changes.
Hope this helps!
Marcus -
You want to be a bit more patient. Depending on how popular and deep these pages are within your site, I would expect it to take several weeks to see most of them disappear. There is a good chance that, if you check, you will find a percentage of those pages disappearing each day.
The URL removal tool is for removing content which you consider harmful to your business. Of course, any damage to your SEO rankings could be considered harmful, but that is clearly not what Google means. If you use the tool, they clearly explain it is for pages which "urgently" need to be removed due to legal reasons, copyright issues, etc.
Related Questions
-
Paginated pages are being indexed?
I have lots of paginated pages which are being indexed. Should I add the noindex tag to page 2 onwards? The pages currently have previous and next tags in place. Page one also has a self-referencing canonical.
Technical SEO | | WTH0 -
Why are only a few of our pages being indexed
Recently rebuilt a site for an auctioneers; however, none of the lots and auctions on the new site are being indexed by Google, only pages like About, FAQ, Home and Contact. Checking WMT shows that Google has crawled all the pages, and I've done a "Fetch as Google" on them and they load up fine, so no crawling issue stands out. I've also set the "URL Parameters", to no effect. I built a sitemap with all the lots in and pushed it to Google, which then crawled them all (a massive spike in crawl rate for a couple of days), yet it is still only indexing a handful of pages. Any clues to look into would be greatly appreciated. https://www.wilkinsons-auctioneers.co.uk/auctions/
Technical SEO | | Blue-shark0 -
Homepage/Root domain de-indexed by Google
This morning I discovered that the homepage/root domain of our company site, http://www.collegeplus.org/, has been de-indexed by Google and Bing. Our IT dept. is claiming it's our fault because we changed the meta title on our homepage, but they will not give me access to GWT to see if there are any issues. I believe the issue lies within our robots.txt file - http://www.collegeplus.org/robots.txt. I also don't believe we're suffering a penalty, because all of our tier 2 pages are still indexed when any type of branded search is performed. We don't do things that can get a site de-indexed like this. Any ideas on what the issue may be? Or at least something to convince our IT dept. that simply changing a meta title won't get your homepage totally de-indexed? Thanks.
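For what it's worth, the robots.txt theory is quick to test with Python's standard-library robot parser; a minimal sketch using the URLs above (the user agent strings are just the obvious candidates to check):

```python
# Minimal sketch: check whether robots.txt blocks crawlers from the homepage.
# Uses only the Python standard library.
from urllib.robotparser import RobotFileParser

ROBOTS_URL = "http://www.collegeplus.org/robots.txt"
HOMEPAGE = "http://www.collegeplus.org/"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

for agent in ("Googlebot", "bingbot", "*"):
    allowed = parser.can_fetch(agent, HOMEPAGE)
    print(f"{agent}: homepage is {'allowed' if allowed else 'BLOCKED'}")
```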
Technical SEO | | explorionary0 -
Linking to unrelated content
Hi, Just wanted to know: will linking to unrelated content harm the site? I know linking to unrelated content is not good, but I wanted to know whether there is any chance of harm. I have a site related to health and another related to technology. The technology site is very strong, with a PR 6 and very good backlinks, while the health site faces very tough competition. So I wanted to know whether I could link to the health site from the technology site to get a good link from it. Can you suggest anything about it? Waiting for your replies...
Technical SEO | | Dexter22387874870 -
I have a site that has both http:// and https:// versions indexed, e.g. https://www.homepage.com/ and http://www.homepage.com/. How do I de-index the https:// versions without losing the link juice that is going to the https://homepage.com/ pages?
I can't 301 https:// to http:// across the board, since there are some form pages that need to be https://. The site has 20,000+ pages, so individually 301ing each page would be a nightmare. Any suggestions would be greatly appreciated.
Technical SEO | | fthead90 -
Dealing with indexable Ajax
Hello there, My site is basically an Ajax application. We assume lots of people link into deep pages on the site, but bots won't be able to read past the hash marks, meaning all links appear to go to our home page. So we have decided to make our Ajax indexable. And so many questions remain. First, only Google handles indexable Ajax, so we need to keep our static "SEO" pages up for Bing and Yahoo. Bummer, dude, more to manage. 1. How do others deal with the differences here? 2. If we have indexable Ajax and static pages, can these be perceived as duplicate content? Maybe the answer is to disallow Googlebot from indexing the static pages we made. 3. What does your canonical URL become? Can you tell different search engines to read different canonical URLs? So many more questions, but I'll stop there. Curious if anyone here has thoughts (or experience) on the matter. Erin
Technical SEO | | ErinTM2 -
Index forum sites
Hi Moz Team, somehow the last question I raised a few days ago not only wasn't answered, it was also completely deleted and the credit was not "refunded" - obviously there was some data loss involved with your restructuring. Can you check whether you can still find the last question and answer it quickly? I need the answer 🙂 Here is one more question: I bought a website that has a huge forum with loads of pages of user generated content, overall around 500,000 threads with 9 million comments. The complete forum was noindex/nofollow when I bought the site, and now I am thinking about the best way to unleash its potential. The current system is vBulletin 3.6.10. a) Shall I first update vBulletin to version 4 and use the vSEO tool to make the URLs cleaner and more user and search engine friendly before I switch to index/follow? b) Would you recommend having the forum in the folder structure or on a subdomain? As far as I know, a subdomain takes less strength from the TLD; however, it is safer because the subdomain is seen as a separate entity from the regular TLD. Having it in the folder makes it easier to pass strength from the TLD to the forum; however, it puts my TLD at risk. c) Would you release all the forum pages at once or section by section? I think section by section looks rather unnatural, not only to search engines but also to users; however, I am afraid of blasting more than a million pages into the index at once. d) Would you index only the first page of a thread or all pages of a thread? I fear duplicate content, as the different pages of a thread contain different body content but the same title and possibly the same h1. Looking forward to hearing from you soon! Best Fabian
Technical SEO | | fabiank0 -
About duplicate content
Hi, I'm a new guy around here, but I'm having this problem on my website. Using the SEOmoz tools I ran a campaign for my website, and in the results I get too many errors for duplicate content, for example http://www.mysite/blue/ and http://www.mysite/blue/index.html. So my question is, what is the best way to resolve this problem: use a 301 or use the rel canonical tag? And which URL will be considered the main URL? Thanks for your help.
Technical SEO | | NorbertoMM0