Meta NOINDEX... how long before Google drops dupe pages?
-
Hi,
I have a lot of near-duplicate content caused by URL parameters, so I have applied a meta noindex tag to the affected pages.
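The tag I've added is the standard robots noindex directive, placed in the head of each parameter URL:

```html
<!-- In the <head> of each parameter-generated duplicate page -->
<meta name="robots" content="noindex">
```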
How long will it take for this to take effect? It's been over a week now, and although I have removed some pages with the GWT removal tool, I still haven't seen a major drop in indexed pages.
Any ideas?
Thanks,
Ben
-
In his case, he only wants to get rid of some duplicate content.
I see what you mean, but if he is not in the situation listed in http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1269119, then it might still be the best and fastest bet.
For me personally it has worked very well so far, provided robots.txt is not used to block the pages: blocking won't help in the long run, because a removal request expires after several months, and a page blocked by robots.txt can never show Googlebot the noindex tag.
The downside of the removal tool is that same expiration date: if you change your mind later, you will have some trouble getting the pages back into the index.
-
You know that I think you are the bee's knees, but I am going to have to disagree on this one. Even Google does not recommend using the removal tool for this purpose.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1269119
Still pals?
-
There are several things that you can do to get Google to crawl your site (or your new content) quicker and more often. You should be doing all of these, but in case you're not, here is the list.
-
Create an XML sitemap and submit it through Google Webmaster Tools.
-
Install Google Analytics
-
Create social accounts, or update your existing ones.
-
Use Fetch as Google in Google Webmaster Tools.
-
Update your content more often (to get Google to crawl your site more frequently).
-
Adjust the crawl rate in Google Webmaster Tools.
-
Check the crawl errors report in Google Webmaster Tools. Are there server-side errors (500)? See the quick check below.
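If you want to spot-check a URL for server-side errors without waiting on the GWT report, a simple curl call will print just the HTTP status code (the example.com URL is a placeholder for one of your own pages):

```bash
# Prints only the HTTP status code; 500 indicates a server-side error
curl -s -o /dev/null -w "%{http_code}\n" "http://example.com/some-page"
```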
I hope that helps!
-
Hi,
The best bet is the removal tool in GWT - this is the fastest way.
If your pages are static and Googlebot only visits them once a month, or once every 4-6 months, you will need to wait until Googlebot visits those pages again, notices the noindex, and drops them from the index.
I've seen cases take 6 months.
In any case, you will probably see those pages drop out step by step.
What you can try, although it is not very straightforward, is to build an XML sitemap containing only those files and submit it via GWT. Sometimes Googlebot will decide something new has happened, visit those pages, see the noindex, and speed up the process - but not always; I've seen cases where this didn't work and cases where it did.
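As a minimal sketch, such a sitemap would list nothing but the noindexed URLs (the example.com entries are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the parameter URLs you want recrawled and dropped -->
  <url><loc>http://example.com/products?sort=price</loc></url>
  <url><loc>http://example.com/products?sort=name</loc></url>
</urlset>
```

Submit it as a separate sitemap in GWT so it nudges Googlebot toward exactly those URLs without touching your main sitemap.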
Again, the best bet will be the GWT removal tool.
Cheers.