Why is Google not deindexing pages with the meta noindex tag?
-
On our website www.keystonepetplace.com we added the meta noindex tag to category pages that were created by the sorting function.
Google no longer seems to be adding more of these pages to the index, but the pages that were already indexed still show up when I check via site:keystonepetplace.com.
Here is an example page: http://www.keystonepetplace.com/dog/dog-food?limit=50
How long should it take for these pages to disappear from the index?
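For reference, the tag being discussed is the standard robots meta tag placed in the head of each page. The exact variant used on the site isn't shown in this thread; a common form is:

```html
<!-- In the <head> of each sorted category page; the "follow" part lets
     link equity keep flowing while the page itself is deindexed -->
<meta name="robots" content="noindex, follow">
```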
-
Google might have already crawled the pages but not yet updated its index. Be patient; if you have enough links coming in and the pages are less than three levels deep, they will all be recrawled and processed in no time.
-
I guess it depends on the urgency of your situation. If you were just trying to clean things up, then it's okay to wait for Google to re-crawl and solve the problem. But if you have been affected by Panda and your site is not ranking, then I personally would consider that an urgent enough need to use the tool.
-
This link almost makes it seem like I shouldn't use the Webmaster Tools removal tool.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1269119
-
The crawlers have so many billions of webpages to get to. We have more than 50,000 on our site; there are about 8,000 that they check more regularly than the others - some are just really deep on the site and hard to get to.
-
You can remove entire category directories from the index in one command using the tool. But the urls won't be removed from the cache, just the index. To remove them from the cache you'll need to enter each url individually. I think that if you are trying to clear things up for Panda reasons, just removing from the index is enough. However, I'm currently trying to decide if it will speed things up to remove from the cache as well.
-
Ok. That makes sense. I wonder why it takes so long? I'll start the long process of the manual removal.
-
Streamline Metrics has got it right.
I've seen pages take MONTHS to drop out of the index after being noindexed. It's best to use the URL removal tool in WMT (not to be confused with the disavow tool) to tell Google to not only deindex the pages but to remove them from the cache as well. I have found that when you do this the pages are gone within 12 hours.
-
In your experience how long does this normally take?
-
Yes, it was around December 2nd or 3rd that we added the noindex tags. It just seemed like Google wasn't removing any pages from the index yet. It did stop Google from adding more of these pages, though.
-
It all depends on how long it takes Google to re-crawl those pages with the noindex tag on them.
I would do this, along with the steps you have already taken, to help speed the process up if you are in a hurry:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1663419
-
Do you know when you added the noindex tags? Google will need to recrawl the pages to see the noindex tags before removing them. I just looked at one of your category pages, and it looks like it was cached by Google on December 1st, and there was no noindex tag on that page. How big your site is and how often it is crawled will determine when the pages are removed from the index. Here's Google's official explanation -
"When we see the noindex meta tag on a page, Google will completely drop the page from our search results, even if other pages link to it. Other search engines, however, may interpret this directive differently. As a result, a link to the page can still appear in their search results.
Note that because we have to crawl your page in order to see the noindex meta tag, there's a small chance that Googlebot won't see and respect the noindex meta tag. If your page is still appearing in results, it's probably because we haven't crawled your site since you added the tag. (Also, if you've used your robots.txt file to block this page, we won't be able to see the tag either.)
If the content is currently in our index, we will remove it after the next time we crawl it. To expedite removal, use the URL removal request tool in Google Webmaster Tools."
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
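Since the usual culprit is that the tag isn't actually being served (or Googlebot is blocked from seeing it), it's worth verifying what the page really returns. Here's a minimal self-check sketch in Python, standard library only; the URL in the comment is just the example page from this thread:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots" ...> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "robots":
                self.robots_directives.append((d.get("content") or "").lower())

def has_noindex(html: str) -> bool:
    """True if the page's robots meta tag includes a noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in content for content in parser.robots_directives)

# Example (uncomment to check a live page):
# html = urlopen("http://www.keystonepetplace.com/dog/dog-food?limit=50").read().decode("utf-8", "replace")
# print(has_noindex(html))
```

Remember that this only shows what a regular client receives; Googlebot still has to recrawl the page before the tag takes effect.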
-
Or use canonical tags, or block via robots.txt.
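For completeness, the rel=canonical alternative points each sorted/filtered URL at its unfiltered parent; the URLs below are just this thread's example page:

```html
<!-- On http://www.keystonepetplace.com/dog/dog-food?limit=50 -->
<link rel="canonical" href="http://www.keystonepetplace.com/dog/dog-food">
```

One caution with the robots.txt route: blocking the pages prevents Googlebot from recrawling them, so it will never see a noindex tag on them (as the Google help text quoted above points out).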