How long does it take for a page to show up in Google results after removing noindex from the page?
-
Hi folks,
A client of mine created a new page and used a meta robots noindex tag so the page wouldn't show up before they were ready to launch it. The problem is that somehow Google crawled the page anyway, and now, after removing the meta robots noindex, the page still does not show up in the results.
We've tried crawling it using Fetch as Googlebot and then submitting it using the button that appears. We've also included the page in sitemap.xml and used the old Google submit-a-URL form at https://www.google.com/webmasters/tools/submit-url
Does anyone know how long it will take for Google to show the page AFTER the meta robots noindex is removed? Are there any reliable references for this? I couldn't find any Google video or post about it.
I know that it will appear within a few days, but I'd like to have a good reference for the future.
Thanks.
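P.S. For anyone checking the same thing, here's a rough way to confirm the noindex really is gone from the live page, both in the HTML and in the HTTP headers (a minimal standard-library Python sketch; the URL is a placeholder):

```python
# A rough check that the noindex is really gone from the live page,
# both in the HTML and in the HTTP headers. Standard library only;
# the URL below is a placeholder.
import re
import urllib.request

url = "http://www.example.com/new-page"  # replace with the real page

req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
with urllib.request.urlopen(req) as resp:
    x_robots = resp.headers.get("X-Robots-Tag", "")
    html = resp.read().decode("utf-8", errors="replace")

# Looks for <meta name="robots" content="... noindex ..."> with the usual
# attribute order; a real check would parse the HTML properly.
meta_noindex = re.search(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    html,
    re.IGNORECASE,
)

print("X-Robots-Tag header:", x_robots or "none")
print("meta robots noindex still present:", bool(meta_noindex))
```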
-
Just to let you know that the page was indexed in less than 24 hours. We didn't use Tony's tip (sharing on G+), but we did all of the following:
- Used the GWT Fetch as Googlebot tool
- Submitted the URL using the button that appears after fetching as Googlebot
- Included some sitewide links to the page
- Included the page in our sitemap.xml
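For the sitemap step, here's a rough way to double-check that the URL actually made it into sitemap.xml (a minimal standard-library sketch; both URLs are placeholders):

```python
# Quick sanity check that the new page is listed in sitemap.xml.
# Standard library only; both URLs are placeholders.
import urllib.request
import xml.etree.ElementTree as ET

sitemap_url = "http://www.example.com/sitemap.xml"
page_url = "http://www.example.com/new-page"

with urllib.request.urlopen(sitemap_url) as resp:
    tree = ET.parse(resp)

# <loc> elements live in the standard sitemap namespace.
loc_tag = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"
urls = [loc.text.strip() for loc in tree.iter(loc_tag) if loc.text]

print("Listed in sitemap:", page_url in urls)
```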
Thanks to everyone who added insights and tips!
-
Thanks for the tip, Tony! We haven't tried that yet.
-
It depends on the site. If the site is Microsoft.com with a link from the home page, you can expect it to appear the same day.
If it's on boringoldsite.com, it could take a week or more.
But mostly it's a few days.
-
You can do two things in Google Webmaster Tools to gauge how long it will take for a page to be indexed, or even to speed up re-indexing:
1. Use Google's crawl rate and indexation reports
2. Use Fetch as Googlebot
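One small addition: once the sitemap has been updated, you can also ping Google with its URL so the change is picked up sooner (a rough sketch assuming the standard google.com/ping?sitemap= endpoint; the sitemap URL is a placeholder):

```python
# Ping Google with the updated sitemap so the change is picked up sooner.
# Assumes the standard /ping?sitemap= endpoint; the sitemap URL is a placeholder.
import urllib.parse
import urllib.request

sitemap_url = "http://www.example.com/sitemap.xml"
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote_plus(sitemap_url)

with urllib.request.urlopen(ping_url) as resp:
    # A 200 here only means the ping was received, not that anything is indexed yet.
    print("Ping status:", resp.status)
```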
-
Hi Fabio,
Share the page in question on G+. Indexation of G+ posts (including links) can be as quick as half an hour. Also make sure the website is linked to from the client's main G+ profile as a custom link.
-
We had a subdomain website (very small... four or five pages) that was blocked via the robots.txt file for two or three years. When we decided to have it indexed, I did just what you did: fetched it via GWT and clicked the button to add it to the index. This worked, and then the next day... or maybe two days later, it was gone. I did this a couple of times...
It took about two weeks before it hit the index and stuck. But since then everything has been just fine.
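If anyone runs into the same situation, it's worth confirming the robots.txt block really is lifted before resubmitting; a minimal sketch using Python's built-in robotparser (the subdomain is a placeholder):

```python
# Confirm Googlebot is no longer blocked by robots.txt.
# Standard library only; the subdomain is a placeholder.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("http://sub.example.com/robots.txt")
rp.read()

print("Googlebot allowed on /:", rp.can_fetch("Googlebot", "http://sub.example.com/"))
```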
-
One of my competitors had a designer put a new look on their website. As soon as they uploaded it we went to the site to sniff the code. We saw that the developer left the "noindex" on all of the files. We laughed and laughed about that. Within a few days their entire site dropped out of search and it took them a couple weeks to figure out what happened while we enjoyed a big increase in sales. But, when they uploaded the site with the noindex removed, within a few days the pages were mostly back in search and two weeks later they were back to normal.
The amount of time required is influenced by how much spider activity the site receives. If your site has low PageRank and does not receive a lot of spider action, you can go much longer without being reindexed. Deep pages on a site without much spider action can take weeks to come back. The site in the example above is a PR6 site with mostly PR3 and PR4 pages.
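Spider activity is also easy to gauge from the raw access logs; here's a rough sketch that counts Googlebot hits per day (it assumes a common combined log format and a hypothetical access.log path):

```python
# Rough gauge of spider activity: count Googlebot hits per day in the access log.
# Assumes a combined log format with [day/month/year:time] timestamps and a
# hypothetical access.log path; spoofed user agents are not filtered out.
from collections import Counter

hits_per_day = Counter()
with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        # e.g. ... [12/Dec/2013:06:25:24 +0000] ... -> take "12/Dec/2013"
        try:
            day = line.split("[", 1)[1].split(":", 1)[0]
        except IndexError:
            continue
        hits_per_day[day] += 1

for day, hits in hits_per_day.most_common():
    print(day, hits)
```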