Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Google Indexing Request - Typical Time to Complete?
-
In Google Search Console, when you request the (re)indexing of a fetched page, what's the average amount of time it takes to re-index? Does it vary much from site to site, or are manual re-index requests put in a queue and served on a first-come, first-served basis regardless of site characteristics like domain/page authority?
-
I want to be clear that I'm not referring to a re-crawl, but a re-index. Now, I realize there are a gazillion ranking signals, and most of the stronger signals are probably not on-page signals (although page title, headers, and anchor text combined are probably a relatively strong signal), so in most situations on-page changes are only going to move you from, say, the middle of page 2 to the top 3 (except for obscure, low-competition long-tail keywords, of course).
So is there a delay between re-crawl and re-rank (I'll use that term instead of re-index)? I also realize the rank can change based on changes to the other sites in the SERPs. I suppose the re-rank delay could be verified by taking a 'sacrificial' page, totally changing the title, headings, and other on-page items to a completely different keyword theme, and seeing how long it takes for the rank to go down for the previous keyword theme and up for the new one.
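Sketching that sacrificial-page experiment in code - all dates and rank numbers below are purely hypothetical - the "crossover" check might look like this:

```python
from datetime import date

def first_crossover(observations):
    """Return the first date on which the new keyword theme
    outranks the old one (lower rank number = better position).

    observations: list of (date, old_theme_rank, new_theme_rank),
    assumed sorted by date; None means not ranking at all.
    """
    for day, old_rank, new_rank in observations:
        if new_rank is not None and (old_rank is None or new_rank < old_rank):
            return day
    return None

# Hypothetical daily checks after rewriting the page to the new theme:
obs = [
    (date(2018, 5, 1), 14, None),  # still ranks for the old theme only
    (date(2018, 5, 3), 18, 47),
    (date(2018, 5, 5), 31, 29),    # new theme overtakes the old one here
    (date(2018, 5, 7), None, 22),  # old theme gone, new theme established
]
print(first_crossover(obs))  # -> 2018-05-05
```

The gap between the date you requested re-indexing and the crossover date would be a rough measure of the re-rank delay for that page.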
I would think Google might well add a delay, even one of random length, to discourage people from repeatedly requesting re-indexing of a single page just to watch the rank change. Granted, any change would be small, since on-page signals, as I mentioned, are a sliver of the signal pie. So I'd guess most SEOs would consider this kind of trial and error a waste of time?
-
Hi SEO1805
I agree with Casey - if you go into your Google Search Console account, do a Fetch and Render to check the new/revised page, and then request indexing, you will normally see the results updated within a few hours.
Do realize, though, that Google is not a single server, so one person may see the updates very soon while others may not right away, as the search index propagates across Google's data centers.
Of course, Search Console will also tell you how many of your pages Google is visiting every day, so you have an idea of how often they think you are updating your content.
Take care,
Herb
-
Exactly what Logopedia y Más said!
I've just made some sitewide changes to a client site today (3-4 hours ago) and did a fetch straight after. Some are already reflected in the SERPs, while some aren't. So it really just depends; however, I find it typically doesn't take too long with the sites I've dealt with.
-
Hi SEO1805,
Simply put, indexing and ranking higher in Google isn't an exact science. There's really no set timetable for how quickly Google will index your new page.
It isn't guaranteed that the URL will be crawled again, or that it will happen immediately - it usually takes several days for a request to be acted on. Please also note that Google can't guarantee it will index all the changes made, since updating the indexed content depends on a complex algorithm.
Related Questions
-
How to stop URLs that include query strings from being indexed by Google
Hello Mozzers, Would you use rel=canonical, robots.txt, or Google Webmaster Tools to stop the search engines indexing URLs that include query strings/parameters - or perhaps a combination? I guess it would be a good idea to stop the search engines crawling these URLs, because the content they display will tend to be duplicate content of low value to users. I would be tempted to use a combination of canonicalization and robots.txt for every page I do not want crawled or indexed, yet perhaps Google Webmaster Tools is just as effective? And I suppose some use meta robots tags too. Does Google take a position on being blocked from web pages? Thanks in advance, Luke
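For what it's worth, the two mechanisms look like this (the domain and paths are placeholders). Note that you generally wouldn't combine them on the same URL: a robots.txt-blocked page can't be crawled, so Google never sees its canonical tag.

```
# robots.txt - block crawling of any URL containing a query string
User-agent: *
Disallow: /*?
```

```
<!-- or, on the parameterised page itself, consolidate instead of block -->
<link rel="canonical" href="https://example.com/category/widgets/">
```

The canonical route lets Google crawl the variants and fold their signals into one URL; the robots.txt route saves crawl budget but leaves any already-indexed variants in place.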
-
Google does not want to index my page
I have a site with hundreds of pages indexed in Google. But there is a page I put in the footer section that Google doesn't seem to like and won't index. I've tried submitting it through Google Webmaster Tools, and it will appear in Google's index, but then after a few days it's gone again. The page used to have a canonical meta tag pointing to another page, but that has been removed now.
-
Will disallowing URLs in the robots.txt file stop those URLs being indexed by Google
I found a lot of duplicate title tags showing in Google Webmaster Tools. When I visited the URLs these duplicates belonged to, I found that they were just images from a gallery that we didn't particularly want Google to index. There is no benefit to the end user in these image pages being indexed. Our developer has told us that these URLs are created by a module and are not "real" pages in the CMS. They would like to add the following to our robots.txt file: Disallow: /catalog/product/gallery/ QUESTION: If these pages are already indexed by Google, will this adjustment to the robots.txt file help remove them from the index? We don't want these pages to be found.
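As an aside on this question: a Disallow rule only stops crawling - if the gallery URLs are already indexed, Google can keep them in the index, since it can no longer re-read them to discover they should drop out. A pattern often suggested is to let Google recrawl the pages with a noindex directive first, and only add the Disallow once they have left the index:

```
<!-- step 1: on each gallery page, until the pages drop out of the index -->
<meta name="robots" content="noindex">
```

```
# step 2: robots.txt - added only after the pages are de-indexed
User-agent: *
Disallow: /catalog/product/gallery/
```

The order matters: if the Disallow goes in first, Google never sees the noindex.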
-
301s being indexed
A client website was moved about six months ago to a new domain. At the time of the move, 301 redirects were set up from the pages on the old domain to point to the same pages on the new domain. New pages were set up on the old domain for a different purpose. Now, almost six months later, when I do a query in Google on the old domain like site:example.com, 80% of the pages returned are 301 redirects to the new domain. I would have expected this to go away by now. I tried removing these URLs in Webmaster Tools, but the removal requests expire and the URLs come back. Is this something we should be concerned with?
-
How can I get a list of every URL of a site in Google's index?
I work on a site that has almost 20,000 URLs in its sitemap. Google WMT claims 28,000 indexed, and a search on Google shows 33,000. I'd like to find out what the difference is. Is there a way to get a spreadsheet with every URL Google has indexed for a site? Thanks... Mike
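Google doesn't expose a full dump of its index, but one side of the comparison is easy to script. A minimal sketch (assuming a standard XML sitemap; the URLs here are placeholders) that extracts the sitemap's URLs so they can be diffed against, say, a crawl export or a list gathered from Search Console:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> values from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

# Example with an inline sitemap; in practice you'd read the real file.
xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

in_sitemap = sitemap_urls(xml)
# Hypothetical list of URLs you found indexed (e.g. exported from a tool):
indexed = {"https://example.com/", "https://example.com/old-page"}

print(sorted(indexed - in_sitemap))  # indexed but missing from the sitemap
print(sorted(in_sitemap - indexed))  # in the sitemap but not indexed
```

The "indexed but missing from the sitemap" set is usually where the extra 8,000-13,000 URLs in a case like this come from (parameter variants, old pages, tag/archive pages, and so on).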
-
When should you redirect a domain completely?
We moved a website over to a new domain name. We used 301 redirects to redirect all the pages individually (around 150 redirects). So my question is, when should we just kill the old site completely and just redirect (forward/point) the old domain over to the new one?
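On the "redirect the whole domain" part of this question: once every page has its own 301, the old site is already fully covered, but a single catch-all rule is a common tidy-up. A hypothetical example for an Apache server with mod_rewrite enabled (domain names are placeholders):

```
# .htaccess on the old domain - 301 everything to the same path on the new one
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]
```

This preserves the path, so the 150 per-page redirects can eventually be retired as long as the URL structure on the new domain matches.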
-
How do you de-index and prevent indexation of a whole domain?
I have parts of an online portal displaying in SERPs that definitely shouldn't be there. It's due to thoughtless developers, but I need the whole portal's domain de-indexed and prevented from future indexing. I'm not too tech-savvy, but how is this achieved? Noindex? Robots? Thanks
-
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404s). Is there anything else I need to do to remove these?
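A 404 does eventually drop pages, but Google treats it as potentially temporary and may keep rechecking for months. A 410 (Gone) is a more explicit "permanently removed" signal. On Apache, for example, one hypothetical rule per removed path (the paths are placeholders):

```
# .htaccess - return 410 Gone instead of 404 for removed pages
Redirect gone /old-report.html
Redirect gone /discontinued-product
```

For urgent cases, the URL removal tool in Search Console can hide the pages from results while the index catches up.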