How long will Google take to read my robots.txt after updating?
-
I updated www.egrecia.es/robots.txt two weeks ago, and I still haven't resolved the duplicate title and content issues on the website.
The Google SERP no longer shows those URLs, but neither the SEOmoz Crawl Errors report nor Google Webmaster Tools recognizes the change.
How long will it take?
-
What I mean is, here are the website's logs:
66.249.73.219 - - [21/May/2012:21:50:58 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [21/May/2012:21:53:00 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.83.124 - - [21/May/2012:22:05:33 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [21/May/2012:22:50:58 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [21/May/2012:23:01:31 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.83.124 - - [21/May/2012:23:44:15 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [21/May/2012:23:50:58 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [22/May/2012:00:16:58 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.83.124 - - [22/May/2012:00:46:02 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [22/May/2012:00:50:59 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [22/May/2012:01:24:08 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [22/May/2012:01:51:00 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.83.124 - - [22/May/2012:01:51:17 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [22/May/2012:02:32:28 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [22/May/2012:02:50:59 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.83.124 - - [22/May/2012:02:56:28 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [22/May/2012:03:40:58 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [22/May/2012:03:51:00 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.83.124 - - [22/May/2012:04:01:29 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.21.88.227 - - [22/May/2012:04:38:59 -0700] "GET /robots.txt HTTP/1.1" 304 - "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.206 - - [22/May/2012:04:43:06 -0700] "GET /robots.txt HTTP/1.1" 301 239 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.73.219 - - [22/May/2012:04:51:02 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
-
Thanks Alan. So, to see the log, you check the cached version of the URL?
-
Hello Christian.
It depends on many things.
In my logs, I see four Googlebots today. Each one has fetched robots.txt at roughly hourly intervals.
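If you want to see how often Googlebot is re-reading your robots.txt, you can pull the fetches straight out of your access log. A minimal sketch, assuming an Apache-style log; the log path and sample entries below are made up so the commands are self-contained, so point the grep at your real log instead:

```shell
# Create a small sample access log (in real use, use your server's own log file).
cat > /tmp/access.log <<'EOF'
66.249.73.219 - - [21/May/2012:21:50:58 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Googlebot/2.1"
66.249.73.219 - - [21/May/2012:22:50:58 -0700] "GET /robots.txt HTTP/1.1" 200 435 "-" "Googlebot/2.1"
1.2.3.4 - - [21/May/2012:22:51:00 -0700] "GET /index.html HTTP/1.1" 200 1024 "-" "Mozilla/5.0"
EOF

# Count how many times Googlebot fetched robots.txt.
grep 'GET /robots.txt' /tmp/access.log | grep -c 'Googlebot'   # prints 2
```

The timestamps on the matching lines tell you the re-crawl interval; in the logs above, Googlebot is coming back for robots.txt roughly once an hour.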
Related Questions
-
Google image search
How does Google decide which images show up in the image search section? Is it based on the alt tag of the image, or is Google able to detect what an image is about using neural nets? If it is using neural nets, are the images you put on your website taken into account when ranking a page? Let's say I do walking tours in Italy and put a picture of the Leaning Tower of Pisa as the top image. Will I be penalised because, even though the picture is in Italy, you don't see anyone walking? Thank you,
Intermediate & Advanced SEO | | seoanalytics1 -
Blocking out specific URLs with robots.txt
I've been trying to block out a few URLs using robots.txt, but I can't seem to block the specific one I'm after. Here is an example: I'm trying to block something.com/cats but not something.com/cats-and-dogs. It seems that if I set up my robots.txt with Disallow: /cats, it blocks both URLs. When I crawl the site with Screaming Frog, that Disallow causes both URLs to be blocked. How can I set up my robots.txt to specifically block /cats? I thought my approach would do it, but it doesn't seem to solve the problem. Any help is much appreciated; thanks in advance.
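For what it's worth, Google (and most major search engines, though not every crawler) supports two pattern extensions in robots.txt: * as a wildcard and $ to anchor the end of the URL. Plain Disallow: /cats is a prefix match, which is why it catches /cats-and-dogs too. A sketch of rules for a hypothetical site that block /cats and everything under it while leaving /cats-and-dogs crawlable:

```
User-agent: *
# Matches only a URL ending exactly in /cats ($ anchors the end)...
Disallow: /cats$
# ...plus everything under /cats/, while /cats-and-dogs stays crawlable.
Disallow: /cats/
```

Note that $ is an extension, not part of the original robots.txt standard, so bots that only implement the basic spec will ignore the anchor.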
Intermediate & Advanced SEO | | Whebb0 -
Are links that are disavowed with Google Webmaster Tools removed from the Google Webmaster Profile for the domain?
Hi, two-part question. First, are links that you disavow using Google Webmaster Tools ever removed from the Webmaster Tools profile for the domain? Second, when you upload a file to disavow links, it asks if you'd like to replace the previously uploaded file. Does that mean that if the new file doesn't contain the previously uploaded URLs, those URLs are no longer considered disavowed? In other words, should we download the previous disavow file and append the new URLs to it before uploading, or should we just upload a new file that contains only the new URLs? Thanks
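As far as the file itself goes, a disavow upload replaces the previous file outright rather than appending to it, so the safe workflow is to download the old file, append the new entries, and re-upload the combined list. The format is plain text, one entry per line, with # for comments and a domain: prefix to disavow a whole domain. A sketch (the domains and URLs here are made up):

```
# Link cleanup, May 2012
# Disavow every link from this domain:
domain:spammy-example.com
# Disavow a single page:
http://bad-links-example.org/paid-links.html
```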
Intermediate & Advanced SEO | | bgs0 -
Google disavow tool
I have an algorithmic penalty on one of my websites. I never received a notification of a manual penalty in GWMT, and I even sent in a reconsideration request 6 months ago; they told me there were no manual penalties on the website. I have cleaned up my link profile, and what I could not clean up I submitted through the Google disavow tool a few days ago. I've heard you should just wait if it's algorithmic, but should I send in another reconsideration request now that I've used the disavow tool?
Intermediate & Advanced SEO | | MarkHIggins0 -
Can links indexed by google "link:" be bad? or this is like a good example by google
Can links indexed by Google's "link:" operator be bad, or is this more like a set of good examples shown by Google? We are cleaning up our links after Penguin and don't know what to do with these ones. Some of them don't look like quality links.
Intermediate & Advanced SEO | | bele0 -
What is next from Google Panda and Google Penguin?
Does anyone know what we can expect next from Google Panda/Penguin? We did prepare for this latest update and so far so good.
Intermediate & Advanced SEO | | jjgonza0 -
What is good content for Google?
When we start to study SEO and how Google sees our webpage, one important point is to have good content. But for beginners like me, it's easy to get lost on this. It's not so black and white: what, in your view, is good content? Does the amount of text matter? Is there any trick that all good-content websites need to have?
Intermediate & Advanced SEO | | Naghirniac0 -
Can Google read AJAX?
I'm looking to load a one-page product view instead of 10 pages of pagination. Does Google read AJAX and see all 10 pages as one page?
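For background: at the time of this thread, Googlebot generally did not execute the JavaScript that loads AJAX content, so content fetched client-side could be invisible to the crawler. Google's AJAX crawling scheme let a page opt in with a meta tag, after which Googlebot would re-request the page with an _escaped_fragment_ query parameter and expect a server-rendered HTML snapshot. A sketch of the opt-in tag (serving the snapshot server-side is up to you, and this scheme has since been deprecated by Google):

```html
<!-- Opts this page into Google's AJAX crawling scheme: Googlebot will request
     the page again with ?_escaped_fragment_= and expects an HTML snapshot
     of the fully rendered content at that URL. -->
<meta name="fragment" content="!">
```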
Intermediate & Advanced SEO | | Archers1