When do you use 'Fetch as Google' in Google Webmaster Tools?
-
Hi,
I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or the main URL only?
I've googled it, but that only confused me more. I'd appreciate it if you could help.
Thanks
-
I'd hazard that if the new product had been in the sitemap, it would also have appeared in the SERPs. We submit sitemaps every day, and products are in the index within hours.
I guess the GWMT manual submission is okay if you need to manually fix some pages, but that raises the question of why your SEO efforts couldn't make those pages visible to bots (via link structure or sitemaps).
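For reference, the sitemap route described above just means keeping a sitemap file up to date as products are added. A minimal sketch of one entry (the URL and date are placeholders, not from this thread):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/new-widget</loc>
    <lastmod>2014-05-01</lastmod>
  </url>
</urlset>
```

Regenerating this file whenever products change is what lets the bots pick up new pages without any manual submission.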
-
Thanks Gerd, it's a bit clearer now. Appreciate your help.
-
Thanks Frank, appreciate your help
-
Thank you so much for your reply. It's a bit clearer to me now what to do. Appreciate your help.
-
Sida, what I meant is that I use the Google Webmaster Tools function "Fetch as Google" only as a diagnostic function, to see how my website responds to a request from GoogleBot.
It seems that people fetch URLs via the GWMT "Fetch as Google" and then use the function to submit them to the index. I don't think that's a good idea, as any new content should either be discoverable (via SEO) or be submitted to Google automatically via a sitemap (hinted at in robots.txt).
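To make the "hinted at in robots.txt" part concrete: the sitemap location is advertised with a `Sitemap:` line in the robots.txt file. A minimal sketch, with a placeholder domain:

```
User-agent: *
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Crawlers that fetch robots.txt will see the sitemap reference and can discover new URLs from it on their own schedule.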
-
Thanks Gerd, would you mind clarifying a bit more what a 'diagnostic tool' is? And if you could recommend one by name as well, that would be fantastic.
-
Use it as a "diagnostic tool" to check how content or error pages are retrieved by the bot. I look at it specifically from a content and HTTP-status perspective.
I would not use it to submit URLs - for that you should use a sitemap file instead. Think of "Fetch as Google" as a troubleshooting tool, not as something to submit pages to an index.
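The "content and HTTP-status perspective" above can be roughly approximated outside GWMT as well. A minimal Python sketch (not any Google API - it just requests a page with a Googlebot-style User-Agent and reports the HTTP status; the local test server stands in for a real site):

```python
import http.server
import threading
import urllib.request

# Googlebot's published User-Agent string; sending it only mimics the header,
# it does not make the request come from Google.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_status(url, user_agent=GOOGLEBOT_UA):
    """Request `url` with a crawler User-Agent and return the HTTP status."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Stand-in server on an ephemeral port, serving the current directory.
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

print(fetch_status(f"http://127.0.0.1:{server.server_port}/"))
server.shutdown()
```

A 200 means the page is reachable as a bot would see it; a 404 or 500 here is the kind of problem "Fetch as Google" is useful for catching.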
-
Here's an oh-by-the-way.
One of our manufacturers came out with a product via a slow roll-out literally within the last five days. They have not announced the release to the retailers. I happened to stumble on it while visiting their site to update products.
I did a search of the term and found I wasn't the only one unaware of it, so I scrambled to add the product to the site, promote it, and submit it to the index late Tuesday.
It's Thursday, and it's showing in the SERPs.
Would it have appeared that quickly if I hadn't submitted it via Fetch? I don't know for sure, but I'm inclined to think not. Call me superstitious.
Someone debunk the myth if you can. One less thing for me to do.
-
If I add a lot of products/articles, I just do a sitemap re-submit, but if I only add one product or article, I just wait until the bots crawl to that link. It usually takes a couple of days before it gets indexed. I've never really used Fetch as Google unless I made changes to the structure of the website.
Hope this helps.
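As an aside on the "sitemap re-submit" step: at the time of this thread, Google accepted a simple ping URL that asked it to re-fetch a sitemap. A small sketch of building that URL (the sitemap address is a placeholder; requesting the resulting URL with `urllib.request.urlopen` performs the actual ping):

```python
import urllib.parse

def build_ping_url(sitemap_url):
    """Return Google's sitemap ping URL for the given sitemap address."""
    return ("https://www.google.com/ping?sitemap="
            + urllib.parse.quote(sitemap_url, safe=""))

print(build_ping_url("https://www.example.com/sitemap.xml"))
```

Pinging only invites a re-crawl of the sitemap; when the new URLs actually get indexed still depends on Google's crawl scheduling.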
-
I submit every product and category I add.
Do I have to? No. Is it necessary? No - we have an XML sitemap generator. Google is like Big Brother - he will find you. Fetch is a tool that you can use or not use.
Will Google find it faster and will you show up more quickly in search results if you submit it? I don't know.
-
Thank you AWC, I've read that article already, but I'm not quite sure how often this feature should be used. I think I should be more specific: if you have an ecommerce website and add a product every 2-3 days, would you submit the link every time you add a new item? When you publish a blog article on your website, would you submit it immediately?
-
I think GWT explains it very well.
https://support.google.com/webmasters/answer/158587?hl=en
I typically use it to submit new pages to the index, although it's probably not necessary if you have an XML sitemap. I'm not certain on that one.
More tech-savvy folks probably also use it to check the crawlability and "health" of pages.