Is it dangerous to use "Fetch as Google" too much in Webmaster Tools?
-
I saw some people freaking out about this on some forums and thought I would ask.
Are you aware of any downside to using "Fetch as Google" often? Is it a bad thing to do when you create a new page or blog post, for example?
-
Hi Keri
I did, yes. I stumbled upon it and thought I'd give my two pennies' worth as an SEO!
I certainly wasn't looking for a backlink, as it would be pretty irrelevant for our industry, and I would never expect a dofollow link from a comments section anyway.
Thanks to you also for your feedback
Cheers!
-
Welcome, LoveSavings. Just wanted to make sure you knew this post is a year old, and that all of the links in Q&A are automatically nofollowed. Thanks for the thoughtful answer!
-
Having done lots of tests on this, I would say that fetching as Google is the best way forward.
Although the steps listed above are all excellent ways of boosting the speed at which Google will index your page, none of them seem to be as effective as fetching in Webmaster Tools. You get a few hundred of these a month, so you shouldn't run out unless you are publishing immense amounts of content - in which case Google is likely to be indexing your content very quickly anyway.
www.loveenergysavings.com is still relatively small, although we publish excellent, thought-leadership-style content. So, to ensure that our posts are indexed as quickly as possible (as we are competing with some massive sites), we always fetch our posts in Google Webmaster Tools. This is always quicker than tweeting, Google+, etc. We also have an XML sitemap which automatically adds our posts, though this doesn't guarantee rapid indexing.
Having experimented with all of these methods, fetching as Googlebot is always the quickest and most effective option. As danatanseo says, it's there to be utilised by SEOs, so why not take full advantage? I can't see why Google would ever look unfavourably on a site for wanting its content to be available to the public as quickly as possible.
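If you rely on an XML sitemap that auto-adds new posts, it's worth confirming the post actually landed in the file before you spend a fetch on it. Here's a minimal sketch of that check; the sitemap snippet and the example.com URLs are hypothetical stand-ins (a real check would fetch your live /sitemap.xml over HTTP first):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap contents; in practice you would download your
# site's /sitemap.xml and pass its text in here.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/</loc></url>
  <url><loc>https://www.example.com/blog/new-post/</loc></url>
</urlset>"""

# Sitemap files use the sitemaps.org namespace, so findall() needs it spelled out.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

urls = sitemap_urls(SITEMAP_XML)
print("https://www.example.com/blog/new-post/" in urls)  # True - the new post is listed
```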
-
I would say it is not a preferred way to alert Google when you have a new page, and it is pretty limited. What is better, and frankly more effective, is to do things like:
- add the page to your XML sitemap (make sure sitemap is submitted to Google)
- add the page to your RSS feeds (make sure your RSS is submitted to Google)
- add a link to the page on your home page or other "important" page on your site
- tweet about your new page
- post a status update on Facebook about your new page
- Google Plus your new page
- feature your new page in your email newsletter
Obviously, depending on the page you may not be able to do all of these, but normally, Google will pick up new pages in your sitemap. I find that G hits my sitemaps almost daily (your mileage may vary).
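Since the sitemap is doing most of the work here, you can also nudge Google to re-read it right after publishing by hitting the sitemap ping endpoint. A quick sketch of building that ping request follows; the sitemap URL is a hypothetical placeholder, and the code only constructs the URL (actually firing it is one `urllib.request.urlopen()` call):

```python
from urllib.parse import urlencode

# Hypothetical sitemap location - swap in your own.
SITEMAP_URL = "https://www.example.com/sitemap.xml"

def google_ping_url(sitemap_url: str) -> str:
    """Build the GET URL that tells Google a sitemap has been updated."""
    # The sitemap URL must be percent-encoded when passed as a query parameter.
    return "https://www.google.com/ping?" + urlencode({"sitemap": sitemap_url})

print(google_ping_url(SITEMAP_URL))
# https://www.google.com/ping?sitemap=https%3A%2F%2Fwww.example.com%2Fsitemap.xml
```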
I only use fetch if I am trying to diagnose a problem on a specific page, and even then, I may just fetch but not submit. I have only submitted when there was some major issue with a page that I could not wait for Google to update as part of its regular crawl of my site. As an example, we had a release go out with a new section, and that section was blocked by our robots.txt. I went ahead and submitted the robots.txt to encourage Google to update it sooner, so that our new section would be "live" to Google sooner, as G does not hit our robots.txt as often. Otherwise, for 99.5% of the other pages on my sites, the options above work well.
The other thing is that you get very few fetches a month, so you are still very limited in what you can do. Your sitemaps can include thousands of pages each. Google fetch is limited, so another reason I reserve it for my time sensitive emergencies.
-
https://support.google.com/webmasters/answer/158587?hl=en#158587
I just double-checked, David, and it looks like the allocation may not be different for different sites. According to Google, you get 500 fetches and 10 URL + Linked Pages submissions every week.
-
You are welcome David, and no this isn't a lifetime limit at all. I believe it resets at least once every 30 days, maybe more often than that. I manage four different sites, some large, some small and I've never run out of fetches yet.
-
Thanks Dana. Is it possible to get more fetches? Presumably it's not a lifetime limit, right?
-
No, I wouldn't worry about this at all. This is why Google has already allocated a finite number of "Fetches" and URL + Links submissions to your account. These numbers are based on the size of your site. Larger sites are allocated more and smaller sites less. [Please see my revised statement below regarding Google's "Fetch" limits - it isn't based on site size] I don't think enough Webmasters take advantage of the Fetch as often as they should.
Hope that helps!
Dana