Is there a way to get Google to index our site quicker?
-
I have updated some pages on a website. Is there a way to get Google to index those pages quicker?
-
Fetching as Googlebot and ensuring the page is visible elsewhere on the web (marketing!) are the best ways to spur quicker crawling and indexing, as others have said. If you notice the cache date of the pages not updating and changes not making it into Google's index, it would be time to check for larger issues that might be preventing or dissuading Google from reaching the site more regularly.
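Alongside fetching, resubmitting your XML sitemap is another way to nudge Googlebot toward updated URLs: Google accepts a simple "ping" GET request with the sitemap address as a query parameter. A minimal sketch in Python (example.com is a placeholder; the ping is a hint to recrawl, not a guarantee):

```python
from urllib.parse import quote


def sitemap_ping_url(sitemap_url):
    """Build the Google ping URL that asks for a sitemap refetch."""
    # The sitemap address must be URL-encoded, since it is itself a URL
    # being passed as a query-string value.
    return "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")


ping = sitemap_ping_url("https://example.com/sitemap.xml")
# An HTTP GET to `ping` (e.g. with urllib.request.urlopen) requests that
# Google refetch the sitemap and discover the updated URLs listed in it.
```

Like Fetch as Google, this only requests a recrawl; how quickly Googlebot actually returns still depends on the site's overall crawl health.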
-
Please also be aware that you can only do a fetch 10 times a month, so make them count! Only do it when you must.
-
Google Plus is a great way. There was a study done by Stone Temple Consulting:
http://www.stonetemple.com/measuring-google-plus-impact-on-search-rankings/
It concluded that there was no direct impact on ranking, but here was the interesting part: Googlebot visited a page within 6 minutes of the page being shared on Google Plus.
All of the other points about fetching in GWT, etc., are valid as well; it was just interesting to me how quickly Googlebot reacts to Google Plus.
Cheers!
-
I would be sure to share the page on Google Plus. Since you can't otherwise control crawl frequency, make sure your site is well optimized so that Googlebot doesn't have problems crawling it: check the page speed, fix any HTML errors, and correct any missing URLs and broken links.
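The broken-link check in the answer above can be automated with a small script. A minimal sketch in Python using only the standard library (`find_broken_links` and the URLs are illustrative; a dedicated crawler such as Screaming Frog does this far more thoroughly):

```python
from html.parser import HTMLParser
from urllib.error import HTTPError, URLError
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collect the href target of every <a> tag on a page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative hrefs against the page URL.
                    self.links.append(urljoin(self.base_url, value))


def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links


def find_broken_links(page_url):
    """Fetch a page and return the links on it that fail to load."""
    html = urlopen(page_url).read().decode("utf-8", errors="replace")
    broken = []
    for link in extract_links(html, page_url):
        try:
            urlopen(link)
        except (HTTPError, URLError):
            broken.append(link)
    return broken
```

Checking each link with a separate request is slow; the point is only to illustrate auditing internal links for errors before worrying about crawl frequency.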
-
Fetch as Google works well. Alternatively, you can also post on Twitter, and the page will get crawled from there, depending on how popular your account is.
-
I agree. I would definitely fetch in WMT, or you could update your content or post a blog to get Google to recrawl.
-
This is a little more specific:
You can get there by going to GWT, clicking on your site, then, on the left, clicking "Crawl" and then "Fetch as Google." Enter the URL you want indexed and hit "Fetch." You can then choose between fetching just the page or the page and all the pages linked to it. That's pretty much up to you, but if you don't use the tool all that often, you might as well pick the "page and the pages linked to it" option.
Sometimes you'll get a weird error message, but that's (most likely) not your fault. I've had it happen every now and then. I just try again a few times, and it usually works. If not, just try again in a few hours.
Hope this helps,
Ruben
-
Yes. You can use Fetch as Google in Webmaster Tools. It's more of a request than a demand. However, it has worked for me in the past, and Google has indexed my pages faster when I used it.
- Ruben
Related Questions
-
404s in Google Search Console and javascript
At the end of April, we made the switch from http to https, and I was prepared for a surge in crawl errors while Google sorted out our site. However, I wasn't prepared for the surge in impossibly incorrect URLs and partial URLs that I've seen since then. I have learned that as Googlebot grows up, he's now attempting to read more javascript and will occasionally try to parse out and "read" a URL in a string of javascript code where no URL is actually present. So, I've "marked as fixed" hundreds of bits like /TRo39, category/cig, etc., etc. But they are also returning hundreds of otherwise correct URLs with a .html extension when our CMS generates URLs with a .uts extension, like this:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.html
when it should be:
https://www.thompsoncigar.com/thumbnail/CIGARS/90-RATED-CIGARS/FULL-CIGARS/9012/c/9007/pc/8335.uts
Worst of all, when I look at them in GSC and check the "linked from" tab, it shows they are linked from themselves, so I can't backtrack and find a common source of the error. Is anyone else experiencing this? Got any suggestions on how to stop it from happening in the future? Last month it was 50 URLs, this month 150, so I can't keep creating redirects and hoping it goes away. Thanks for any and all suggestions!
Liz Micik
Algorithm Updates | LizMicik
-
Where has Google found the £1.00 value for the penny black? Is Google moving beyond the mark-ups too?
Hi guys, I am curious, so I am wondering something about the Penny Black SERPs. Apparently Google shows a value of £1.00 (Penny Black SERP). Where does it come from? It's not the stamp's value (Penny Black Value SERP), and the Wikipedia page hasn't any mark-up about it; actually, it has the price value mark-up of 1 penny (Penny Black Wiki Markup). Among the rare stamps, the Inverted Jenny also shows a value (Inverted Jenny SERP), but that is clearly taken from USPS and is the cost of a new version of this rare stamp (USPS Inverted Jenny); indeed, the mark-up matches that value (USPS Inverted Jenny Mark-up). I've been looking online for a new version of the Penny Black, but couldn't find anything.
The only small piece of information that I've found to correlate one pound with the Penny Black is on the Wikipedia page, but the point is: is Google able to extract that information from that piece? It's not a mark-up, it's not a bare number, and mostly it's not a simple sentence like "The Penny Black cost was £1.00". It reads "One full sheet cost 240 pennies or one pound sterling" (Penny Black Wikipedia particular). Is Google moving beyond the mark-ups too? Thanks, Pierpaolo
Algorithm Updates | madcow78
-
Could Retail Price Be A Google Ranking Factor???
I have not done any detailed studies on this but it seems that Google might be using low retail prices for specific items as a ranking factor in their organic SERPs. Does anyone else suspect this? Just askin' to hear your thoughts. Thanks!
Algorithm Updates | EGOL
-
Ranking Drop After Switching Sites
I have a client whose rankings dropped after switching to our site. We know that rankings can drop a little after a switch, but we are concerned that hers are still low. Any suggestions? As far as I can tell, the links to her site remained the same. Thanks, Holly
Algorithm Updates | hwade
-
How do I separate 2 Google+ business listings?
Ever since Google Places started merging with Google+, my client's business listing is now showing up in local search results incorrectly under another business name who shares the same address as them. Has anyone else encountered this problem or a way to correct it?
Algorithm Updates | TheeDigital
-
Has there been a Google change in the last 24 hours?
We have come in this morning to find our site (paydayuk.co.uk) has suddenly disappeared from Google's SERPs. We have consistently been ranking in the top 5 for a wide range of search terms, but now do not even appear for our brand name of Payday UK, where we have been first for many months. Our site is still indexed, and we have made no changes for a while, as any SEO work is waiting on completion of a CMS system. Looking in https://groups.google.com/a/googleproductforums.com/forum/#!categories/webmasters/crawling-indexing--ranking there seem to be a lot of people having the same issues, but as of yet no answers. I'd also like to add that we don't use black-hat techniques, so we really don't understand why we have been penalised. Can anyone help, please?
Algorithm Updates | Sarbs
-
Does google have the worst site usability?
Google tells us to make our sites better for our readers, which we are doing, but do you think Google has horrible site usability? For example, in Webmaster Tools I'm always being confused by their changes and the way they just drop things. In the HTML suggestions area, they don't tell you when the data was last updated, so the only way to tell is to download the files and check. In the URL removals, they used to show you the URLs they had removed; now that is gone, and the only way you can check is to try adding one. We don't have any URL parameters, so any parameters are a result of some other site tacking stuff onto the end of our URL, and there is no way to tell them that we don't have any parameters, so ignore them all. Also, they add newly found parameters at the end of the list, so the only way to check is to click through to the end of the list.
Algorithm Updates | loopyal
-
How do I get the expanded results in a Google search?
I notice for certain sites (ex: mint.com) that when I search, the top result has a very detailed view with options to click through to different subsections of the site. However, for my site, even though we're consistently the top result for our branded terms, the result is still only a single line item. How do I adjust this?
Algorithm Updates | syount