How long does it take for an article or a page to be listed by Google?
-
Hi, my question is a two-parter. I think I must be doing something wrong.
My sitemap is set to show the different sections of my site, while on my old site the sitemap listed every single article. I am not sure if setting it to each section is correct; can someone please advise me on this?
The second part of the question is: how long does it take for an article to be listed by Google?
This article on my site was written today: http://www.in2town.co.uk/lifestyle/holidaymakers-ignore-the-importance-of-travel-insurance-according-to-survey
Holidaymakers Ignore The Importance of Travel Insurance According To Survey
But when I check to see if Google has listed the article yet by searching for the whole title, it does not come up. I even added the website name at the end and still it did not come up.
This is worrying me a bit, as a lot of my articles are news stories, which means they are current articles. If Google is not picking them up, then no one else will be.
Can anyone let me know what I should be doing so Google picks them up quicker, please?
-
If you add new content every day, you will start to get crawled every day.
-
The huge problem I have is getting the news pages picked up straight away; this has been a big headache of mine. There is no point in a news page being read in two days, when it is old news.
I need to find a way to promote the latest news on my site and get it picked up by Google.
-
Bing's Duane Forrester said that you should not list every page, only the important pages, but when I asked him about this he said that for a small site it is OK to list every page.
A sitemap does not mean that the pages it lists will be indexed, nor does it mean that pages not included won't be indexed. It is a chance to give the search engines some info about the pages, like change frequency, last modified, priority and such. It is also a signal of the canonical version of a page.
It is also worth noting that Bing will ignore a sitemap if it is not honest; if you mark pages as updated daily but don't actually update them, they will lose trust in it.
As for how long it takes to get listed: anywhere up to a month in most cases. In Bing Webmaster Tools you can place a URL directly into the index and it will be in the results shortly after. I have been told you can do the same in GWMT using Instant Previews or Fetch as Googlebot (I can't remember which).
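To make the hints mentioned above concrete, here is a minimal sketch of a sitemap `<url>` entry carrying the optional `lastmod`, `changefreq` and `priority` fields. The URL and values are made up for illustration; only `loc` is required by the sitemap protocol.

```python
# Build a one-entry sitemap.xml with the optional hint fields.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "http://www.example.com/news/some-article"  # required
ET.SubElement(url, "lastmod").text = "2012-06-14"   # when the page last changed
ET.SubElement(url, "changefreq").text = "daily"     # only honest values keep the engines' trust
ET.SubElement(url, "priority").text = "0.8"         # relative to your own other pages

xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)
```

As the answer notes, these fields are hints, not commands: claiming `daily` on pages that never change is exactly the dishonesty that gets the whole file ignored.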
-
I'm not a Joomla expert, so your best bet is to check with someone who is. However, there are Joomla extensions you can use to automate the generation of your sitemap so you don't have to do it manually every time.
Which one to use is something I'm not prepared to recommend, because I am not up to speed enough on Joomla.
-
Hi Alan, this is great. Can you explain more? I use Joomla, so I'm not sure how to really set up the sitemap.
This is the sitemap I am using:
http://www.in2town.co.uk/sitemap-xml?sitemap=1
Can you explain what I need to do to make sure that all articles are included, and should I put the sitemap on my site or leave it in Google Webmaster?
-
Diane,
A sitemap.xml file should include links to every page on the site you want indexed. While Google and Bing are fairly good at discovering content, this helps ensure they find pages sooner than their crawler might otherwise get around to them. (Unless you have a site with more than 10,000 URLs, at which point you should consider splitting the sitemap into multiple files and including a separate sitemap index file that you then submit.)
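The splitting logic for a large site can be sketched like this: chunk the URL list into files of 10,000 entries each, then list those files in a sitemap index. The domain and file names here are assumptions for illustration, not a real site.

```python
# Chunk a large URL list into sitemap files and list them in an index.
CHUNK = 10_000

urls = [f"http://www.example.com/page-{i}" for i in range(25_000)]  # pretend site

# Split into groups of at most CHUNK URLs, one group per sitemap file.
chunks = [urls[i:i + CHUNK] for i in range(0, len(urls), CHUNK)]

# The sitemap index file would then reference one URL per chunk file.
index_entries = [
    f"http://www.example.com/sitemap-{n}.xml" for n in range(1, len(chunks) + 1)
]
print(len(chunks), index_entries)
```

You submit only the index file; the engines follow it to each child sitemap.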
That then leads to the next question - how often? Every site is different and crawled at a different frequency based on Google's assessment of how often it should happen as well as factoring in that their system can only crawl so many pages on any given day.
That alone is reason to include all your content in sitemap files - and automatically ping search engines each time the sitemap file is updated.
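The "ping on update" step is just a GET request to the engines' ping endpoints with your sitemap URL percent-encoded as a parameter. A sketch, using the sitemap URL from this thread; the `urlopen` call is commented out so the snippet doesn't hit the network:

```python
# Build the ping URLs Google and Bing have accepted for sitemap updates.
from urllib.parse import quote
# from urllib.request import urlopen

SITEMAP = "http://www.in2town.co.uk/sitemap-xml?sitemap=1"

ping_urls = [
    "http://www.google.com/ping?sitemap=" + quote(SITEMAP, safe=""),
    "http://www.bing.com/ping?sitemap=" + quote(SITEMAP, safe=""),
]

for ping in ping_urls:
    print(ping)
    # urlopen(ping)  # a 200 response means the ping was accepted
```

A sitemap generator extension worth its salt will do this automatically every time it rewrites the file.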
If you have enough "news quality" content, look into a separate news sitemap file as well. With the right footwork and leverage, you can then see if your news specific content can be indexed even faster, and included in the Google news system as well.
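A news sitemap entry looks like a regular one plus a `news:news` block in the Google News namespace, carrying the publication name, language, publication date, and title. Here is a hedged sketch using the article from this thread; the publication name "In2town" is my assumption.

```python
# Build a one-entry Google News sitemap for the article in the question.
import xml.etree.ElementTree as ET

ET.register_namespace("news", "http://www.google.com/schemas/sitemap-news/0.9")
NEWS = "{http://www.google.com/schemas/sitemap-news/0.9}"

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = (
    "http://www.in2town.co.uk/lifestyle/holidaymakers-ignore-the-importance-"
    "of-travel-insurance-according-to-survey"
)
news = ET.SubElement(url, NEWS + "news")
pub = ET.SubElement(news, NEWS + "publication")
ET.SubElement(pub, NEWS + "name").text = "In2town"  # assumed publication name
ET.SubElement(pub, NEWS + "language").text = "en"
ET.SubElement(news, NEWS + "publication_date").text = "2012-06-14"  # example date
ET.SubElement(news, NEWS + "title").text = (
    "Holidaymakers Ignore The Importance of Travel Insurance According To Survey"
)

news_xml = ET.tostring(urlset, encoding="unicode")
print(news_xml)
```

News sitemaps should contain only recent articles, which is why they are usually kept separate from the main sitemap and regenerated on every publish.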