How often should we re-submit the sitemap?
-
Hello,
My question is: how often do we need to re-submit our sitemap in Google Webmaster Tools?
We are using PrestaShop and we keep adding new products to our site. We have a plugin to generate the sitemap from our back end.
Is it necessary to log in to Google Webmaster Tools every day and re-submit our sitemap to Google?
-
I do agree with you. As I said before, there are tons of tools available to do this process automatically. I will try one of the most popular add-ons from the PrestaShop Addons site and see how it goes.
-
Thank you Ravi. I understood the question and the answer is correct.
I'll share one example. I operate a site which runs on XenForo forum software. There is an add-on extension called XenUtiles which offers sitemap functionality. There is an admin panel for the extension where I simply check one box to submit to Bing and another box to submit to Google. There is another tab where I set the frequency and time of submissions. It all happens automatically and without logging in to either Bing or Google.
If you pick any software package such as Joomla, for example, there are likely dozens of extensions which provide this functionality. There are many more which can be applied to any site.
-
I guess his question is how we can automatically submit sitemaps to both Google and Bing without needing to log into WMT.
-
There are literally hundreds of software packages which generate sitemaps. You would need to check the instructions for whatever software you use to generate the sitemap.
-
When I look at your sitemap I see mostly web page URLs with some image URLs as well. If you wish to exclude images from your sitemap, you should check the sitemap generating software for instructions on making that adjustment.
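For reference, image URLs normally sit inside a page's `<url>` entry via Google's image sitemap extension namespace rather than as standalone entries. A minimal sketch of what such an entry looks like (the URLs here are hypothetical, not taken from your actual sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- the web page URL itself -->
    <loc>http://www.example.com/product-page.html</loc>
    <!-- optional image entries attached to that page -->
    <image:image>
      <image:loc>http://www.example.com/images/product.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

If your generator emits the `image:image` blocks and you want a pages-only sitemap, look for a setting that omits the image namespace, since the page `<loc>` entries are what stay behind once the image extension is dropped.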
-
"You can automatically submit sitemaps to both Google and Bing without the need to log into WMT."
Would you please clarify how to do that?
-
Thanks for the reply.
I guess there is a module in the PrestaShop modules directory which creates the sitemap every day at midnight and submits it automatically to WMT; I will have a look at that.
But as I asked in my other thread, the sitemap type showing in WMT is "images" and I don't know how to change that to URLs.
This is our sitemap: www.digitalcompring.com/sitemap.xml
Thanks
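One way to see why WMT would label a sitemap as "images" is to count how many entries are plain page URLs versus attached image entries. A minimal sketch using Python's standard library (the sample XML below is hypothetical, not the actual digitalcompring.com sitemap):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

def count_entries(sitemap_xml: str) -> tuple[int, int]:
    """Return (page_url_count, image_count) for a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    ns = {"sm": SITEMAP_NS, "image": IMAGE_NS}
    pages = root.findall("sm:url", ns)          # top-level page entries
    images = root.findall(".//image:image", ns)  # image entries nested anywhere
    return len(pages), len(images)

# Hypothetical sitemap: one page with two attached images
sample = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="{SITEMAP_NS}" xmlns:image="{IMAGE_NS}">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <image:image><image:loc>http://www.example.com/a.jpg</image:loc></image:image>
    <image:image><image:loc>http://www.example.com/b.jpg</image:loc></image:image>
  </url>
</urlset>"""

print(count_entries(sample))  # (1, 2)
```

If the image count dwarfs the page count, the generator is attaching every product image to the sitemap, which is usually a module setting rather than anything you change inside WMT.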
-
Sitemaps are not required at all on a site with solid navigation. With that understood, an HTML sitemap can be useful for users and it is easy to automate both the HTML sitemap generation and the XML submission process.
In a perfect world, you can set your site to automatically submit a new sitemap whenever a new product or page is added. Just be sure not to go completely overboard: you do not want to submit more than one sitemap per hour.
If your developer is not able to generate updated sitemaps automatically based on content, then you can submit a daily sitemap update as well. You can automatically submit sitemaps to both Google and Bing without the need to log into WMT.
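For what it's worth, the login-free mechanism both engines offered at the time of this thread was a plain HTTP "ping" endpoint: a GET request with the sitemap URL as a query parameter, which any cron job or module can fire after regenerating the file. (Google has since deprecated its ping endpoint, so treat these URLs as historical and check the current documentation; the `Sitemap:` line in robots.txt remains the evergreen, submission-free alternative.) A minimal sketch of building such a ping URL, using a hypothetical sitemap address:

```python
from urllib.parse import urlencode

def build_ping_url(engine_endpoint: str, sitemap_url: str) -> str:
    """Build the GET URL that notifies a search engine of a sitemap update."""
    return engine_endpoint + "?" + urlencode({"sitemap": sitemap_url})

# Historical ping endpoints; verify against current docs before relying on them.
GOOGLE_PING = "https://www.google.com/ping"
BING_PING = "https://www.bing.com/ping"

sitemap = "http://www.example.com/sitemap.xml"
print(build_ping_url(GOOGLE_PING, sitemap))
# https://www.google.com/ping?sitemap=http%3A%2F%2Fwww.example.com%2Fsitemap.xml
```

Actually sending the request is one `urllib.request.urlopen(ping_url)` call; most sitemap modules do exactly this under the hood, which is why no WMT login is needed. Alternatively, adding `Sitemap: http://www.example.com/sitemap.xml` to robots.txt lets crawlers discover the file with no submission at all.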