How do I organize an XML sitemap for Google Webmaster Tools?
-
OK, so I used an XML sitemap generator tool, xml-sitemaps.com, for Google Webmaster Tools submission. The problem is that the priorities are all out of whack.
How on earth do I organize it with thousands of pages?
Should I be spending hours organizing it?
-
I'm sure there are some people who do prioritize their sitemaps manually, but not me. I don't think the priority setting is THAT important!
-
Part of it is our blog; the other part isn't in a CMS, just 30+ static pages. Do people organize it manually? Is it really worth it?
-
I'm assuming that you have a CMS? Maybe you can program your CMS to generate the sitemap and automatically assign priorities. If not, don't worry about it - the priorities only apply to crawl priority, not rankings.
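To illustrate the answer above, a CMS hook that emits the sitemap and assigns priorities automatically might look like this sketch. It's in Python rather than any particular CMS's language, and the depth-based priority rule and the URLs are invented for illustration, not a recommended scheme:

```python
# Minimal sketch: generate a sitemap.xml string with automatic priorities.
# The URL list and the depth-based priority rule are hypothetical examples.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """urls: iterable of (loc, priority) pairs. Returns sitemap XML as a string."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, priority in urls:
        lines.append('  <url><loc>%s</loc><priority>%.1f</priority></url>'
                     % (escape(loc), priority))
    lines.append('</urlset>')
    return '\n'.join(lines)

def auto_priority(path):
    """Assign priority by URL depth: homepage highest, deeper pages lower."""
    depth = path.rstrip('/').count('/')
    return max(0.1, 1.0 - 0.2 * depth)

pages = ['/', '/blog/', '/blog/some-post/']
sitemap = build_sitemap(
    ('https://www.example.com' + p, auto_priority(p)) for p in pages)
print(sitemap)
```

A rule like this takes seconds per build, which is the point: once generation is automated, there is no reason to spend hours hand-tuning priorities.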
Related Questions
-
Sudden data loss of links in Webmaster Tools & search clicks decreased to zero
Hi guys, please help me solve this Webmaster Tools issue and the drop in clicks and impressions. The issue: I have managed the Hills Self Storage site for a long time, but recently (for the past few weeks) I noticed in Webmaster Tools a constant reduction in clicks, which eventually reached zero and has been stuck there since; impressions started decreasing over the same period and gradually reached zero too. What I have tested: Analytics traffic is increasing gradually, and keyword rankings in Google Australia are also improving gradually. Before October 2014 total links were 200+; in mid-October 2014 they started decreasing gradually, and now it shows "No Data Available". The 301 redirection is correct, and Google Fetch works fine too. What changes we made: we moved the site to a new server in October 2014, and we switched our pages from "http" to "https" in October 2014. Kindly advise how I can get back my "Total Links", correct impressions, and clicks in Webmaster Tools as before. Regards, Dave
Technical SEO | akshaydesai
-
Will an XML sitemap override a robots.txt file?
I have a client whose robots.txt file is blocking an entire subdomain, entirely by accident. Their original solution, not realizing the robots.txt error, was to submit an XML sitemap to get their pages indexed. I did not think this tactic would work, as the robots.txt would take precedence over the XML sitemap. But it worked... I have no explanation as to how or why. Does anyone have an answer to this, or any experience with a website that has had a clear Disallow: / for months, yet somehow has pages in the index?
Technical SEO | KCBackofen
-
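As a side note on the question above: you can check exactly what a Disallow rule blocks with Python's built-in robots.txt parser. The rules below are a minimal made-up example mirroring the question, and one likely explanation for the observed behavior is that robots.txt blocks crawling, not indexing, so a URL discovered via a sitemap or external links can still appear in the index without its content ever being fetched:

```python
# Sanity-check what a robots.txt rule blocks, using the stdlib parser.
# The rules here are a hypothetical example mirroring the question.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# Crawling is blocked for every path on the (sub)domain...
print(rp.can_fetch("Googlebot", "https://sub.example.com/page"))  # False

# ...but robots.txt only governs crawling, not indexing: a URL that Google
# discovers via an XML sitemap or external links can still show up in the
# index (often with no snippet) even though its content was never fetched.
```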
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is this happens outside the sitemap. If Google already has the old URL indexed, the next time it crawls that URL, Googlebot discovers the 301 redirect and that starts the process of URL value transfer. I guess my question revolves around whether removing the old URL (or the timing of the removal) from the sitemap can impact Googlebot's transfer of the old URL value to the new URL.
Technical SEO | RyanOD
-
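The cleanup step the question above describes, swapping redirected URLs for their targets in the sitemap, can be sketched like this. The URL list and the redirect map are made-up examples, and in practice the redirect map would come from your server config or CMS:

```python
# Sketch: replace 301-redirected URLs in a sitemap URL list with their
# targets, so the sitemap only lists final destinations.
def clean_sitemap_urls(urls, redirects):
    """urls: iterable of sitemap URLs; redirects: {old_url: new_url}.
    Returns a de-duplicated list with old URLs replaced by their targets."""
    cleaned = []
    for url in urls:
        # Follow chained redirects (a -> b -> c) to the final target,
        # guarding against redirect loops with a seen-set.
        seen = set()
        while url in redirects and url not in seen:
            seen.add(url)
            url = redirects[url]
        if url not in cleaned:
            cleaned.append(url)
    return cleaned

urls = ["https://example.com/old-page", "https://example.com/about"]
redirects = {"https://example.com/old-page": "https://example.com/new-page"}
print(clean_sitemap_urls(urls, redirects))
# → ['https://example.com/new-page', 'https://example.com/about']
```

This only keeps the sitemap tidy; as the question suspects, the value transfer itself happens when Googlebot recrawls the old URL and follows the 301, independent of the sitemap.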
DNS error in Webmaster Tools
Google Webmaster Tools is showing a DNS error, and that is leading to many server errors (502, 500), 50+ in every crawl. Recently Google crawled one of our subdomains that we did not want Google to crawl. We blocked it via robots.txt and also removed all the URLs, and since then we have been having this issue. Any suggestions on how to fix this DNS error? Thanks in advance.
Technical SEO | tpt.com
-
Does Google Read JavaScript?
I would like to include a list of links in a select-type box which I would like Google to follow. In order to do this, I will style it with the help of JavaScript, turning the select box into a ul and the options into li's. Each li would contain a link, but if JavaScript is disabled it will fall back to a normal CSS-styled select box. My question is: would Google follow the links made by the JavaScript, or would the bot just recognize the select box as a select box and not links? Thanks for any help!
Technical SEO | BrianJenkins
-
Recent Webmaster Tools Glitch Impacting Site Quality?
The ramifications of this would not be specific to me but to anyone with this type of content on their pages... Maybe someone can chime in here, but I'm not sure how much, if at all, site errors (for example 404 errors) as reported by Google Webmaster Tools are seen as a factor in site quality, which would impact SEO rankings. Any insight on that alone would be appreciated. I've noticed some weird new behavior in the WMT 404 error reports. It seems as though their engine is finding objects within the source code of a page that are NOT links but merely look like URLs, then trying to crawl them and reporting them as broken. I've seen a couple of different cases in my environment that seem to trigger this issue. The easiest one to explain is Google Analytics virtual pageview JavaScript calls, where for example you might send a virtual pageview back to GA for clicks on outbound links. So in the source code of your page you would have something like: onclick="_gaq.push(['_trackPageview', '/outboundclick/www.othersite.com']);" Although this is obviously not a crawlable link, sure enough Webmaster Tools now reports the following page as a broken 404: www.mysite.com/outboundclick/www.othersite.com I've seen other cases of things that look like URLs but are not actual links being pulled out of the page source and reported as broken links. Has anyone else noticed this? Do 404 instances (in this case false ones) reported by Webmaster Tools impact site quality rankings and SEO? Interesting issue here; I'm looking forward to hearing some people's thoughts on this. Chris
Technical SEO | cbubinas
-
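A rough illustration of the misparse the question above describes: a crawler using a loose "anything quoted that looks like a path" pattern will pull URL-like strings out of inline JavaScript that were never links. The page snippet and the regex are illustrative only, not Googlebot's actual extraction logic:

```python
# Demonstrate how a loose URL-extraction pass turns a GA onclick argument
# into a phantom crawl candidate. Snippet and pattern are hypothetical.
import re

html = """
<a href="/real-page">Real link</a>
<a onclick="_gaq.push(['_trackPageview', '/outboundclick/www.othersite.com']);">out</a>
"""

# Any quoted string starting with "/" looks like a site-relative path.
url_like = re.findall(r"['\"](/[^'\"\s]+)['\"]", html)
print(url_like)  # ['/real-page', '/outboundclick/www.othersite.com']
```

The second "URL" is just a string argument to `_gaq.push`, but an extractor like this cannot tell it apart from a real path, which matches the false 404s being reported.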
Sitemap Creation
Hi, I am looking for the best way to generate an XML sitemap for Webmaster Tools for my website http://www.cheapfindergames.com. I have come across http://www.xml-sitemaps.com/, but it only allows up to 500 links. Is there a PHP script that any experts could share that would create the XML map for me to upload, please? Many thanks
Technical SEO | ocelot
-
Duplicate XML sitemaps - 404 or leave alone?
We switched over from our standard XML sitemap to a sitemap index. Our old sitemap was called sitemap.xml and the new one is sitemapindex.xml. In Webmaster Tools it still shows the old sitemap.xml as valid. Also, when you land on sitemap.xml it displays the sitemap index, when really the index lives at sitemapindex.xml. You can see the sitemap at both URLs because of a setting in the sitemap plugin. So the question is: should we change the plugin setting to let the old sitemap.xml 404, or should we allow the new sitemap index to be accessible at both URLs?
Technical SEO | Hakkasan
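For background, the sitemap-index arrangement the question above describes can be sketched like this. The filenames and URLs are hypothetical; the per-file cap of 50,000 URLs comes from the sitemaps.org protocol and is the usual reason a site moves to an index in the first place:

```python
# Sketch: split a large URL set into child sitemaps plus one sitemap index.
# Filenames and URLs are hypothetical examples.
from xml.sax.saxutils import escape

def chunk(urls, size=50000):
    """Yield successive slices of at most `size` URLs (the protocol limit)."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def sitemap_index(child_urls):
    """Build the sitemapindex XML pointing at each child sitemap file."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in child_urls:
        lines.append('  <sitemap><loc>%s</loc></sitemap>' % escape(url))
    lines.append('</sitemapindex>')
    return '\n'.join(lines)

urls = ['https://example.com/page-%d' % i for i in range(120000)]
children = ['https://example.com/sitemap-%d.xml' % n
            for n, _ in enumerate(chunk(urls), start=1)]
index_xml = sitemap_index(children)
print(children)  # three child sitemaps cover 120,000 URLs
```

Whichever URL ends up serving this index, the safest setup is one canonical location per file, which is an argument for letting the duplicate URL 404 (or 301 to the canonical one) rather than serving the same index twice.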