Sitemap Issue - vol 2
-
Hello everyone!
I validated the sitemap with different tools (w3Schools, and so on..) and no errors were found. So I uploaded into my site, tested it through GWT and BANG! all of a sudden there is a parsing error, which correspond to the last, and I mean last piece of code of thousand of lines, .
I don't know why it isn't reading the code and it's giving me this as there are no other errors and I haven't got a clue about what to do in order to fix it!
Thanks
-
Looks fine to me. I ran it through several XML sitemap validators and they all came back as valid.
The only change that would make sense is to update the urlset line to:
<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:image="http://www.google.com/schemas/sitemap-image/1.1" xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd" xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> (taken from Yoast)
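For reference, a minimal well-formed image sitemap using that declaration would look like the following (the example.com URLs are placeholders, not taken from the original question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd"
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page/</loc>
    <image:image>
      <image:loc>http://www.example.com/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

If the parser chokes on the very last lines of the file, it is worth checking that the closing </urlset> tag is present and that the file is not truncated or followed by stray bytes.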
-
Please share the URL to your sitemap (or copy/paste it to http://pastebin.com)
Related Questions
-
Possible issues with 301 redirecting to a new domain name
I've got a current domain, and after a bit of a rebrand I'm considering 301 redirecting the current site to a newly purchased domain. I'd redirect each page to the identical page on the new domain. Am I likely to see any issues? I know this is the recommended way from Google, but I'm just wondering how smoothly it works and whether I'm likely to see any ranking drops or other problems.
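As a sketch of the page-to-page redirect described above, assuming an Apache server with mod_rewrite enabled (the domain names are placeholders), the old domain's .htaccess might contain:

```apache
# .htaccess on the old domain: 301 every path to the identical path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

Because $1 carries the original path through, every page redirects to its identical counterpart rather than to the new homepage.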
Intermediate & Advanced SEO | | paulfoz16090 -
Having issues crawling a website
We tried using the Screaming Frog tool to crawl this website and get a list of all meta titles from the site; however, it returned only one result - the homepage. We then tried to obtain a list of the site's URLs by creating a sitemap with https://www.xml-sitemaps.com/. Once again, however, we just got the one result - the homepage. Something seems to be restricting these tools from crawling all pages. If anyone can shed some light on what this could be, we'd be most appreciative.
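One common culprit is a blanket Disallow in robots.txt, which both Screaming Frog and xml-sitemaps.com respect by default. As a quick sketch (the rules and URLs below are hypothetical, not taken from the site in question), Python's standard library can show whether a given robots.txt would block crawling:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks all crawlers from every path --
# a setup that would leave these tools with exactly one result.
robots_txt = [
    "User-agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(robots_txt)

print(rp.can_fetch("*", "https://www.example.com/"))            # -> False
print(rp.can_fetch("*", "https://www.example.com/some-page/"))  # -> False
```

Other possibilities worth ruling out: a site-wide meta robots noindex/nofollow tag, navigation rendered only by JavaScript, or the server blocking crawler user agents.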
Intermediate & Advanced SEO | | Gavo0 -
Add versioning to an xml sitemap?
Is there a way to add versioning to an XML sitemap? Something like <version>x.x</version> outside of the <urlset> element? I've looked at a bunch of sitemaps for various sites and don't see anyone adding versioning information, but it seems like it would be a common need - I can't believe someone hasn't come up with some way to do it.
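The sitemap protocol defines no <version> element, and adding a non-standard element would make the file invalid against the schema. One schema-safe approach is an XML comment, which validators and crawlers simply ignore (the version string below is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-generator-version: 2.4.1 -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>
```

This keeps the versioning information in the file for humans and tooling without affecting how search engines parse it.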
Intermediate & Advanced SEO | | ATT_SEO0 -
International Href Lang Tag Parameter Issue
Hey, let's say I'm on the following page: site.com/product-name/product-code/?d=womens. I view the page source and it looks like this.. My question is: should I remove the parameter from the hreflang tag? I just need some clarification that no parameterized page should have a canonical tag and/or hreflang tag pointing at URLs with parameters.
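As a sketch of the usual convention (the hreflang values and the /uk/ path are assumptions for illustration): both the canonical and the hreflang annotations point at the clean, parameter-free URL, even when the page being served carries a parameter:

```html
<!-- In the <head> of site.com/product-name/product-code/?d=womens -->
<link rel="canonical" href="https://site.com/product-name/product-code/" />
<link rel="alternate" hreflang="en-us" href="https://site.com/product-name/product-code/" />
<link rel="alternate" hreflang="en-gb" href="https://site.com/uk/product-name/product-code/" />
```

Since hreflang tags should reference canonical URLs, pointing them at parameterized variants sends search engines conflicting signals.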
Intermediate & Advanced SEO | | ggpaul5620 -
Solving pagination issues for e-commerce
I would like to ask about a technical SEO issue that may cause duplicate content/crawling problems. For pagination, how should rel=canonical, rel="prev"/rel="next" and the noindex tag be implemented? Should all three appear in the same page source? For example, one particular category may have 10 pages of products (product catalogues). Should we noindex page 2 onwards, rel-canonical them back to the first page, and also add rel="prev" and rel="next" to each page so Google can understand they are multiple pages of one series? If we index these multiple pages it will cause duplicate content issues, but I'm not sure whether all 3 tags need adding. It's also my understanding that internal search result pages should be noindexed, as they do not provide much value as an entry point from search engines.
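A sketch of what page 2 of such a category might carry in its <head> (the URLs are hypothetical). Note that rel="prev"/rel="next" is usually paired with a self-referencing canonical rather than a canonical back to page 1, since canonicalizing every page to page 1 declares the deeper pages duplicates and can keep their products from being crawled:

```html
<!-- <head> of https://www.example.com/category?page=2 -->
<link rel="canonical" href="https://www.example.com/category?page=2" />
<link rel="prev" href="https://www.example.com/category?page=1" />
<link rel="next" href="https://www.example.com/category?page=3" />
```

Combining all three signals at once (noindex, canonical to page 1, and prev/next) is generally treated as contradictory rather than complementary.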
Intermediate & Advanced SEO | | Jseddon920 -
How to create XML sitemap for larger website?
We need to create an XML sitemap for a website that has more than 2 million pages. Please suggest the best software for generating an XML sitemap for a website of this size. Since larger websites use different strategies for submitting sitemaps, let me know the best way to submit a sitemap for a site this large. Or is there any tool provided by SEOmoz for XML sitemap generation for larger websites?
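Because the sitemap protocol caps each file at 50,000 URLs, a 2-million-page site needs multiple child sitemaps tied together by a sitemap index, and only the index is submitted. A minimal sketch of generating that index (the file names and base URL are assumptions):

```python
import math

SITEMAP_LIMIT = 50000  # protocol limit: 50,000 URLs per sitemap file


def build_sitemap_index(total_urls, base="https://www.example.com/sitemaps"):
    """Return sitemap index XML listing one child sitemap per 50k URLs."""
    n_files = math.ceil(total_urls / SITEMAP_LIMIT)
    entries = "\n".join(
        f"  <sitemap><loc>{base}/sitemap-{i + 1}.xml</loc></sitemap>"
        for i in range(n_files)
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>"
    )


index = build_sitemap_index(2_000_000)
print(index.count("<sitemap>"))  # 2,000,000 / 50,000 = 40 child sitemaps
```

Each of the 40 child files then holds its own 50,000-URL <urlset>, and the single index file is what gets submitted to the search engines.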
Intermediate & Advanced SEO | | DCISEO0 -
Domain and Sitemap Question
Hi - I am hoping you can help me with this issue we are currently trying to solve. We are hosting our mobile site's content on a different domain than the URL of the site, though both are owned by the same company. In Google Webmaster Tools we have the mobile sitemap under "sitemaps.xyz.com", however the URL of the site is "m.xyz.com". We have submitted 60MM pages in the mobile sitemap, but only 1MM pages have been indexed. Do you think this setup causes confusion for the bots? Does it affect the crawlability of the site? Any thoughts would be greatly appreciated. Thank you!
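The cross-host setup is a likely factor: the sitemap protocol only honors a sitemap hosted on a different host when both hosts are verified in the same Webmaster Tools account, or when the sitemap is declared in the robots.txt of the host whose URLs it lists. A sketch of the robots.txt approach, using the hostnames from the question (the sitemap filename is an assumption):

```
# robots.txt served at http://m.xyz.com/robots.txt
User-agent: *
Sitemap: http://sitemaps.xyz.com/mobile-sitemap.xml
```

Without one of those two signals, URLs on m.xyz.com listed in a sitemap on sitemaps.xyz.com may simply be ignored, which would help explain the 1MM-of-60MM indexation rate.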
Intermediate & Advanced SEO | | ladylana
Eva0 -
How to fix issues regarding URL parameters?
Today, I was reading Google's help article on URL parameters: http://www.google.com/support/webmasters/bin/answer.py?answer=1235687. I learned that Google gives value to URLs whose parameters change or determine the content of a page. There are many pages on my website with similar values for name, price and number of products, but I have restricted all such pages via robots.txt with the following syntax. URLs:
Intermediate & Advanced SEO | | CommercePundit
http://www.vistastores.com/table-lamps?dir=asc&order=name
http://www.vistastores.com/table-lamps?dir=asc&order=price
http://www.vistastores.com/table-lamps?limit=100

Syntax in robots.txt:
Disallow: /?dir=
Disallow: /?p=
Disallow: /*?limit=

Now I am confused. Which is the best solution to get maximum SEO benefit?
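On the robots.txt itself: a pattern such as Disallow: /?dir= only matches query strings on the site root, not /table-lamps?dir=..., because robots.txt rules are path-prefix matches. A leading wildcard is needed to cover parameters on any path (a sketch; worth verifying against your own URL set with a robots.txt tester before deploying):

```
User-agent: *
Disallow: /*?dir=
Disallow: /*?p=
Disallow: /*?limit=
```

An alternative to blocking is to leave the pages crawlable and instead declare the dir, order and limit parameters in Webmaster Tools' URL Parameters settings, or add rel=canonical from the parameterized URLs back to the clean category page, so Google consolidates them rather than being locked out.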