XML Sitemap & Bad Code
-
I've been creating sitemaps with XML Sitemap Generator and downloading them to edit on my PC. The sitemaps work fine when viewed in a browser, but when I download one and open it in Dreamweaver, the URLs don't work when I cut and paste them into the Firefox URL bar. I notice the code is different: for example, an "&" is produced like this... "&amp;". The extra characters are what produce the error.
I was wondering if this is normal, because as I said, the map works fine when viewed online.
-
Thanks, guys! Upon further research, what's happening is "entity escaping", where reserved symbols have to be written as a character code... i.e. & = &amp;, so it's all good.
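For anyone who wants to see the escaping in action, it can be reproduced in a few lines of Python's standard library (a sketch of the general mechanism, not of how XML Sitemap Generator itself is implemented; the URL is made up):

```python
from xml.sax.saxutils import escape, unescape

# A URL with a query string, as you'd type it into the browser bar
raw_url = "http://example.com/page?cat=shoes&page=2"

# What the sitemap file has to contain: the bare '&' becomes '&amp;'
escaped = escape(raw_url)
print(escaped)  # http://example.com/page?cat=shoes&amp;page=2

# What a browser or XML parser turns it back into when reading the file
print(unescape(escaped))  # http://example.com/page?cat=shoes&page=2
```

This is why pasting a URL straight out of the XML source into Firefox fails: you're pasting the escaped form, which only a parser is meant to see.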
-
It's probably normal within Dreamweaver. A browser (or Google's crawler) will read the &amp; back as a plain &, so I'd guess that won't be a problem if you want to submit your sitemap to the search engines.
-
Dreamweaver does funky stuff when you go from visual to code view. Try opening the XML sitemap in Notepad, copy/paste from there instead, and see if you get the same problem.
But based on my experience with that site, you should be fine.
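If you ever build a sitemap entry by hand instead of with a generator, any real XML library will do the escaping for you automatically. A minimal sketch using Python's standard library (the product URL is hypothetical):

```python
import xml.etree.ElementTree as ET

# Root element with the standard sitemap namespace
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
url = ET.SubElement(urlset, "url")
loc = ET.SubElement(url, "loc")

# Assign the URL with a plain '&' -- no manual escaping needed
loc.text = "http://example.com/products?cat=tees&color=blue"

# On serialization, the '&' in <loc> comes out as '&amp;'
xml_out = ET.tostring(urlset, encoding="unicode")
print(xml_out)
```

The takeaway is the same as above: escaping belongs to the serialized file, so edit the raw XML in a plain-text editor and let the parser handle the entities.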