What is the best practice for replacing an old xml sitemap?
-
I have an existing XML sitemap that my website developer loaded, but I don't think it's set up properly.
What is the best practice for replacing an old xml sitemap?
Is there anything I should be concerned about?
-
Depending on the size of your site, I would break your sitemap up into categories for easier monitoring:
- sitemap.xml (a sitemap index that references all of your other sitemaps)
- sitemap-blog.xml
- sitemap-categories.xml
- sitemap-main.xml
- sitemap-videos.xml (if you have any videos, you should create a video sitemap; it uses a different format than a standard sitemap)
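As a sketch of how the files above tie together, a sitemap index looks like this (the example.com URLs and lastmod dates are placeholders; see sitemaps.org for the full schema):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points to one child sitemap file -->
  <sitemap>
    <loc>http://www.example.com/sitemap-main.xml</loc>
    <lastmod>2012-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-blog.xml</loc>
    <lastmod>2012-06-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-videos.xml</loc>
  </sitemap>
</sitemapindex>
```

The index itself is what you submit; the search engines then fetch each child sitemap listed in it.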
When you have them all complete and tested, submit them to both Google and Bing via their Webmaster Tools.
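Besides submitting through Webmaster Tools, you can also point crawlers at your sitemap from robots.txt using the standard Sitemap directive (the domain and filename here are placeholders — use wherever your index actually lives):

```
Sitemap: http://www.example.com/sitemap.xml
```

This helps any crawler discover the sitemap even if you never submit it manually.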
Sitemap Protocol Information: http://www.sitemaps.org
Google's XML Sitemap information: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156184
-
You could use an XML sitemap builder like http://www.xml-sitemaps.com; I just tested it and it seems to work alright.
Make sure you are truthful about how often you update your website with content; this could be daily, weekly, monthly, and so on. Second, I would let the sitemap builder determine the importance of your pages; it should make your home page the most important.
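For reference, update frequency and importance are set per URL inside each sitemap via the changefreq and priority elements. A hypothetical entry (placeholder URL and values) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/blog/my-post</loc>
    <!-- Be honest: how often this page actually changes -->
    <changefreq>weekly</changefreq>
    <!-- Relative importance within your own site; 1.0 is usually the home page -->
    <priority>0.8</priority>
  </url>
</urlset>
```

A good generator fills these in for you, which is why letting it decide is usually the safer choice.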
WARNING: The above service limits you to 500 pages. If your website has more than 500 pages, do not use the sitemap it generates.
You can find other services by searching for "sitemap generator".
The best option would be to use http://code.google.com/p/googlesitemapgenerator/, but it requires configuration and setup on your server.
Good Luck! Hope this helps.