Sitemaps: Best Practice
-
What should and what shouldn't go in the sitemap?
In particular, what about pages like "subscribe to our newsletter" and "unsubscribe from our newsletter"? Is there really any benefit in highlighting those pages to the search engines?
Thanks for any advice/anecdotes.
-
Sometimes people think that adding a sitemap to their company website is very difficult to do.
For example, they may think they need a web designer to do it for them, yet often you can do it yourself; it's very simple.
If your business has a WordPress website, adding a sitemap can be a piece of cake.
Yoast is a free plugin that generates an XML sitemap for you, and you can then submit that sitemap to Google Search Console for indexing.
We did this for a large garden room company in Bristol, and it helps make sure every single page and blog post gets indexed.
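For reference, Yoast SEO publishes its generated sitemap index at /sitemap_index.xml by default, and you can also point crawlers at it from your robots.txt file (example.com below is a placeholder for your own domain):

```text
# robots.txt – tells crawlers where the sitemap lives
Sitemap: https://example.com/sitemap_index.xml
```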
-
Pages that I like to call 'core' site URLs should go in your sitemap: unique (canonical) pages which are not highly duplicative and which Google would wish to rank.
I would include those core addresses.
I wouldn't include uploaded documents, installers, archives, resources (images, JS modules, CSS sheets, SWF objects), pagination URLs or parameter-based children of canonical pages (e.g. example.com/some-page is fine to rank, but not example.com/some-page?tab=tab3). Parameters are the additional funky stuff added to URLs following a "?" or "&".
There are exceptions to these rules: some sites use parameters to render their on-page content, even for canonical addresses. Those old architecture types are fast dying out, though. If you're on WordPress I would index categories, but not tags, which are non-hierarchical and messy (they really clutter up your SERPs).
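Pruning parameterized children out of a URL list is easy to do by hand on a small site, but at scale a small script helps. A minimal sketch in Python (the URL list is illustrative):

```python
from urllib.parse import urlsplit

def is_clean_canonical(url: str) -> bool:
    """True when a URL carries no query string or fragment,
    i.e. it is not a '?param=value' child of a canonical page."""
    parts = urlsplit(url)
    return parts.query == "" and parts.fragment == ""

urls = [
    "https://example.com/some-page",
    "https://example.com/some-page?tab=tab3",
]
sitemap_candidates = [u for u in urls if is_clean_canonical(u)]
print(sitemap_candidates)  # ['https://example.com/some-page']
```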
Try crawling your site using Screaming Frog. Export all the URLs (or a large sample of them) into an Excel file. Filter the file to see which types of addresses exist on your site and which technologies are being used. Feed Google the unique, high-value pages that you know it should be ranking.
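If you prefer to filter the export programmatically, a hedged sketch in Python follows; the column names ("Address", "Status Code", "Content Type", "Indexability") match a recent Screaming Frog internal_all.csv export and may differ in your version:

```python
import csv

# Reduce a Screaming Frog crawl export to lean sitemap candidates:
# indexable HTML pages returning 200, with no URL parameters.
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    candidates = [
        row["Address"]
        for row in csv.DictReader(f)
        if row["Status Code"] == "200"
        and row["Content Type"].startswith("text/html")
        and row.get("Indexability", "Indexable") == "Indexable"
        and "?" not in row["Address"]  # drop parameter-based children
    ]

for url in candidates:
    print(url)
```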
I have said not to feed pagination URLs to Google, but that doesn't mean they should be completely de-indexed; I just think that XML sitemaps should be pretty lean and streamlined. You can allow things which aren't in your XML sitemap to have a chance of indexation, but if you have used something like a meta noindex tag or a robots.txt edit to block access to a page, **do not** then feed it to Google in your XML. Try to keep **all** of your indexation modules in line with each other!
No page which points to another, separate address via a canonical tag (thus declaring itself 'non-canonical') should be in your XML sitemap. No page that is blocked via meta noindex or robots.txt should be in your XML sitemap either.
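One way to audit that alignment is to fetch each URL in your sitemap and flag conflicts. A rough sketch, assuming the requests and beautifulsoup4 packages and absolute URLs; it inspects the HTML only, not X-Robots-Tag headers or robots.txt rules:

```python
import requests
from bs4 import BeautifulSoup

def sitemap_conflicts(url: str) -> list[str]:
    """Flag pages that should not be in an XML sitemap:
    meta-noindexed pages, and pages whose canonical tag
    points at a different address."""
    problems = []
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    robots = soup.find("meta", attrs={"name": "robots"})
    if robots and "noindex" in robots.get("content", "").lower():
        problems.append("meta noindex")

    canonical = soup.find("link", rel="canonical")
    # Simplified comparison: ignores relative hrefs and trailing slashes.
    if canonical and canonical.get("href") not in (url, url.rstrip("/")):
        problems.append(f"canonical points to {canonical.get('href')}")

    return problems

for page in ["https://example.com/some-page"]:  # illustrative URL
    issues = sitemap_conflicts(page)
    if issues:
        print(page, "->", "; ".join(issues))
```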
If you end up with too many pages, think about creating a sitemap XML index instead, which links through to other, separate sitemap files.
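The index format is defined by the sitemaps.org protocol (each child sitemap is capped at 50,000 URLs and 50 MB uncompressed). A minimal index looks like this, with placeholder file names:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
    <lastmod>2024-11-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-posts.xml</loc>
    <lastmod>2024-11-01</lastmod>
  </sitemap>
</sitemapindex>
```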
Hope that helps!
-
To follow on from this, we have some parameter URLs in our sitemap which make me uneasy. Should url.com/blah.html?option=1 be in the sitemap? If so, what benefit is that giving us?
Related Questions
-
Is single H1 tag still best practice?
Hi guys, is having a single H1 tag still best practice for SEO? Guessing multiple H1 tags dilute the value of the tag and the keywords within the tag. Thoughts? Cheers.
Intermediate & Advanced SEO | Jan 14, 2019, 3:12 PM | kayl870 -
Mass URL changes and redirecting those old URLs to the new. What is the SEO risk and best practices?
Hello good people of the MOZ community, I am looking to do a mass edit of URLs on content pages within our sites. The way these were initially set up, a few years ago, was to make each URL unique by including the date, which can now make evergreen content seem dated. The new URLs would follow a better folder-path-style naming convention and would be way better URLs overall. Some examples of the **old** URLs would be:
https://www.inlineskates.com/Buying-Guide-for-Inline-Skates/buying-guide-9-17-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Kids-Inline-Skates/buying-guide-11-13-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Inline-Hockey-Skates/buying-guide-9-3-2012,default,pg.html
https://www.inlineskates.com/Buying-Guide-for-Aggressive-Skates/buying-guide-7-19-2012,default,pg.html
The new URLs would look like this, which would be a great improvement:
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Kids-Inline-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Inline-Hockey-Skates,default,pg.html
https://www.inlineskates.com/Learn/Buying-Guide-for-Aggressive-Skates,default,pg.html
My worry is that we rank fairly well organically for some of the content and don't want to anger the Google machine. The way I would do it is to edit the URLs to the new layout, then set up the redirects for them and push live. Is there a great SEO risk to doing this? Is there a way to do a mass "Fetch as Googlebot" to reindex these if I do, say, 50 a day? I only see the ability to do 1 URL at a time in the webmaster backend. Is there anything else I am missing? I believe this change would be good in the long run but I do not want to take a huge hit initially by doing something incorrectly. This would be done on 5 to a couple hundred links across various sites I manage. Thanks in advance, Chris Gorski
Intermediate & Advanced SEO | Feb 14, 2018, 1:55 PM | kirin44355 -
Splitting One Site Into Two Sites Best Practices Needed
Okay, working with a large site that, for business reasons beyond organic search, wants to split an existing site in two. So, the old domain name stays and a new one is born with some of the content from the old site, along with some new content of its own. The general idea, for more than just search reasons, is that it makes both the old site and new sites more purely about their respective subject matter. The existing content on the old site that is becoming part of the new site will be 301'd to the new site's domain. So, the old site will have a lot of 301s and links to the new site. No links coming back from the new site to the old site anticipated at this time. Would like any and all insights into any potential pitfalls and best practices for this to come off as well as it can under the circumstances. For instance, should all those links from the old site to the new site be nofollowed, kind of like a non-editorial link to an affiliate or advertiser? Is there weirdness for Google in 301ing to a new domain from some, but not all, content of the old site. Would you individually submit requests to remove from index for the hundreds and hundreds of old site pages moving to the new site or just figure that the 301 will eventually take care of that? Is there substantial organic search risk of any kind to the old site, beyond the obvious of just not having those pages to produce any more? Anything else? Any ideas about how long the new site can expect to wander the wilderness of no organic search traffic? The old site has a 45 domain authority. Thanks!
Intermediate & Advanced SEO | Jul 23, 2016, 8:55 PM | 945010 -
URL Rewriting Best Practices
Hey Moz! I’m getting ready to implement URL rewrites on my website to improve site structure/URL readability. More specifically I want to: Improve our website structure by removing redundant directories. Replace underscores with dashes and remove file extensions for our URLs. Please see my example below: Old structure: http://www.widgets.com/widgets/commercial-widgets/small_blue_widget.htm New structure: https://www.widgets.com/commercial-widgets/small-blue-widget I've read several URL rewriting guides online, all of which seem to provide similar but overall different methods to do this. I'm looking for what's considered best practices to implement these rewrites. From what I understand, the most common method is to implement rewrites in our .htaccess file using mod_rewrite (which will find the old URLs and rewrite them according to the rewrites I implement). One question I can't seem to find a definitive answer to is when I implement the rewrite to remove file extensions/replace underscores with dashes in our URLs, do the webpage file names need to be edited to the new format? From what I understand the webpage file names must remain the same for the rewrites in the .htaccess to work. However, our internal links (including canonical links) must be changed to the new URL format. Can anyone shed light on this? Also, I'm aware that implementing URL rewriting improperly could negatively affect our SERP rankings. If I redirect our old website directory structure to our new structure using this rewrite, are my bases covered in regards to having the proper 301 redirects in place to not affect our rankings negatively? Please offer any advice/reliable guides to handle this properly. Thanks in advance!
Intermediate & Advanced SEO | Nov 12, 2015, 1:16 PM | TheDude -
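Since this question spells out the intended mechanics, here is a minimal .htaccess sketch of the kind of rules it describes, assuming Apache mod_rewrite. The /widgets/ directory and .htm extension come from the question; the patterns are illustrative only and assume the underlying files have been renamed to the dashed format:

```apache
RewriteEngine On

# 1. Drop the redundant /widgets/ directory (301).
RewriteRule ^widgets/(.+)$ /$1 [R=301,L]

# 2. Replace underscores with dashes. This fires once per underscore,
#    so names with several underscores chain multiple 301s; large
#    migrations often generate explicit one-to-one redirects instead.
RewriteRule ^([^_]+)_(.+)$ /$1-$2 [R=301,L]

# 3. Strip the .htm extension (301). Matching against THE_REQUEST
#    prevents a loop with the internal rewrite in step 4.
RewriteCond %{THE_REQUEST} \s/[^\s]+\.htm[\s?]
RewriteRule ^(.+)\.htm$ /$1 [R=301,L]

# 4. Serve extensionless URLs from the real .htm files.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{DOCUMENT_ROOT}/$1.htm -f
RewriteRule ^(.+)$ $1.htm [L]
```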
Image URLs - best practice
Hi - I'm assuming image URL best practice follows the same principles as non-image URLs (not too many files and so on) - I notice a lot of web devs putting photos on subdomains, so I wonder if I'm missing something (I usually avoid subdomains like the plague)!
Intermediate & Advanced SEO | Aug 10, 2015, 4:35 PM | McTaggart -
Should sitemap include https pages?
Hi guys, Trying to figure out some onsite issues I've been having. Would appreciate any feedback on the following 2 questions: My homepage (http://mysite.com) is a 301 redirect to https://mysite.com, which is under SSL. Only 2 pages of my site are https, the rest are http. Should the directory of my sitemap be https://mysite.com/sitemap.xml or should it be kept with http (even though the redirected homepage is to https)? Should my sitemap include the https pages (only 2 pages) as well as the http? Thanks, G
Intermediate & Advanced SEO | Nov 29, 2013, 3:45 PM | G.Anderson -
Best practice for duplicate website content: same root domain name but different extension
Hi there, I have a new client who has two websites:
http://www.bayofislandsteambuilding.co.nz
http://www.bayofislandsteambuilding.org.nz
They are the same in every regard apart from the domain extension (.co.nz & .org.nz), which is likely to be causing them issues with Google ranking given the huge amount of duplicate content. What is the best practice approach to fixing this? Normally, if I was starting from scratch, I would set one of the extensions as an alias which redirects to the main domain. Thanks in advance. Laurie
Intermediate & Advanced SEO | Jul 23, 2013, 11:28 PM | turnbullholdingsltd -
Is it bad to host an XML sitemap in a different subdomain?
Example: sitemap.example.com/sitemap.xml for pages on www.example.com.
Intermediate & Advanced SEO | Jun 14, 2012, 4:03 PM | SEOTGT