Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Should 301-ed links be removed from sitemap?
-
In an effort to do some housekeeping on our site, we want to change the URL format for a couple thousand links. Those links will all be 301 redirected to corresponding links in the new URL format. For example, the old URL formats /tag/flowers and /search/flowers will be 301-ed to the new URL format /content/flowers.

Question: Since the old links also exist in our sitemap, should we add the new links to our sitemap in addition to the old links, or replace the old links with the new ones? We just want to make sure we don't lose the rankings we currently have for the old links.

Any help would be appreciated. Thanks!
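For concreteness, a minimal sketch of the rewrite mapping being described, using the example paths above (the helper name is hypothetical):

```python
# Sketch of the old-to-new mapping described above: /tag/<slug> and
# /search/<slug> both 301 to /content/<slug>.
OLD_PREFIXES = ("/tag/", "/search/")

def new_url(old_path):
    """Return the /content/ URL an old path should 301 to, or None."""
    for prefix in OLD_PREFIXES:
        if old_path.startswith(prefix):
            return "/content/" + old_path[len(prefix):]
    return None  # not one of the migrated formats

assert new_url("/tag/flowers") == "/content/flowers"
assert new_url("/search/flowers") == "/content/flowers"
```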
-
I'm going to disagree a little bit with the other commenters. I've done quite a few large-scale redirect projects and I'm not 100% opposed to using a "dirty sitemap" for a short duration. The better option is to leave some internal links pointed at the old URLs. I know what the search engines say, but I also know what I've experienced when it comes to getting 301'd links crawled again.
Read this post by Everett Sizemore for more info on what I'm describing:
http://moz.com/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well
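For illustration, a temporary "dirty sitemap" of the old, redirected URLs could be generated with something like this sketch (the URL list and file name are hypothetical; the file would be removed once the old URLs have been recrawled):

```python
# Sketch: build a temporary sitemap of old (now 301'd) URLs so
# crawlers revisit them and pick up the redirects faster.
from xml.sax.saxutils import escape

def write_sitemap(urls, path):
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls:
            f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        f.write("</urlset>\n")

old_urls = ["https://example.com/tag/flowers",      # hypothetical
            "https://example.com/search/flowers"]
write_sitemap(old_urls, "sitemap-old-urls.xml")
```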
-
"A sitemap should only contain links to active pages."
Hi shawn81
Alex is absolutely correct there.
In fact, Duane Forrester has said repeatedly that Bing absolutely does not like to find such pages in a sitemap, and that you should make sure there are never 3XX, 4XX or 5XX status pages included, because it will stop Bingbot from crawling your site.
While Googlebot is not so sensitive, the reality is that all search engines allocate a certain amount of crawl capacity to your site. If your sitemaps include a load of pages that are not likely to be indexed, the result is twofold:
- you are wasting capacity on useless pages and the crawler may never get to the stuff you really want indexed
- if the crawler encounters a lot of non-active pages when it crawls, future crawl capacity (not to mention trust) is likely to be reduced
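A quick status-code audit along these lines can catch such entries before a sitemap is submitted (a sketch using the third-party requests library; the file name is a placeholder):

```python
# Sketch: report sitemap URLs that don't return a clean 200,
# i.e. exactly the 3XX/4XX/5XX entries Bing objects to.
import xml.etree.ElementTree as ET
import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def audit_sitemap(path):
    tree = ET.parse(path)
    for loc in tree.findall(".//sm:loc", NS):
        url = loc.text.strip()
        # allow_redirects=False so a 301/302 is reported, not followed
        status = requests.head(url, allow_redirects=False, timeout=10).status_code
        if status != 200:
            print(f"{status}  {url}")

audit_sitemap("sitemap.xml")  # hypothetical local copy
```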
Replace the old URLs with the new ones and give the bots a little thrill of adventure!
Hope that helps,
Sha
-
There shouldn't be any 301 links in a sitemap. A sitemap should only contain links to active pages. So in your case, you should remove all the 301 links and replace them with the new links.
A couple of notes: Having 301 links in your sitemap won't hurt your site or SEO unless the sitemap is so huge that you need to split it up into multiple files. But you should really only have the final links in the sitemap; neither people nor bots want to be redirected around. If you properly 301'd, the crawlers will automatically update their links.
Changing links around in the sitemap generally won't hurt your site, especially if the links no longer exist and you're improving the list. There are very few cases where making changes will hurt the site.
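If you'd rather let the redirects themselves supply the new URLs, a small script along these lines can resolve each sitemap entry to its final destination (a sketch using the third-party requests library; file names are placeholders):

```python
# Sketch: replace every redirected URL in a sitemap with its final
# destination by following the 301s the site already serves.
import xml.etree.ElementTree as ET
import requests

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

tree = ET.parse("sitemap.xml")           # hypothetical input
for loc in tree.iter(f"{{{NS}}}loc"):
    resp = requests.head(loc.text.strip(), allow_redirects=True, timeout=10)
    loc.text = resp.url                  # final URL after any redirects
tree.write("sitemap-clean.xml", xml_declaration=True, encoding="UTF-8")
```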
-
We have had a problem with this ourselves. We put a 301 redirect on our domain when we were building a new site (we went from new. to www.) and search engines are still crawling the new. subdomain. Bing Webmaster Tools registers it as an error because they can't find the old site. I would lean toward removing the old URLs from the sitemap, just because your users are probably being redirected somewhere they wouldn't necessarily want to go.
Related Questions
-
How can I stop a tracking link from being indexed while still passing link equity?
I have a marketing campaign landing page that uses a tracking URL to track clicks. The tracking links look something like this: http://this-is-the-origin-url.com/clkn/http/destination-url.com/

The problem is that Google is indexing these links as pages in the SERPs. Of course, when they get indexed and then clicked, they show a 400 error because the /clkn/ link doesn't represent an actual page with content on it. The tracking link is set up to instantly 301 redirect to http://destination-url.com.

Right now my dev team has blocked these links from crawlers by adding Disallow: /clkn/ in the robots.txt file; however, this blocks the flow of link equity to the destination page. How can I stop these links from being indexed without blocking the flow of link equity to the destination URL?
Technical SEO | UnbounceVan0
-
Good alternatives to Xenu's Link Sleuth and AuditMyPc.com Sitemap Generator
I am working on scraping title tags from websites with 1-5 million pages. Xenu's Link Sleuth seems to be the best option for this at this point. Sitemap Generator from AuditMyPc.com seems to be working too, but it starts hanging up when a sitemap file the tool is working on becomes too large. So basically, the second one looks like it won't be good for websites of this size.

I know that Scrapebox can scrape title tags from a list of URLs, but this is not needed, since this comes with both of the above-mentioned tools. I know about DeepCrawl.com also, but this one is paid, and it would be very expensive with this number of pages and websites (5 million URLs is $1,750 per month; I could get a better deal on multiple websites, but this obviously does not make sense to me - it needs to be free, more or less). SEO Spider from Screaming Frog is not good for large websites.

So, in general, what is the best way to work on something like this that is also time-efficient? Are there any other options? Thanks.
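For what it's worth, the core of a DIY title scraper is small; here is a sketch using the third-party requests library plus the standard library (it deliberately ignores the concurrency, rate limiting, and crash recovery a 1-5 million page crawl would actually need; file names are hypothetical):

```python
# Sketch: fetch a list of URLs and record each page's <title> tag.
import csv
import requests
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

with open("urls.txt") as f, open("titles.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for url in (line.strip() for line in f):
        parser = TitleParser()
        parser.feed(requests.get(url, timeout=10).text)
        writer.writerow([url, parser.title.strip()])
```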
Technical SEO | blrs120
-
Using the Google Remove URL Tool to remove https pages
I have found a way to get a list of 'some' of my 180,000+ garbage URLs now, and I'm going through the tedious task of using the URL removal tool to put them in one at a time. Between that and my robots.txt file and the URL Parameters, I'm hoping to see some change each week.

I have noticed that when I put URLs starting with https:// into the removal tool, it adds the http:// main URL at the front. For example, I add to the removal tool: https://www.mydomain.com/blah.html?search_garbage_url_addition On the confirmation page, the URL actually shows as: http://www.mydomain.com/https://www.mydomain.com/blah.html?search_garbage_url_addition I don't want to accidentally remove my main URL or cause problems. Is this the right way this should look?

AND PART 2 OF MY QUESTION: If you see the search description in Google for a page you want removed that says the following in the SERP results, should I still go to the trouble of putting in the removal request? www.domain.com/url.html?xsearch_... A description for this result is not available because of this site's robots.txt – learn more.
Technical SEO | sparrowdog1
-
Removing images from site and Image Sitemap SEO advice
Hello again, I have received an update request where they want me to remove images from this site (as of now it's a bunch of thumbnails). Current page design: http://1stimpressions.com/portfolio/car-wraps/ They want to turn it into a new design which utilizes a slider (such as this): http://1stimpressions.com/portfolio/ They don't want the thumbnails on the page anymore.

My question is: since my site has an image sitemap that has been indexed, will removing all the images hurt my SEO greatly? If so, what steps would you recommend to reduce any SEO damage? Thank you again for your help, always great and very helpful feedback! 🙂 cheers!
Technical SEO | allstatetransmission0
-
301 with nofollow?
Hi, our ecommerce site's link penalty was revoked by Google back on Feb 26th, 2013, but to this day we have not seen any improvement in our rankings. Due to an 80% revenue loss we had to lay off quite a few people to stay alive. The situation now is more dire than ever for our company. We have millions of dollars invested in our business and Google just busted it for some "low quality" or "spammy" links, as they call it.

We want to try to move to a different domain and do a 301 from the old domain to make sure our previous customers can still find us, as a last effort to stay alive. But in doing so we do not want the bad links' juice to flow to our new domain. Can we do a 301 with nofollow, and will that have any negative impact, or any impact at all? Any suggestion is greatly appreciated. Thank you, Nick

We are planning on moving to a different domain after 10 years, and laying off a bunch of people due to loss of revenue.
Technical SEO | orion680
-
Can you have a /sitemap.xml and /sitemap.html on the same site?
Thanks in advance for any responses; we really appreciate the expertise of the SEOmoz community! My question: since the file extensions are different, can a site have both a /sitemap.xml and a /sitemap.html sitting at the root domain? For example, we've already put the HTML sitemap in place here: https://www.pioneermilitaryloans.com/sitemap

Now we're considering adding an XML sitemap. I know standard practice is to load it at the root (www.example.com/sitemap.xml), but am wondering if this will cause conflicts. I've been unable to find this topic addressed anywhere, or any real-life examples of sites currently doing this. What do you think?
Technical SEO | PioneerServices0
-
Host sitemaps on S3?
Hey guys, I run a dynamic web service and I will start building static sitemaps for it pretty soon. The fact that my app lives on a multitude of servers doesn't make it easy to distribute frequently updated static files throughout the servers. My idea was to host the files in AWS S3 and point my robots.txt sitemap directive there. I'll use a sitemap index, so every other sitemap will be hosted on S3 as well.

I could dynamically mirror the content of the files in S3 through my app, but that would be a little more resource-intensive than just serving the static files from a common place. Any ideas? Thanks!
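For illustration, generating the sitemap index that the robots.txt Sitemap: directive would point at could look like this sketch (the bucket and file names are hypothetical):

```python
# Sketch: build a sitemap index whose child sitemaps live on S3.
# robots.txt would then carry a line like:
#   Sitemap: https://my-bucket.s3.amazonaws.com/sitemap-index.xml
child_sitemaps = [
    "https://my-bucket.s3.amazonaws.com/sitemap-1.xml",  # hypothetical
    "https://my-bucket.s3.amazonaws.com/sitemap-2.xml",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
lines += [f"  <sitemap><loc>{url}</loc></sitemap>" for url in child_sitemaps]
lines.append("</sitemapindex>")

with open("sitemap-index.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```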
Technical SEO | tanlup0
-
How to generate a visual sitemap using sitemap.xml
Are there any tools (online preferably) which will take a sitemap.xml file and generate a visual sitemap? It seems like an obvious thing to do, but I can't find any simple tools for this.
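Failing a dedicated tool, a rough "visual" sitemap can be printed straight from the XML with a short script along these lines (a sketch, not a polished tool; the file name is a placeholder):

```python
# Sketch: parse sitemap.xml and print the URL paths as an indented tree.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree_data = {}

# Fold each URL path into a nested dict keyed by path segment.
for loc in ET.parse("sitemap.xml").findall(".//sm:loc", NS):
    parts = [p for p in urlparse(loc.text.strip()).path.split("/") if p]
    node = tree_data
    for part in parts:
        node = node.setdefault(part, {})

def print_tree(node, depth=0):
    for name, children in sorted(node.items()):
        print("  " * depth + "/" + name)
        print_tree(children, depth + 1)

print_tree(tree_data)
```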
Technical SEO | k3nn3dy30