Any downside to a whole bunch of 301s?
-
I'm working with a site that needs a whole bunch of old, deleted pages 301'd to new pages.
My main goal is to capture any external links that currently go off to a 404 page, and to clean up the index. In dealing with this, I may end up 301ing pages that didn't have incoming links, or that may never have really existed in the first place. These links are a mix of http and https.
Is there any potential downside to just 301ing a list of several hundred possible old urls that currently trigger the 404 page?
Thanks! Best... Mike
-
Hi Michael!
I recommend checking out this blog for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960
The video on the blog linked above answers: Is there a limit to how many 301 (Permanent) redirects I can do on a site? How about how many redirects I can chain together?
Other things to watch out for with chained redirects:
- Avoid infinite loops.
- Browsers may also have redirect limits, and these limits can vary by browser, so multiple redirects may affect regular users in addition to Googlebot.
- Minimizing redirects can improve page speed.
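To make the chain and loop warnings above concrete, here's a small sketch (the redirect map is hypothetical, not from any real site) that flattens chains (a → b → c becomes a → c) and flags loops before you deploy a redirect list:

```python
def flatten_redirects(redirects):
    """Collapse redirect chains and flag loops.

    redirects: dict mapping old URL -> new URL.
    Returns (flattened, loops), where flattened maps each old URL
    directly to its final destination, and loops is the set of
    starting URLs caught in an infinite redirect cycle.
    """
    flattened = {}
    loops = set()
    for start in redirects:
        seen = {start}
        target = redirects[start]
        # Follow the chain until we hit a URL with no further redirect.
        while target in redirects:
            if target in seen:  # revisiting a URL means an infinite loop
                loops.add(start)
                break
            seen.add(target)
            target = redirects[target]
        else:
            flattened[start] = target
    return flattened, loops

# Hypothetical example: one two-hop chain and one loop.
rules = {"/old": "/interim", "/interim": "/new", "/a": "/b", "/b": "/a"}
flat, loops = flatten_redirects(rules)
# flat  -> {"/old": "/new", "/interim": "/new"}
# loops -> {"/a", "/b"}
```

Running something like this over your redirect list lets you ship single-hop 301s only, which sidesteps both the chain-length and infinite-loop problems.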
Hope this helps!
-
Thank you to everyone for chipping in their thoughts on this.
Logan, good article. It gave me a new idea and wanted to see what y'all thought.
If my main goal is to not have all these 404s from unpublished pages and to re-direct the incoming link value to pages that could benefit, what would you think of putting up a noindexed page that links to my top pages that I want to give greater authority to? Then, put in a request to de-index those old urls that have the noindexed (duplicate) content. That would mean not firing off a 404, just showing the same content on hundreds of noindexed/deindexed pages. Given your point about re-directs, chained re-directs and speed for mobile, would that do more for me than re-directing all of these old urls to new pages?
Compounding the problem a little, this particular site has a catalog that comes out twice a year where many product pages are constantly being unpublished. So, even if I re-directed the old unpublished pages to existing urls, some of those might be going away and need another re-direct to add to the chain shortly.
Any thoughts on this appreciated. Thanks! Best... Mike
-
301 redirects do have a significant impact on page speed on mobile devices, since mobile users are often on much less reliable networks. Varvy has a great article with more details: https://varvy.com/mobile/mobile-redirects.html
If Google has already indexed all of your new URLs, then you don't need to worry about covering every single one of your old URLs - stick with the ones that had links pointing to them.
A good way to measure how many of your 301 redirects are being used is to append query parameters to the end of the destination URL (example below), setting the src parameter to the referring old URL. This gives you unique identifiers you can filter on in your landing page report in Google Analytics.
/old-page >> /new-page?redir=301&src=/old-page
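As an illustration, a minimal sketch that builds the tagged destination URL (the `redir` and `src` parameter names are just the ones from the example above; any names you can filter on in Analytics would work):

```python
from urllib.parse import urlencode

def tag_redirect(old_url, new_url):
    """Append tracking parameters so redirected visits are identifiable
    in the Google Analytics landing-page report."""
    # safe="/" keeps slashes readable instead of percent-encoding them.
    params = urlencode({"redir": "301", "src": old_url}, safe="/")
    separator = "&" if "?" in new_url else "?"
    return new_url + separator + params

tagged = tag_redirect("/old-page", "/new-page")
# -> "/new-page?redir=301&src=/old-page"
```

One caveat: if the destination pages declare a canonical URL without the parameters, the tracking won't interfere with indexing.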
-
As I understand it, there are two aspects to 301 redirects:
- User experience
- Organic search
Matt Cutts says there is no limit to the number of 301 redirects, unless they are chained together (i.e. start_page > page1 > page2 > proper_page).
I don't expect it will impact site speed much; nothing you couldn't regain with a bit of speed optimisation.
From a user perspective, if you have moved an old page that has high traffic or some good-quality links pointing to it, it is very important to ensure that traffic ends up back on the right page via a 301.
From an organic search perspective (especially Google), if you use a 301, the search engine will eventually update its index to point to the new page.
There are two things you should be aware of:
- By using a 301 from an old page, you could resurrect a bad back link
- A small amount of link authority is lost (only very small)
-
What happens when you have thousands? Is it sensible to remove 301s from, say, two years ago?
-
I generally try to keep redirect lists for my clients under 100. You mentioned you had some links to 404 pages, I'd focus on those and add others as you see fit based on traffic volume to those old pages. I've never actually tested the threshold at which site speed starts to become a problem, I see some experimenting in my future!
-
Hi Logan,
Thanks for the insight. Would a few hundred re-directs be a site speed bummer for a Shopify-hosted site? I've worked on other sites that had decent speed and hundreds of re-directs. Firing off a spitstorm of 404s on URLs that used to be landing pages for links seems sub-optimal as well.
Best... Mike
-
Hi,
You should keep your 301s to a minimum. Every time a URL is requested, the server checks every single redirect you have to see if there's a match. The larger your redirect list gets, the more impact it'll have on site speed.
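Whether that per-request cost is real depends on how the redirects are implemented: a sequential rule list (the way a long .htaccess-style list behaves) is scanned top to bottom on every request, while an exact-match map can be a single constant-time hash lookup. A rough sketch of the difference, with hypothetical URLs:

```python
# Linear scan: every request checks each rule in order, so cost grows
# with the size of the redirect list.
rules = [("/old-%d" % i, "/new-%d" % i) for i in range(500)]

def redirect_linear(path):
    for old, new in rules:  # O(n) per request
        if path == old:
            return new
    return None  # no redirect; serve the page (or a 404)

# Hash map: one constant-time lookup per request, regardless of list size.
redirect_map = dict(rules)

def redirect_lookup(path):
    return redirect_map.get(path)  # O(1) per request
```

So for exact-match redirects, several hundred entries in a map-style lookup is cheap; it's long chains of pattern-matching rules that tend to hurt.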