Any downside to a whole bunch of 301s?
-
I'm working with a site where a whole bunch of old pages that were deleted need to be 301'd to new pages.
My main goal is to capture any external links that currently land on a 404 page and to clean up the index. In dealing with this, I may end up 301ing pages that didn't have incoming links or may never have really existed in the first place. These links are a mix of http and https.
Is there any potential downside to just 301ing a list of several hundred possible old urls that currently trigger the 404 page?
Thanks! Best... Mike
-
Hi Michael!
I recommend checking out this blog for more insight: http://searchengineland.com/how-many-301s-are-too-many-16960
The video on the blog linked above answers: Is there a limit to how many 301 (Permanent) redirects I can do on a site? How about how many redirects I can chain together?
Other things to watch out for with chained redirects:
- Avoid infinite loops.
- Browsers may also have redirect limits, and these limits can vary by browser, so multiple redirects may affect regular users in addition to Googlebot.
- Minimizing redirects can improve page speed (one way to audit existing chains is sketched below).
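If it helps, here's a rough sketch (my own illustration in Python, using the `requests` library; the URLs in the example list are hypothetical) of one way to check a list of old URLs for chains or loops before you add more redirects:

```python
# Rough sketch: follow each old URL hop by hop and flag chained
# redirects or loops. Assumes the `requests` library is installed;
# the URLs in the example list are hypothetical.
from urllib.parse import urljoin

import requests

MAX_HOPS = 5  # browsers and Googlebot give up after a handful of hops


def audit_redirect_chain(url):
    """Follow redirects manually and report the full chain for one URL."""
    chain = [url]
    current = url
    for _ in range(MAX_HOPS):
        resp = requests.get(current, allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 307, 308):
            break  # reached a final, non-redirect response
        next_url = urljoin(current, resp.headers.get("Location", ""))
        if next_url in chain:
            print("LOOP: " + " -> ".join(chain + [next_url]))
            return chain
        chain.append(next_url)
        current = next_url
    if len(chain) > 2:
        print(f"Chained redirect ({len(chain) - 1} hops): " + " -> ".join(chain))
    return chain


if __name__ == "__main__":
    for old_url in ["https://example.com/old-page", "https://example.com/old-catalog-item"]:
        audit_redirect_chain(old_url)
```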
Hope this helps!
-
Thank you to everyone for chipping in their thoughts on this.
Logan, good article. It gave me a new idea and I wanted to see what y'all thought.
If my main goal is to not have all these 404s from unpublished pages and to re-direct the incoming link value to pages that could benefit, what would you think of putting up a noindexed page that links to my top pages that I want to give greater authority to? Then, put in a request to de-index those old urls that have the noindexed (duplicate) content. That would mean not firing off a 404, just showing the same content on hundreds of noindexed/deindexed pages. Given your point about re-directs, chained re-directs and speed for mobile, would that do more for me than re-directing all of these old urls to new pages?
Compounding the problem a little, this particular site has a catalog that comes out twice a year where many product pages are constantly being unpublished. So, even if I re-directed the old unpublished pages to existing urls, some of those might be going away and need another re-direct to add to the chain shortly.
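To illustrate what I mean about chains piling up: here's a rough sketch (my own, in Python, with made-up paths) of how a redirect map could be periodically flattened so that no old URL points at another redirected URL:

```python
# Rough sketch: collapse a redirect map so no source points at another
# redirected URL. The paths below are made up for illustration.
def flatten_redirects(redirects):
    """Resolve each source to its final destination, avoiding chains."""
    flattened = {}
    for src in redirects:
        seen = {src}
        target = redirects[src]
        # Keep following while the target is itself being redirected.
        while target in redirects:
            if redirects[target] in seen:
                raise ValueError(f"Redirect loop involving {target}")
            seen.add(target)
            target = redirects[target]
        flattened[src] = target
    return flattened


if __name__ == "__main__":
    redirect_map = {
        "/old-product-a": "/product-b",     # product B later unpublished...
        "/product-b": "/products/current",  # ...and redirected again
        "/old-landing": "/new-landing",
    }
    for src, dst in flatten_redirects(redirect_map).items():
        print(f"{src} -> {dst}")
```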
Any thoughts on this appreciated. Thanks! Best... Mike
-
301 redirects do have a significant impact on page speed for mobile devices, since those devices are often connected to much less reliable networks. Varvy has a great article with more details: https://varvy.com/mobile/mobile-redirects.html
If Google has already indexed all of your new URLs, then you don't need to worry about covering every single one of your old URLs - stick with the ones that had links pointing to them.
A good way to measure how many of your 301 redirects are actually being used is to append query parameters to the destination URL (example below), setting the src parameter to the referring (old) URL. This gives you unique identifiers you can filter on in your landing page report in Google Analytics.
/old-page >> /new-page?redir=301&src=/old-page
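As a rough illustration (my own sketch, assuming the redir/src parameter convention from the example above; the paths are made up), you could generate those tagged destination URLs in bulk with something like this:

```python
# Rough sketch: build destination URLs tagged with the old (referring)
# path so redirected visits show up distinctly in your landing page
# report. Parameter names follow the redir/src example above; the
# old/new paths are made up.
from urllib.parse import urlencode


def tagged_target(old_path, new_path):
    """Append redir/src tracking parameters to the destination URL."""
    params = urlencode({"redir": "301", "src": old_path})
    separator = "&" if "?" in new_path else "?"
    return f"{new_path}{separator}{params}"


if __name__ == "__main__":
    redirect_map = {
        "/old-page": "/new-page",
        "/old-catalog/widget": "/products/widget",
    }
    for old, new in redirect_map.items():
        # Note: urlencode percent-encodes the slashes in the src value.
        print(f"{old} >> {tagged_target(old, new)}")
```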
-
As I understand it, there are two aspects to 301 redirects.
- User experience
- Organic search
Matt Cutts says there is no limit to the number of 301 redirects, unless they are chained together (i.e. start_page > page1 > page2 > proper_page).
I don't expect it will impact site speed much; nothing you couldn't regain with a bit of speed optimisation.
From a user perspective, if you have moved an old page that has high traffic or some good quality links pointing at it, it is very important to ensure that traffic ends up on the right page via a 301.
From an organic search perspective (especially Google), if you are using a 301, the search engine will eventually update its own index to point to the new page indicated.
There are two things you should be aware of:
- By using a 301 from an old page, you could resurrect a bad back link
- A small amount of link authority is lost (only very small)
-
What happens when you have thousands? Is it sensible to remove 301s from, say, two years ago?
-
I generally try to keep redirect lists for my clients under 100. You mentioned you had some links pointing to 404 pages; I'd focus on those and add others as you see fit based on traffic volume to those old pages. I've never actually tested the threshold at which site speed starts to become a problem, so I see some experimenting in my future!
-
Hi Logan,
Thanks for the insight. Would a few hundred re-directs be a site speed bummer for a Shopify-hosted site? I've worked on other sites that had decent speed and hundreds of re-directs. Firing off a spitstorm of 404s on urls that used to be landing pages for links seems sub-optimal as well.
Best... Mike
-
Hi,
You should keep your 301s to a minimum. Every time a URL is requested, the server checks every single redirect you have to see if there's a match. The larger your redirect list gets, the more impact it'll have on site speed.
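As a rough illustration of why list size matters (this is my own sketch, not a benchmark of any particular server or of Shopify), compare a rule-by-rule scan against an exact-match lookup table:

```python
# Rough sketch: rule-by-rule matching is linear in the number of rules,
# while an exact-match lookup table costs about the same regardless of
# list size. The rules here are made up purely for illustration.
import re
import timeit

# Hypothetical rules: /old-page-0 ... /old-page-999 -> /new-page-N
rules = [(re.compile(rf"^/old-page-{i}$"), f"/new-page-{i}") for i in range(1000)]
lookup = {f"/old-page-{i}": f"/new-page-{i}" for i in range(1000)}


def match_linear(path):
    """Check the request path against every rule; first match wins."""
    for pattern, target in rules:
        if pattern.match(path):
            return target
    return None


def match_lookup(path):
    """Single exact-match lookup, no matter how long the list is."""
    return lookup.get(path)


if __name__ == "__main__":
    path = "/old-page-999"  # worst case for the linear scan
    linear = timeit.timeit(lambda: match_linear(path), number=1000)
    hashed = timeit.timeit(lambda: match_lookup(path), number=1000)
    print(f"linear scan: {linear:.4f}s for 1000 requests")
    print(f"dict lookup: {hashed:.4f}s for 1000 requests")
```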