Does it really matter to maintain 301 redirects after de-indexing of old URLs?
-
Today I was reading the latest post on the SEOmoz blog: Uncrawled 301s - A Quick Fix for When Relaunches Go Too Well.
It's a very interesting study of 301 redirects and how they help preserve traffic. I work on an eCommerce website and have done similar work on my site, but I'm quite confused about how to manage my 301 redirects.
My website generates new URLs for the following reasons:
- Rewriting dynamic URLs.
- Relaunching the entire website on a different eCommerce platform. [osCommerce to Magento Commerce]
- Renaming a category.
- Moving a product from one category to another.
I'm managing my 301 redirects the old-fashioned way: I pull the URL data from Google Webmaster Tools into an Excel sheet and map each old URL to a specific new one. Now I have 8.5K redirects in my htaccess file, and I'm starting to think that's too many.
Can I remove the old 301 redirects from htaccess or not? That's the big question for me, because not all of these pages are linked from external websites. Google has already de-indexed the old URLs and indexed the new ones. So do I still need to maintain the 301 redirects after Google has processed the change?
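For reference, each of those 8.5K entries is presumably a one-to-one rule along these lines; the paths below are made up for illustration and assume Apache's mod_alias is available:

    # One-to-one 301 redirects in .htaccess - one line per old URL
    Redirect 301 /old-category/old-product.html /new-category/new-product
    Redirect 301 /catalog/discontinued-item.html /new-category/replacement-item

One thing worth noting: Apache's Redirect directive matches only the URL path, not the query string, so the old dynamic osCommerce-style URLs with query strings can only be redirected with mod_rewrite rules that test %{QUERY_STRING}.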
-
I always use a 301 redirect.
2K is a lot of pages. If I can redirect them with a couple of lines of htaccess code, I would do it. If I had to write 2K individual lines and have that huge file scanned for thousands of visitors, I might not do it - provided I was pretty sure there was very little traffic to those pages.
I use lots of folders on my site and that makes these problems easy to solve.
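To illustrate the folder point: when old and new URLs share a folder structure, a single pattern rule can stand in for thousands of individual entries. A minimal sketch with hypothetical folder names:

    RewriteEngine On
    # Redirect an entire old folder to its new home, keeping the rest of the path
    RewriteRule ^old-shop/(.*)$ /shop/$1 [R=301,L]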
-
Hi EGOL. I am migrating a site with 6K pages. About 2K pages are useless old syndication articles with no incoming links; about 50 are old CID versions of forms. These pages won't exist on the new site. Do I delete them simply by not setting up 301 redirects and not including them in any sitemap? Would they become 404s for a while and then disappear from Google's index?
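If those 2K syndication pages have no links and no traffic, you can also answer for them deliberately rather than letting them fall to a generic 404: a 410 (Gone) tells crawlers the removal is intentional. A minimal htaccess sketch with made-up paths:

    # Return 410 Gone for retired pages that have no replacement
    Redirect gone /articles/old-syndicated-story.html
    Redirect gone /forms/old-contact-form.php

Either way, leaving them out of the sitemap and letting them return a 404 or 410 means they should drop out of Google's index over time.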
-
My eCommerce website [www.vistastores.com] doesn't have a problem with changing URLs as such - I rarely change category page or product page URLs. But I am having an issue with the narrow-by-search navigation. If an attribute is removed from narrow-by-search, a hundred pages can turn into 404s because of the dynamic URL structure. That's why I'm setting up 301 redirects from all of those dynamic pages to the category page URLs.
I do want to reduce the number of 301 redirects and broken links on the website, but I'm not able to stop them from accumulating. My worry is that Google may hold back my organic performance because of the ongoing broken links and all the 301 redirects that no longer correspond to anything.
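For the narrow-by-search case, a handful of pattern rules can usually replace one redirect line per dead filter URL. A rough sketch, assuming the filters live in the query string; the parameter and category names here are hypothetical:

    RewriteEngine On
    # If a retired filter parameter (e.g. "finish") appears in the query string,
    # send the visitor back to the plain category page and drop the query string
    RewriteCond %{QUERY_STRING} (^|&)finish= [NC]
    RewriteRule ^(patio-umbrellas|outdoor-lighting)/?$ /$1? [R=301,L]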
-
I have 301 redirects on my sites. Every one that I have done is still out there in htaccess. I am not taking chances on how search engines handle these.
Other than deleting useless pages, I rarely change URLs. If I don't change URLs I don't have to worry about this stuff.
I have not emailed other sites to change the URL in their links. I have only changed the URLs on my own sites. I would worry that asking someone to edit a link might result in a loss... so I am happy with the redirected link.
-
No, I don't want to start a romance with 404s on my website. :)
EGOL and you have made the same point about visitors and backlinks. I feel much better after reading that you maintain 20K redirects - 8.5K isn't so big for me after all.
-
In my view it is important to keep the old redirects, even after Google has replaced the old URLs in its index. With a 301, the relevance earned by the old page is passed on to the new page. The 301 is not only for the Google index - it also serves whatever backlinks your page has.
I don't consider 8.5K to be that big; I have sites with 20K redirects and have never had any problems or restrictions.
Anyway, if you do want to remove the old ones, at least make the 404 page beautiful, which is always a good idea. :D
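On the "beautiful 404" point, pointing Apache at a custom error page is a one-line htaccess change; the file path is just an example:

    # Serve a custom, helpful 404 page instead of the server default
    ErrorDocument 404 /errors/404.html

The page itself can link to the main categories and the site search so stranded visitors still have somewhere to go.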
-
Oh Jesus... This is a strong example of why I've been a big fan of yours ever since my days on the SEO Chat forum.
I get your point. Or we could email the external websites and ask them to edit their hyperlinks... right?
-
Let's say that my website has ten links to your old URLs that deliver hundreds of visitors per day.
What will happen to all of those visitors if you remove the 301 redirects?
Related Questions
-
Old URLs Appearing in SERPs
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects, and in some cases we simply removed the content as there was no place to redirect to. Unfortunately, all these pages still appear in Google's SERPs (not Bing's) - both the 301'd pages and the pages we removed without redirecting. When you click on the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s. We have already resubmitted our XML sitemap, and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to on our domain. We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input:
- Remove the 301 redirects entirely so that visits to those pages return a 404 (much easier) or a 410 (would require some setup/configuration via WordPress). This of course means that anyone visiting those URLs won't be forwarded along, but Google may not drop those pages from the SERPs otherwise.
- Request that Google temporarily block those pages (done via GWMT), which lasts for 90 days.
- Update robots.txt to block access to the redirecting directories.
Thank you. Rosemary
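On the 404/410 option: if the removed (non-redirected) pages sit under known directories, the 410 does not necessarily need WordPress configuration; it can be returned straight from htaccess. A sketch with a hypothetical directory name:

    # Tell crawlers these URLs are gone on purpose (410 rather than 404)
    RedirectMatch 410 ^/non-corporate-content/

Bear in mind that the robots.txt option works against the other two: if Googlebot is blocked from crawling those URLs, it never sees the 301, 404 or 410, and the stale listings tend to linger.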
-
Redirecting old Sitemaps to a new XML
I've discovered a ton of 404s from Google's WMT crawler looking for mydomain.com/sitemap_archive_MONTH_YEAR. There are tons of these monthly archive XMLs. I've used a plugin that for some reason created individual monthly archive XML sitemaps, and now I get 404s. Creating rules for each archive seems a bad solution. My current sitemap plugin creates a single clean one: mydomain.com/sitemap_index.xml. How can I create a redirect rule in the Redirection WP plugin that will redirect any URL that has the 'sitemap' and 'xml' strings in it to my current XML sitemap? I've tried wildcards like mysite.com/sitemap*.*, mysite.com/sitemap ., mysite.com/sitemap(.), and mysite.com/sitemap (.), but none of them got the general redirect to work. Is there a way to make this happen with the WP Redirection plugin? If not, is there an htaccess rule, and what would the code be for it? I'm not very fluent with general redirects in htaccess, unfortunately. Thanks!
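If htaccess is an option, a regex anchored to the archive naming pattern avoids accidentally matching the live sitemap_index.xml and looping it onto itself. A sketch, guessing the archive file names from the 404 reports:

    RewriteEngine On
    # Send any old monthly archive sitemap to the single current sitemap index
    RewriteRule ^sitemap_archive_ /sitemap_index.xml [R=301,L]

The Redirection plugin should be able to do the same job if the rule is saved with its regex option enabled, using a similarly anchored pattern rather than a bare wildcard.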
-
301 redirect: canonical or non-canonical?
Hi, newbie alert! I need to set up 301 redirects for changed URLs on a database-driven site that is to be redeveloped shortly. The current site uses canonical header tags, and the new site will also use canonical tags. Should the 301 redirects map the canonical URL on the old site to the corresponding canonical for the new design, or should they map the non-canonical database URLs, old to new? Given that the purpose of canonicals is to indicate our preferred URL, my guess is that's what I should use. However, how can I be sure that Google (for example) has indexed the canonical in every case? Thanks in anticipation.
-
How to Stop Google from Indexing Old Pages
We moved from a .php site to a Java site on April 10th. It's almost two months later and Google continues to crawl old pages that no longer exist (225,430 Not Found errors, to be exact). These pages no longer exist on the site and there are no internal or external links pointing to them. Google has crawled the site since the go-live, but continues to try to crawl these pages. What are my next steps?
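With no links pointing at them, those old .php URLs will eventually drop out on their own, but answering them with an explicit status usually speeds things up. A rough htaccess sketch - whether to 301 or 410 depends on whether each old page has a real equivalent on the Java site, and the mapping below is purely hypothetical:

    RewriteEngine On
    # If old .php URLs map cleanly onto the new extensionless URLs, 301 them across
    RewriteRule ^(.+)\.php$ /$1 [R=301,L]
    # Otherwise mark retired sections as intentionally gone
    # RedirectMatch 410 ^/old-php-reports/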
-
De-indexing millions of pages - would this work?
Hi all, We run an e-commerce site with a catalogue of around 5 million products. Unfortunately, we have let Googlebot crawl and index tens of millions of search URLs, the majority of which are very thin on content or duplicates of other URLs. In short: we are in deep. Our bloated Google index is hampering our real content's ability to rank; Googlebot does not bother crawling our real content (product pages specifically) and hammers the life out of our servers. Since having Googlebot crawl and de-index tens of millions of old URLs would probably take years (?), my plan is this:
1. 301 redirect all old SERP URLs to a new SERP URL.
2. If the new URL should not be indexed, add a meta robots noindex tag on the new URL.
3. When it is evident that Google has indexed most "high quality" new URLs, disallow crawling of the old SERP URLs in robots.txt.
4. Then remove all old SERP URLs directory-style in the GWT URL Removal Tool.
This would be an example of an old URL: www.site.com/cgi-bin/weirdapplicationname.cgi?word=bmw&what=1.2&how=2
This would be an example of a new URL: www.site.com/search?q=bmw&category=cars&color=blue
I have two specific questions: Would Google both de-index the old URL and not index the new URL after 301 redirecting the old URL to the new URL (which is noindexed), as described in point 2 above? And what risks are associated with removing tens of millions of URLs directory-style in the GWT URL Removal Tool? I have done this before, but then I removed "only" some 50,000 useless "add to cart" URLs. Google itself says that you should not remove duplicate/thin content this way and that using the tool this way "may cause problems for your site". And yes, these tens of millions of SERP URLs are the result of a faceted navigation/search function let loose all too long.
And no, we cannot wait for Googlebot to crawl all these millions of URLs in order to discover the 301. By then we would be out of business. Best regards,
TalkInThePark
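On point 2, a noindex can be attached to the new search URLs from the web server rather than the templates, which keeps it consistent while Google digests the 301s. A sketch assuming Apache 2.4 with mod_headers, matching the /search path in the example above:

    # Add a noindex header to every internal search result page
    <If "%{REQUEST_URI} =~ m#^/search#">
        Header set X-Robots-Tag "noindex, follow"
    </If>
-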
Old URL redirect to New URL
Alright, I did something dumb a year ago and I'm still paying for it. I changed my hyphenated URL to the non-hyphenated version when I redesigned my website. I say it was dumb because I lost most of my link juice even though I did 301 redirects (via the htaccess file) for almost all of the pages I could find in Google's index. Here's my problem: my new site took a huge hit in traffic (down 60%) when I made the change, and even though I've done thousands of redirects, my old site is still showing up in the SERPs and sends much if not most of my traffic. I don't want to take the old site down for fear it will kill all of my traffic. What should I do? Is there a better method I should explore than 301 redirects? Could the other site be affecting my current rank since it's still there? (FYI, both sites are built on the WP platform.) Any help or ideas are greatly appreciated. Thank you! Joe
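For what it's worth, the catch-all for an old domain is normally a host-level rule rather than thousands of per-page entries - a sketch with placeholder domain names, assuming the old hyphenated domain still points at a server you control:

    RewriteEngine On
    # Send every request for the old hyphenated domain to the same path on the new one
    RewriteCond %{HTTP_HOST} ^(www\.)?example-site\.com$ [NC]
    RewriteRule ^(.*)$ http://www.examplesite.com/$1 [R=301,L]

With that in place, the old domain can keep resolving indefinitely; it simply hands every visitor and crawler over to the corresponding new URL, so taking the old site "down" stops being a scary step.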
-
Someone is redirecting their URL to mine
Hello, I have just discovered that a company in Poland, www.realpilot.pl, is directing their domain to ours, www.transair.co.uk. We have not authorised this, and we do not want it. I have contacted the company and the webmaster to get it removed. If you search for the domain name www.realpilot.pl, we (www.transair.co.uk) come up top. My biggest worry is that we will get penalised by Google for this redirect, as it appears to be done using some kind of frame. Does anyone know anything about this kind of thing? Many thanks, Rob Martin
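If realpilot.pl really is showing your pages inside a frame, you can refuse to be framed at the server level - a one-line sketch assuming Apache with mod_headers enabled:

    # Refuse to render inside frames hosted on other domains
    Header always set X-Frame-Options "SAMEORIGIN"

Browsers that honour the header will then leave their frame empty instead of displaying your content.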
-
Why does Google index my IP URL?
Hi guys, a question please. If you search site:112.65.247.14, you can see that Google has indexed our website by its IP address, which could create duplicates of our darwinmarketing.com content pages. I am not quite sure why Google indexes my IP pages alongside the domain pages - I understand this could be because of backlinks, internal links, etc., but I don't see obvious issues there. I have also submitted a request to Google to remove the IP address from the index, but no luck so far. Do you have any other suggestions? I tried the change of address setting in Google Webmaster Tools, but it wasn't allowed, as it said "Restricted to root level domains only". Any ideas? Thank you! Boson
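The usual fix is to 301 the raw IP (and any other stray host names) to the canonical domain so the duplicate listings consolidate. A sketch assuming Apache mod_rewrite, using the host names from the question:

    RewriteEngine On
    # Canonicalise the host: anything not requested as www.darwinmarketing.com gets 301'd there
    RewriteCond %{HTTP_HOST} !^www\.darwinmarketing\.com$ [NC]
    RewriteRule ^(.*)$ http://www.darwinmarketing.com/$1 [R=301,L]

That covers the bare IP 112.65.247.14 as well, so its indexed pages should fold back into the domain over time.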