Will thousands of redirected pages have a negative impact on the site?
-
A client site has thousands of pages with unoptimized URLs. I want to change the URL structure to make them a little more search friendly.
Many of the pages I want to update have backlinks and good PageRank, so I don't want to delete them entirely. Changing the URLs on thousands of pages means a lot of 301 redirects. Will thousands of redirected pages have a negative impact on the site?
Thanks,
Dino
-
I've never had a problem creating a large number of redirects on a site before. It happens quite a bit, for instance when a site is moving to a new domain or a new CMS, where it can often be very difficult to recreate exactly the same URL structure.
There's no limit on the number of redirects, just on the number of hops in a chain. If the site has existing redirects in place, you might want to update those as well, so they point directly to the new final destination.
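As a minimal sketch of what "updating existing redirects" looks like in an Apache .htaccess file (the paths here are hypothetical), the goal is to flatten any chain so every legacy URL points straight at its final destination in a single hop:

```apache
# Before: a two-hop chain left over from an earlier migration
#   Redirect 301 /old-page /intermediate-page
#   Redirect 301 /intermediate-page /final-page

# After: both legacy URLs go straight to the final destination
Redirect 301 /old-page /final-page
Redirect 301 /intermediate-page /final-page
```

The content is unchanged; the only difference is that visitors and crawlers hitting /old-page get one 301 instead of two.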
-
I'm doing the same thing with a site I'm rebuilding. The page structure is changing to make it more logical to users and, hopefully, to Google. I've changed the URLs on my site a couple of times over the years, and I've noticed little change in the short term and a considerable boost in the long term.
The site I'm building has hundreds of pages with tons of 301 redirects as well.
-
There is some loss of PageRank with a 301, but it's also about linking in the future. Is it easier for someone to link to example.com/red-apples or example.com/?page_id=53?
I personally don't think it's a waste of time and it certainly helps with UX.
-
I do not know how much an optimized URL is worth, or how much link juice would be lost by redirecting. I wasn't aware that any would be lost. If so, I need to consider whether leaving the pages alone is the best option at this point. I definitely do not want to do more harm than good.
-
You hope to get a tiny bump out of changing the URLs?
But you are going to waste some link juice in redirecting...
and create thousands of .htaccess lines that must be processed...
and you are worried that tons of redirects are going to cause a problem...
How much do you think an optimized URL is worth in the search engines?
I don't think it's worth a whole lot.
-
There won't be any issue, but you should check with your system admin that the server can handle thousands of redirects. Most of the time, removing duplicates from the current site and using regular expressions for the 301 rules will reduce the number of redirects considerably.
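To illustrate the regular-expression point: if the old URLs follow a pattern, one mod_rewrite rule can replace thousands of individual redirect lines. A sketch, assuming a hypothetical old URL scheme of /products.php?id=123 mapping to /products/123:

```apache
RewriteEngine On

# One pattern rule stands in for thousands of one-off redirects:
#   /products.php?id=123  ->  /products/123
RewriteCond %{QUERY_STRING} ^id=(\d+)$
RewriteRule ^products\.php$ /products/%1? [R=301,L]
```

The %1 backreference carries the captured id from the RewriteCond into the target, and the trailing ? strips the old query string from the redirect.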
-
The simple answer is no, but...
Make sure your 301s are no more than 3 hops from the original URL. For example, if you have a page called pageid=44 and it's about red apples, make sure the 301 goes directly to /red-apples (the exact page you want it to land on).
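A minimal .htaccess sketch of that example, assuming the old URL was /index.php?pageid=44 (the exact old path is an assumption here): because mod_alias's Redirect can't match query strings, this uses mod_rewrite to send the old dynamic URL to its final page in a single hop.

```apache
RewriteEngine On

# Old dynamic URL /index.php?pageid=44 goes straight to /red-apples,
# with no intermediate redirects in between
RewriteCond %{QUERY_STRING} ^pageid=44$
RewriteRule ^index\.php$ /red-apples? [R=301,L]
```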
Matt Cutts has a great video on this on WMT.
This should really answer all your questions but if you have any more then please feel free to ask.