Will thousands of redirected pages have a negative impact on the site?
-
A client site has thousands of pages with unoptimized URLs. I want to change the URL structure to make them a little more search-friendly.
Many of the pages I want to update have backlinks pointing to them and good PR, so I don't want to delete them entirely. If I change the URLs on thousands of pages, that means a lot of 301 redirects. Will thousands of redirected pages have a negative impact on the site?
Thanks,
Dino
-
I've never had a problem creating a large number of redirects on a site before. It's something that happens quite a bit, for instance when a site is moving to a new domain or a new CMS, where it can often be very difficult to recreate exactly the same URL structure.
There's no limit to the number of redirects, just to the number of hops in a chain. If the site already has redirects in place, update those as well so they point directly at the new final destination rather than chaining through the old URLs.
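For instance, here's a minimal .htaccess sketch (assuming Apache with mod_alias; the URLs are made up) of collapsing an existing chain so each legacy URL resolves in a single hop:

```apache
# Before: a two-hop chain.
#   Redirect 301 /old-page     /interim-page
#   Redirect 301 /interim-page /new-page

# After: both legacy URLs point straight at the final destination.
Redirect 301 /old-page     /new-page
Redirect 301 /interim-page /new-page
```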
-
I'm doing the same thing with a site I'm rebuilding. The page structure is changing to make it more logical to users and, hopefully, to Google. I've changed the URLs on my site a couple of times over the years; I haven't noticed much change in the short term, but I have seen a considerable boost in the long term.
The site I'm building has hundreds of pages with tons of 301 redirects as well.
-
There is some loss of PR with a 301, but it's about future linking too. Which is easier for someone to link to: example.com/red-apples or example.com/?page_id=53?
I personally don't think it's a waste of time and it certainly helps with UX.
-
I do not know how much an optimized URL is worth. I also do not know how much link juice would be lost by redirecting; I wasn't aware that any would be lost. If so, then I need to consider whether leaving them alone is the best option at this point. I definitely do not want to do more harm than good.
-
You hope to get a tiny bump out of changing the URLs?
But you are going to waste some link juice in redirecting...
and create thousands of .htaccess lines that must be processed...
and you are worried that tons of redirects are going to cause a problem...
How much do you think an optimized URL is worth in the search engines?
I don't think that it's worth a whole lot.
-
There won't be any issue, but you should check with your system admin that the server can handle thousands of redirects. Most of the time, removing duplicate URLs from the current site and using regular expressions for the 301 redirects will reduce the number of individual redirect rules considerably.
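As a hedged sketch of both ideas, assuming Apache (all paths here are hypothetical): one pattern-based rule can stand in for thousands of one-to-one lines, and for very large one-off lists a RewriteMap lookup table keeps the rule set to a few lines:

```apache
# One regex rule instead of thousands of individual lines:
# /old-category/<anything>  ->  /new-category/<anything>
RedirectMatch 301 ^/old-category/(.*)$ /new-category/$1

# For very large one-to-one lists, a keyed lookup table scales better.
# (RewriteMap must go in the server/vhost config, not in .htaccess.)
RewriteEngine On
RewriteMap legacy "txt:/etc/apache2/redirects.map"
RewriteCond ${legacy:%{REQUEST_URI}} !^$
RewriteRule ^ ${legacy:%{REQUEST_URI}} [R=301,L]
```

Here redirects.map is just one `old-path new-path` pair per line; Apache caches the parsed file until it changes.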
-
The simple answer is no, but...
Make sure your 301s don't chain more than three hops from the original URL; ideally each one goes straight to the final page. For example, if you have a page called ?pageid=44 and it's about red apples, make sure the 301 goes directly to /red-apples (the exact page you want it to land on).
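Note that mod_alias's plain Redirect can't match query strings, so a redirect like that needs mod_rewrite. A minimal sketch, assuming Apache and the made-up ID above:

```apache
RewriteEngine On
# Send the old query-string URL straight to the final page in one hop;
# the trailing "?" on the target drops the old query string.
RewriteCond %{QUERY_STRING} ^pageid=44$
RewriteRule ^/?$ /red-apples? [R=301,L]
```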
Matt Cutts has a great video on this on the Google Webmasters channel.
This should really answer all your questions but if you have any more then please feel free to ask.
Related Questions
-
If I deindex a page then will Google stop counting those links pointing to it?
Hey everyone, I am deindexing some posts on my website as I think they are not providing any value to users. My question is: if I deindex a post that has some good-quality links pointing to it, will Google stop counting those links for my website?
-
How to speed up the transition towards new 301-redirected landing pages?
Hi SEOs, I have a question about moving local landing pages from many separate pages towards integrating them into a search results page. Currently we have many separate local pages (e.g. www.3dhubs.com/new-york). For both scalability and conversion reasons, we'll integrate our local pages into our search page (e.g. www.3dhubs.com/3d-print/Bangalore--India).
Implementation details: To mitigate the risk of a sudden organic traffic drop, we're currently running a test on just 18 local pages (Bangalore is 1 of the 18). We applied a 301 redirect from the old URLs to the new URLs 3 weeks ago. Note: we didn't yet update the sitemap for this test (technical reasons) and will only do so once we 301 redirect all local pages. For the 18 test pages I manually told the crawlers to index them in Webmaster Tools. That should do, I suppose.
Results so far: the old URLs of the 18 test cities are still generating > 99% of the traffic, while the new pages are already indexed (see: https://www.google.nl/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:www.3dhubs.com/3d-print/&start=0). Overall organic traffic on the test cities hasn't changed.
Questions: 1. Will updating the sitemap for this test have a big impact? Google has already picked up the new URLs, so that's not the issue. Furthermore, the 301 redirect on the old pages should tell Google to show the new page instead, right? 2. Is it normal for search impressions to shift slowly from the old pages towards the new pages? How long should I expect it to take before the new pages are consistently shown over the old pages in the SERPs?
-
How to handle individual page redirects on Wix?
I switched from one domain to another because I wanted a domain that had our company name in it, so it was more brand-y. However, the old domain had better DA/PA. Originally I set up a global 301 from the old domain to the new one, but now I'm finding that I actually need to set up individual 301s from each URL of the old site, or at least from each page. However, I am using Wix, so it looks like I can't always do URL-to-URL 301s, although I can redirect any URL to a page on the new website. The problem is that, in some cases, the content on the new site is different (for example, I can only link a particular blog post on the old site back to the new site's blog's main page). How closely do URLs/pages need to resemble each other for link juice to be transferred? Also, should I try to set up all these redirects manually, or bite the bullet and go back to using the old domain? The problem is that I did a lot of beginner SEO junk for the new domain, like submitting to a few higher-quality directories, getting our website on various industry resource sites, etc. I'd need to redo this entirely if I went back to the old domain. What do you think?
-
301 redirect for page 2, page 3 etc of an article or feed
Hey guys, we're looking to move a blog feed we have to a new static URL page. We are using 301 redirects, but I'm unsure of what to do regarding page 2, page 3, etc. of the feed. How do I make sure those URLs are being redirected as well? For example: moving FloridaDentist.com/blog/dental-tips/ to a new page URL, FloridaDentist.com/dental-tips. We are using a 301 on that old URL to the new one. My question is what to do with the other pages, like FloridaDentist.com/blog/dental-tips/page/3. How do we make sure that page is also 301'd to the new main URL?
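One possible approach, sketched under the assumption that the site runs on Apache with mod_rewrite (the paths come from the question itself):

```apache
RewriteEngine On
# Redirect the main feed URL and every paginated variant
# (/blog/dental-tips/page/2, /page/3, ...) to the new static page.
RewriteRule ^blog/dental-tips(/page/\d+)?/?$ /dental-tips [R=301,L]
```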
-
Is it OK to add snippets of information taken from other sites on product pages?
Hello here, I own an e-commerce website that sells digital sheet music, and I would like to enrich my product pages with short references to artists/composers related to the product, taken from external websites: mentions, fresh news, information taken from related videos, cross-references, etc. In other words, I'd like to provide our users with a kind of informational content that our competitors are currently not offering. We could also think of this as providing aggregate content on product pages to enrich the user's experience by giving them more information about the product. What do you think are the risks or benefits of such an approach? And if there are risks, how do we avoid or tackle them? Any thoughts are very welcome! Thank you in advance to anyone.
-
Will an AddThis app contribute to on-page link volume?
My crawl reports are saying there are too many links on the page. Could this be due to the AddThis app on the page?
-
How to evaluate and compare sites of different cost and authority for linkbuilding impact
I work for a number of clients on linkbuilding campaigns, and I follow and recommend to clients a white-hat "quality" linkbuilding approach. This has achieved great results for my clients, but I often get questions such as: Q. What is better for us: £100 for a MozRank 4 placement or £200 for a MozRank 5 placement? I am trying to build a numeric way of comparing the cost of placing a link (mainly via articles) against the MozRank/authority of the site it's placed on. The key to working this out, though, is knowing how much more value there is between the different quality levels of sites, and which key factors to build into the formula. Ideally the output would be a linkbuilding effect cost that shows the cost per impact of a placement. A highly simplified formula (that wouldn't work) would therefore be: Cost / MozRank. If we take MozRank as being 8 times higher between levels, how does this correlate with the value of a link from sites at those different levels? We know that we will never have a fully correct formula for this, but we are striving for a formula that helps us plan and evaluate different site opportunities at different costs. I would love to know anybody's thoughts on this. Thanks
-
Working out exactly how Google is crawling my site if I have loooots of pages
I am trying to work out exactly how Google is crawling my site, including its entry points and its path from there. The site has millions of pages and hundreds of thousands indexed. I have simple log files with a timestamp and the URL Googlebot was on. Unfortunately there are hundreds of thousands of entries even for one day, and as it is a massive site I am finding it hard to work out the spider's paths. Is there any way, using the log files and Excel or other tools, to work this out simply? Also, I was expecting the bot to go through each level almost instantaneously, e.g. main page -> category page -> subcategory page (expecting the same timestamp), but this does not appear to be the case. Does the bot follow a path right through to the deepest level it can reach (or is allowed to) for that crawl, and then return to the higher-level category pages at a later time? Any help would be appreciated. Cheers