Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Does removal of internal redirects (301) help in SEO?
-
I am planning to manually remove all internal 301 redirects by replacing those links with the actual live pages/links, so there will be no redirects within the website. Will this boost our SEO efforts?
Automatic redirects will remain in place for incoming external links that point to pages which no longer exist.
Thanks,
Satish
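Before replacing links like this, it helps to have a list of the internal links that still point at redirected URLs. A minimal audit sketch, assuming Python 3 with the third-party requests and beautifulsoup4 packages and purely hypothetical URLs, might look like this:

```python
# Minimal sketch: list internal links that currently return a redirect.
# Assumes the third-party "requests" and "beautifulsoup4" packages are installed;
# the site root and page list below are hypothetical placeholders.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

SITE = "https://www.example.com"                  # hypothetical site root
PAGES_TO_AUDIT = [SITE + "/", SITE + "/about/"]   # pages whose outgoing links we check

def internal_links(page_url):
    """Yield absolute URLs of same-host links found on page_url."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        target = urljoin(page_url, a["href"])
        if urlparse(target).netloc == urlparse(SITE).netloc:
            yield target

for page in PAGES_TO_AUDIT:
    for link in set(internal_links(page)):
        # allow_redirects=False so we see the 3xx response itself, not the final page
        status = requests.head(link, allow_redirects=False, timeout=10).status_code
        if status in (301, 302, 303, 307, 308):
            print(f"{page} links to {link} -> HTTP {status} (update this link)")
```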
-
Hi Dirk,
You got it right. All of the pages we redirected point to similar pages, so, as you said, they should probably be okay.
Regarding disavow: what metrics should we use to judge a link? Some links may look good, with decent DA, and still be hurting us. What is the best way to find the backlinks that are actually dragging us down? Disavowing carries risk, since we might reject good links by mistake, so it's better to be sure about the links first.
Thanks,
Satish
-
Hi Satish,
Not sure if I fully understand your answer.
Google will treat a redirect as a soft 404 if you redirect a page to an unrelated page - for example, if you redirect a page about "shirts" to a page about "pants", or if you redirect it to your homepage. If the pages are similar (for example, "green shirts" to "shirts"), it is not treated as a soft 404. I understand that you are redirecting to similar pages, so that should be OK.
If you have pages with low-quality incoming links (or a mix of high- and low-quality links) you can still redirect them, but for the low-quality links it's probably a good idea to disavow them via Search Console - see https://support.google.com/webmasters/answer/2648487?hl=en
Hope this helps,
Dirk
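For reference on the disavow step: the file you upload through Google's disavow tool is plain text, one entry per line, where each line is either a full URL or a whole domain prefixed with domain:, and lines starting with # are comments. A small illustrative example (the domains below are made up):

```text
# Low-quality links we could not get removed manually
domain:spammy-directory.example
domain:link-farm.example
https://blog.example.net/old-widget-roundup.html
```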
-
Hi Dirk,
Thanks for the detailed response - very informative. We have a decent number of redirects from the homepage and top-tier pages too, which is why I decided on this clean-up.
Regarding "auto redirects": we have recently redirected dozens of links as part of a link-reclamation effort to recover backlinks and PageRank/DA. But our rankings dropped significantly after those redirects, even though we cross-checked to make sure the incoming links are relevant to the current pages we point them at. Those external links used to point to pages of ours, but because we deleted many pages during a website redesign and content update, we redirected them to replacement pages. Why does Google treat these as soft 404s when we redirect the non-existing pages to almost identical, highly relevant pages? I suspect some of the links we reclaimed are spammy and have pushed us down. What's your view on this? Thank you.
-Satish
-
Googlebot's crawl will be more efficient if it can go directly to the destination page rather than having to go through a redirect.
There is some discussion about whether 3xx redirects have an impact on PageRank / page authority - Google's official position is that they don't.
Redirects do, however, have an impact on speed (see https://developers.google.com/speed/docs/insights/mobile: "we strongly encourage webmasters to minimize the number, and ideally eliminate redirects entirely" - the context there is mobile, but it applies to all redirects). Then again, for most sites this won't make a huge difference to total load time.
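To see what a redirect chain actually costs on your own pages, you can follow the hops and add up the per-hop response times. A rough sketch, assuming Python with the third-party requests package and a hypothetical URL:

```python
# Rough sketch: show each redirect hop and the approximate time it adds.
# Assumes the third-party "requests" package; the URL is a hypothetical placeholder.
import requests

response = requests.get("https://www.example.com/old-page", allow_redirects=True, timeout=10)

hops = response.history + [response]   # every 3xx hop plus the final response
total = sum(r.elapsed.total_seconds() for r in hops)

for r in hops:
    print(f"HTTP {r.status_code}  {r.url}  ({r.elapsed.total_seconds():.3f}s)")
print(f"{len(response.history)} redirect(s), ~{total:.3f}s total")
```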
I doubt that simply cleaning up your site and removing the internal 301s will give your search traffic a boost, but it is something you need to do from time to time (the same goes for internal 4xx errors) to keep your site in good general health.
Dirk
PS: I am a bit puzzled by the remark about "auto redirects". If you redirect, you must make sure you redirect to a page that is similar to the page that has disappeared; Google treats most other types of redirect as a "soft 404". If the page never existed - like domain.com/kklfjklgjkldfjg - it should return a 404 and not be redirected.
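A quick way to sanity-check this is to request each removed or nonsense URL without following redirects and confirm it answers with the status you intend: a plain 404 (or 410) where no close equivalent exists, and a 301 only where a genuinely similar page does. A small sketch, again assuming Python with the requests package and hypothetical URLs:

```python
# Small sketch: confirm removed or never-existing URLs answer with the intended status.
# Assumes the third-party "requests" package; the URLs are hypothetical placeholders.
import requests

CHECKS = {
    "https://www.example.com/kklfjklgjkldfjg": 404,   # never existed -> plain 404
    "https://www.example.com/green-shirts": 301,      # replaced by a similar page -> 301
}

for url, expected in CHECKS.items():
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    flag = "OK" if status == expected else "CHECK"
    print(f"{flag}: {url} returned {status} (expected {expected})")
```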
Related Questions
-
Need some help understanding SEO - Please help before I lose [pull out] all my hair
I'm new to SEO and am stubbornly trying to educate myself. I have a telescope shop in Canada; it's a small business that we run on the side. We're driving lots of traffic through FB and our outreach programs, but I really want to increase our presence in search. We released a new website back in January and it killed some of our rankings. We're working our way back with a very specific set of efforts on regular SEO:
Metadata and titles (although it seems that's not super relevant)
Building high-quality backlinks and eliminating any spammy backlinks
Rewriting product listings so that they are original content (though I'm not sure how important this is in e-commerce)
Writing high-quality articles and blog posts
Working relevant keywords into our product pages and titles
I understand that good SEO is about pushing on all the levers and trying to make sure that your site is as valuable to the end user as possible. We're making some good progress, but I'm puzzled by the #1 shop in Canada. They don't put any apparent effort into SEO and they still rank #1 on every key product we compete with them on. I've worked with two separate, highly ranked and regarded SEO firms on this and neither has been able to tell me why this other site ranks so highly. Here's a specific example on a popular product that we both sell, the Celestron NexStar 8SE. Here's the link to Telescopes Canada's page for their Celestron 8SE: https://telescopescanada.ca/products/celestron-nexstar-8se-computerized-telescope-11069 Here's a link to the Celestron 8SE page from the manufacturer website: https://www.celestron.com/products/nexstar-8se-computerized-telescope Telescopes Canada has just copied and pasted. There is no original content aside from adding the shipping and return policy to the tab and having some options for selecting accessories on the page. Here is our page: https://all-startelescope.com/products/celestron-nexstar-8se We have higher page authority, higher domain authority, and the keyword analyzer in Moz says that our page is higher quality than the Telescopes Canada page. I can't find a single metric in any tool (Ubersuggest, Moz, Ahrefs, Semrush) that says Telescopes Canada is a better site or has a better NexStar 8SE product page. But they keep ranking ahead of us, right at the top of Google search. Our titles are good, our metadata is good (but I don't think that's been a serious ranking factor for about ten years). Our text is original, it's relevant, and we have healthy internal links to the page. According to Moz's page ranker it's 20 points higher than Telescopes Canada's page. We have invested in some excellent blog content, and we're adding new products to the website so that we rank for more keywords. All of those things are helping, but I fundamentally don't understand why Telescopes Canada is #1 almost across the board on every key product in our market. There is something that I'm not seeing here. Can you see any metric, any tool in your toolbox, that indicates why they rank at the top, or even higher than we do, for these search terms specific to that product:
Celestron NexStar 8SE
NexStar 8SE
Celestron NexStar 8SE Canada
NexStar 8SE Canada
I have a feeling it's something technical that I'm missing, but I'm not sure how obvious it is with two 'professional' firms not finding it. I'd really appreciate any help or insight that you can offer.
Intermediate & Advanced SEO | nkennett
-
I need help in doing Local SEO
Hey guys, I hope everyone is doing well. I am new to the SEO world and I want to do local SEO for one of my clients. The issue is that I do not know how to do local SEO at all, or even where to start. I would appreciate it if anyone could help me or point me to an article or a course on how to do it. Main question: I want my website to show up in the top 3 Google Maps results for different locations (while there is only one actual location). For example, I want to show up for:
online clothing store in new york
online clothing store in los angeles
or... Let's assume that we can ship our product to every other city. I hope that conveys what I mean. I'd appreciate it if you could answer with practical solutions.
Intermediate & Advanced SEO | seopack.org.ofici3
-
Is there any benefit to changing 303 redirects to 301?
A year ago I moved my marketplace website from http to https. I implemented some design changes at the same time, and saw a huge drop in traffic that we have not recovered from. I've been searching for reasons for the organic traffic decline and have noticed that the redirects from http to https URLs are 303 redirects. There's little information available about 303 redirects but most articles say they don't pass link juice. Is it worth changing them to 301 redirects now? Are there risks in making such a change a year later, and is it likely to have any benefits for rankings?
Intermediate & Advanced SEO | MAdeit
-
Should we 301 redirect old events pages on a website?
We have a client that has an events category section that is filled to the brim with past events webpages. Another issue is that these old events webpages all contain duplicate meta description tags, so we are concerned that Google might be penalizing our client's website for this issue. Our client does not want to create specialized meta description tags for these old events pages. Would it be a good idea to 301 redirect these old events landing pages to the main events category page to pass off link equity & remove the duplicate meta description tag issue? This seems drastic (we even noticed that searchmarketingexpo.com is keeping their old events pages). However it seems like these old events webpages offer little value to our website visitors. Any feedback would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
-
How do you 301 redirect URLs with a hashbang (#!) format? We just lost a ton of pagerank because we thought javascript redirect was the only way! But other sites have been able to do this – examples and details inside
Hi Moz, here's more info on our problem, and thanks for reading! We're trying to create 301 redirects for 44 pages on site.com. We're having trouble 301 redirecting these pages, possibly because they are AJAX pages and have hashbangs in the URLs. These are locations pages. The old locations URLs are in the following format: www.site.com/locations/#!new-york and the new URLs that we want to redirect to are in this format: www.site.com/locations/new-york We have not been able to create these redirects using the Yoast WordPress SEO plugin v1.5.3.2. The CMS is WordPress version 3.9.1. The reason we want to 301 redirect these pages is that we have created new pages to replace them, and we want to pass PageRank from the old pages to the new ones; a 301 redirect is the ideal way to pass PageRank. Examples of pages that are able to 301 redirect hashbang URLs include http://www.sherrilltree.com/Saddles#!Saddles and https://twitter.com/#!RobOusbey.
Intermediate & Advanced SEO | DA2013
-
301 redirect subdirectory to new domain
I'm planning on using 301 redirects to spin out a subdirectory of my current website to be its own separate domain. For instance, I currently have a website www.website.com and my writers write tech news at www.website.com/news. Now I want to 301 redirect www.website.com/news to www.technews.com. Will this have any negative impact on SEO? What are some steps that I can take to minimize these impacts?
Intermediate & Advanced SEO | Chris_Bishop
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content and confirm it was indeed removed (as opposed to just recrawling the site and not finding the content anywhere). This really made a lot of sense to me and also struck a personal chord. Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages.
2. We set up permanent 301 redirects for all of them immediately.
3. At the same time, we removed from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).
When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way. I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions:
1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?
2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?
Thank you in advance for your help,
Eric
Intermediate & Advanced SEO | Eric_R
-
Reverse Proxy better than 301 redirect?
Are reverse proxies that much better than 301 redirects? Should I invest the time in doing this? I found out about reverse proxies here: http://www.seomoz.org/blog/what-is-a-reverse-proxy-and-how-can-it-help-my-seo
Intermediate & Advanced SEO | brianmcc