Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
410 or 301 after URL update?
-
Hi there,
A site I'm working on at the moment has a thousand "not found" errors in Google Search Console (and of course, I'm sure there are thousands more it's not showing us!).
The issue is that a lot of them seem to come from a URL change. The damage has been done, the URLs have been changed and I can't undo that... but as you can imagine, I'm keen to fix as many as humanly possible.
I don't want to go mad with 301s - but for URLs that have external links pointing in, this seems like the best solution?
On the other hand, Google is still reading internal links that simply aren't there anymore. Is it better to hunt down the new page and 301 to it anyway? Or should I 410 and grit my teeth while Google crawls and recrawls it, warning me that the page really doesn't exist?
Essentially, I guess I'm asking: how many 301s are too many before they affect our DA? And what's the best solution for dealing with mass 404 errors - many of which aren't linked to from any other pages anymore?
Thanks for any insights
-
Yeah, of course I can explain more.
The HTTP 410 status code tells Google that you have removed that page for good and that it will never be live again.
So Google will drop that URL from its index and stop crawling it. That said, when Googlebot runs into a lot of these it can start assuming the site is working poorly or that there is some bigger problem.
How can this happen? When you have a massive number of 404s, 301 redirects, 410s or 5xx errors, your site might be downgraded, possibly deindexed, crawled less frequently, or hit with any other penalty you can imagine. Some info about the 410 status code:
HTTP/1.1 status code definitions
Hope it helps.
Best luck.
GR -
That's really helpful thank you.
Based on the videos you sent, I'll keep 301ing these pages to take the user to the right place!
When you say 410s are "too powerful", can you elaborate? I've 410ed some old blog posts and non-existent pages (with no updated page to redirect users to). Would you recommend something else?
-
Hi Fubra!
There are some things to say here:
- Google can follow up to 5 redirect hops in a chain, so if you stay under that number, you are fine (see the sketch just after this list).
- Serving 404s to Google is not wrong, nor will it negatively impact your rankings. But only give Google a 404 if that is the correct answer for the user.
- Googlebot re-crawls, from time to time, every URL it has discovered in your website's lifetime. The only URLs it will not re-crawl are the ones where you served a 410. HANDLE WITH CARE: 410s are very powerful.
- Redirects and DA are different things. I don't know the new Mozscape algorithm in detail, but I do not think redirects would negatively impact your DA. Focus on not making Googlebot angry and on not closing doors.
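To make the redirect-hop point concrete, here is a minimal .htaccess sketch with invented example paths: each hop in a chain costs Googlebot an extra request, so it is worth pointing old URLs straight at their final destination rather than daisy-chaining them.

# A chain: /oldest-page -> /older-page -> /newest-page (two hops).
Redirect 301 /oldest-page /older-page
Redirect 301 /older-page /newest-page
# Better: collapse the chain so each old URL points at the final target.
# Redirect 301 /oldest-page /newest-page
# Redirect 301 /older-page /newest-page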
That said, my advice is: crawl, scrape, or manually gather all those old URLs and analyze them. Then decide, URL by URL, whether to 301, 404, or 410 each one (a rough sketch of what that sorting can look like is below). Also, I'd spend a little time creating a really useful, interesting, and user-friendly 404 page, so that if a user lands there they can stay on your site and you avoid bounces.
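As a minimal sketch, assuming an Apache server and invented example paths (the same logic applies in nginx or at the application level), the sorting might look like this:

# Old URL with a direct replacement: 301 it to the new page.
Redirect 301 /old-blue-widgets /widgets/blue
# Old page that is gone for good, with no sensible replacement: serve a 410.
Redirect gone /discontinued-widgets
# Anything else that no longer exists falls through to a 404; point both
# 404s and 410s at a friendly, useful error page (the status code is kept).
ErrorDocument 404 /404.html
ErrorDocument 410 /404.html

The point is simply that every dead URL gets a deliberate answer rather than a default error.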
To back up what I'm saying about how many redirects are OK, Matt Cutts covered it in these videos:
Can too many redirects from a single URL have a negative effect on crawling?
Is there a limit to how many 301 (Permanent) redirects I can do on a site?
Hope it helps.
Best Luck.
GR
Related Questions
-
301 Redirect in breadcrumb. How bad is it?
Hi all, How bad is it to have a link in the breadcrumb that 301 redirects? We had to create some hidden category pages in our ecommerce platform BigCommerce to create a display on our category pages in a certain format. Though the category page was set to not visible in the BigCommerce admin, the URL still showed in the live site breadcrumb. So, we set a 301 redirect on it so it didn't produce a 404. However, we have lost a lot of SEO ground over the past few months. Could this be why? Is it bad to have a 301 redirect in the breadcrumb?
-
Old URL that has been 301'd for months appearing in SERPs
We created a more keyword-friendly URL with dashes instead of underscores in December. That new URL is in Google's index and has a few links to it naturally. The previous version of the URL (with underscores) continues to rear its ugly head in the SERPs, though when you click on it you are 301'd to the new URL. The 301 is implemented correctly and checked out on sites such as http://www.redirect-checker.org/index.php. Has anyone else experienced such a thing? I understand that Google can use its discretion on pages, title tags, canonicals, etc.... But I've never seen them continue to show, months after discovery, an old URL that has been 301'd to a new one.
-
301 Redirecting from domain to subdomain
We're taking on a redesign of our corporate site on our main domain. We also have a number of well-established, product-based subdomains. There are a number of content pages that currently live on the corporate site, rank well, and bring in a great deal of traffic, and we are considering putting 301 redirects in place to point that traffic to the appropriate pages on the subdomains. If redirected correctly, can we expect the SEO value of the content pages currently living on the corporate site to transfer to the subdomains, or will we be negatively impacting our SEO by transferring this content from one domain to multiple subdomains?
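For what it's worth, the mechanics of that kind of move are straightforward; a minimal sketch, assuming the corporate site runs Apache and using invented example paths, would be:

# On the corporate (main-domain) server: send each moved content page
# to its new home on the relevant product subdomain.
Redirect 301 /guides/widget-setup https://widgets.example.com/guides/setup
Redirect 301 /guides/gadget-faq https://gadgets.example.com/faq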
-
If I block a URL via the robots.txt - how long will it take for Google to stop indexing that URL?
-
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection) but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs and they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is... should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
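As a hedged aside on the mechanics: if the injected URLs share a recognizable pattern, some site owners serve them a 410 (Gone) instead of a 404, on the theory that an explicit "gone" signal encourages Google to drop them from the index sooner. A sketch for Apache, with a purely hypothetical URL pattern:

# Hypothetical pattern for the injected spam URLs; adjust the regex to
# match the actual hack before using anything like this.
RedirectMatch gone ^/cheap-pills-.*$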
-
Changing URL structure of date-structured blog with 301 redirects
Howdy Moz, We've recently bought a new domain and we're looking to change over to it. We also want to change our permalink structure. Right now, it's a WordPress site that uses the post date in the URL. As an example: http://blog.mydomain.com/2015/01/09/my-blog-post/ We'd like to use mod_rewrite to change this, using regular expressions, to: http://newdomain.com/blog/my-blog-post/ Would this be an appropriate solution? RedirectMatch 301 /.*/.*/.*/(.*) /blog/$1
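A hedged sketch of that kind of rule, assuming Apache's mod_alias and the example domains above (anchoring the date segments with \d makes it less likely to catch URLs you didn't intend):

# Old: http://blog.mydomain.com/2015/01/09/my-blog-post/
# New: http://newdomain.com/blog/my-blog-post/
# Placed on the old blog.mydomain.com host.
RedirectMatch 301 ^/\d{4}/\d{2}/\d{2}/(.*)$ http://newdomain.com/blog/$1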
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website. I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda. (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl those URLs and confirm the content was removed (as opposed to just recrawling the site and not finding the content anywhere). This really made lots of sense to me and also struck a personal chord… Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
1. We cut the pages
2. We set up permanent 301 redirects for all of them immediately.
3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages). When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way… I see that this is basically the exact opposite of Dr. Pete's advice and the opposite of what Kerry22 did in order to get a recovery, and meanwhile here we are still trying to help our site recover. We feel that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers to the following questions: 1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this?
Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)? 2. If there’s a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did? Thank you in advance for your help,
Eric -
Multiple URLs for the same page
I am working with a client and recently discovered that they have several URLs that go to the same page. http://www.maps.com/FunFacts.aspx
http://www.maps.com/funfacts.aspx
http://www.maps.com/FunFacts.aspx?nav=FF
http://www.maps.com/FunFacts.aspx?nav=FS
http://www.maps.com/funfacts.aspx?nav=FF
http://www.maps.com/funfacts.aspx?nav=ff
http://www.maps.com/FunFacts.aspx?nav=MS
http://www.maps.com/funfacts.aspx?nav=
http://www.maps.com/FunFacts.aspx?nav=FF#
http://www.maps.com/FunFacts
http://www.maps.com/funfacts.aspx?.nav=FF
I am afraid this is happening all over the site. So, my question is: is this hurting the SEO, and how? If so, what is the best way to go about fixing this problem? Thanks for your help!
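As an aside on the mechanics (the site in that question runs ASP.NET, so this Apache syntax is purely illustrative): one common approach is to 301 the capitalized variant onto a single lowercase URL and handle the ?nav= tracking variants with a rel="canonical" tag on the page itself.

RewriteEngine On
# Illustrative only: fold the capitalized variant onto the lowercase URL.
# The query string (e.g. ?nav=FF) is carried over unchanged.
RewriteRule ^FunFacts\.aspx$ /funfacts.aspx [R=301,L]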