301, 404, or 410? What is the best practice?
-
Hi
I'm currently working on a project to correct some really bad practices left over from years of different SEOs.
Basically, they had made around 1,500 pages for delivery counties and towns, changing only 3 words on each page.
Apart from the duplicate content issues, this has really hammered the site in the latest round of Panda updates.
I've pulled the pages, but I'm torn on how best to fix this.
The pages won't ever be used again, so I'm thinking a 410 code would be best, but after reading another post (http://moz.com/community/q/server-redirect-query) I'm not sure whether I should just let them go to 404s if anyone ever finds them.
Incidentally, I'm disavowing over 1,100 root domains, so it's extremely unlikely anyone will find links to these pages out there.
-
Thanks for the responses. A 410 is a lot of work for probably little gain, so I think I'll run with just leaving the 404s.
I ran an Analytics check on the URLs in question and 10 of them had a tiny bit of traffic, so for these only I'll 301 to one relevant page.
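For reference, on an Apache server those few 301s can be set up in .htaccess with mod_alias along these lines (the paths below are made up for illustration; the real ones would be the 10 URLs that still get traffic):

```apache
# Permanently redirect the handful of old location pages that still
# receive traffic (example paths only) to one relevant live page.
Redirect 301 /delivery/london /delivery
Redirect 301 /delivery/manchester /delivery
```

Each rule issues a 301 Moved Permanently, so both users and search engines are sent to the surviving page while the other ~1,490 URLs are left to return 404.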
Thanks again.
-
404 or 410, it doesn't matter much; you have removed the pages, and that is the main thing.
But to be strictly correct you should use a 410, as the pages are gone forever, while a 404 just means "not found."
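As a sketch, the two responses can be configured in Apache's .htaccess, assuming mod_rewrite is available (the URL pattern here is illustrative, not taken from the site in question):

```apache
RewriteEngine On
# Serve 410 Gone for the retired delivery pages (example pattern).
# The G flag forces a 410 response; L stops further rule processing.
RewriteRule ^delivery/ - [G,L]
```

With no rule at all, the removed pages simply fall through to the server's default 404 Not Found, which is why leaving them alone is the zero-effort option.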
-
Hi Paul,
If they are unlikely to have external links pointing to them (or at least no good ones), and they are not linked internally, then I think your best bet is just to let them 404. Either way, you should improve your 404 page to let users know there has been a site redesign (or similar) and provide links to the homepage and other important pages.
You could also 410 them, which is said to remove pages from the index more quickly and to be a final word that these pages no longer exist and will never come back. However, that may create more overhead than it is worth in terms of setting up different 4xx responses for different types of pages. In practice, the difference between 404 and 410 headers seems to be very small, according to most of what I have read. Since there will be no links to them anyway, letting them 404 is an easy solution and should not create any problems.