Best practice for removing pages
-
I've got some crappy pages that I want to delete from a site.
I've removed all the internal links to those pages and resubmitted new sitemaps that no longer include them, but the pages are still indexed in search (as you would expect).
My question is, what's the best practice for removing these pages?
Should I just delete them and be done with it, or make them 301 redirect to a nicer generic page until they drop out of the search results?
-
Thanks for all the responses.
I think I'll opt for a server-side 301 to a suitable page.
-
An alternative for removing them from search results, besides a 301, is a meta robots noindex tag. You would do this if you do not want them passing any link value, as 301'd pages do, which it doesn't sound like you would want them to.
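For reference, the noindex directive Dan mentions is just a meta tag placed in the `<head>` of each page you want dropped from the index. A minimal sketch:

```html
<!-- Tells search engine crawlers not to index this page.
     Links on the page are still followed by default. -->
<meta name="robots" content="noindex">
```

Note that crawlers can only obey this tag if they are allowed to fetch the page, so it works on a live page rather than a deleted one.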
-Dan
-
Hi Shelly
If the pages are 'crappy' as you describe, then it's a good idea to delete them, as poor-quality pages can negatively affect the site overall.
If you are going to delete the pages, then yes, cisaz is right: ensure they are 301 redirected to a relevant page, both to retain some of the link benefit and to give direct visitors to those pages somewhere to go other than a 404 error page.
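On an Apache server, the 301s can be set up in an `.htaccess` file. A minimal sketch, with hypothetical paths standing in for the actual deleted and replacement URLs:

```apache
# Permanently (301) redirect each deleted page to a relevant live page.
# These paths are placeholder examples, not real URLs from the site.
Redirect 301 /old-page-1.html /relevant-page/
Redirect 301 /old-page-2.html /relevant-page/
```

Other servers (nginx, IIS) have equivalent directives; the key point is that the response status is 301 (permanent), not 302.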
Also, request the removal of the deleted pages' URLs within Google Webmaster Tools to speed up their de-indexing. (You'll need a noindex robots meta tag on the pages, and/or a Disallow rule in the robots.txt file.)
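A robots.txt Disallow rule looks like the following sketch (again with placeholder paths):

```text
# Hypothetical robots.txt rules blocking the deleted URLs from crawling;
# pair these with a URL removal request in Google Webmaster Tools.
User-agent: *
Disallow: /old-page-1.html
Disallow: /old-page-2.html
```

One caveat: a Disallow rule stops crawlers from fetching the page at all, so they will never see a meta noindex tag on it. Use one approach or the other for a given URL rather than expecting both to apply.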
Regards
Simon
-
I would go with your second idea: "delete them and be done with it or make them 301 redirect to a nicer generic page until they are removed".
You don't want to lose any link juice you may have.
Hope this helps. ; )