Best practice for removing pages
-
I've got some crappy pages that I want to delete from a site.
I've removed all the internal links to those pages and resubmitted new sitemaps that no longer include them; however, the pages are still indexed in search (as you would expect).
My question is, what's the best practice for removing these pages?
Should I just delete them and be done with it, or make them 301 redirect to a nicer generic page until they are removed from the search results?
-
Thanks for all the responses.
I think I'll opt for a server-side 301 to a suitable page.
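In case it helps anyone else, here's a minimal sketch of what that server-side 301 could look like in an Apache .htaccess file (the paths here are placeholders, not my actual URLs):

```
# Permanently redirect a removed page to a relevant replacement page
Redirect 301 /old-crappy-page/ /suitable-generic-page/
```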
-
An alternative for removing them from search results, besides a 301, is a meta robots noindex tag. You would use this if you do not want the pages passing any link value on to redirect targets, and from your description it doesn't sound like you would.
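For example, something like this in the head of each page you want dropped (a sketch; adjust to your templates):

```
<!-- Remove this page from search results without redirecting,
     so no link value is passed on to another page -->
<meta name="robots" content="noindex">
```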
-Dan
-
Hi Shelly
If the pages are 'crappy' as you describe, then it's a good idea to delete them, as poor-quality pages can negatively affect the site overall.
If you are going to delete the pages, then yes, cisaz is right: ensure they are 301 redirected to a relevant page, both to retain some of the link benefit and to give direct visitors to those pages somewhere to go other than a 404 error page.
Also, request removal of the deleted pages' URLs within Google Webmaster Tools to speed up their de-indexing. (This will need a noindex robots tag on the pages, and/or a Disallow rule in the robots.txt file; a sketch of the latter is below.)
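For example, a minimal robots.txt rule, assuming the deleted pages lived under a common path ('/old-pages/' is just a placeholder):

```
# Block crawlers from the removed section (placeholder path)
User-agent: *
Disallow: /old-pages/
```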
Regards
Simon
-
I would go with your second idea: make them 301 redirect to a nicer generic page until they are removed from the search results.
You don't want to lose any backlink juice you may have.
Hope this helps. ; )
Related Questions
-
Google webcache of product page redirects back to product page
Hi all – I've legitimately never seen this before, in any circumstance. I just went to check the Google webcache of a product page on our site (I was just grabbing the last indexation date) and was immediately redirected away from Google's cached version BACK to the site's standard product page. I ran a status check on the product page itself and it returned 200, then ran a status check on the webcache version and sure enough, it registered as redirected. It looks like this is happening for ALL indexed product pages across the site (several thousand), and though organic traffic has not been affected, it is starting to worry me a little bit. Has anyone ever encountered this situation before? Why would a Google webcache possibly have any reason to redirect? Is there anything to be done on our side? Thanks as always for the help and opinions, y'all!
Intermediate & Advanced SEO | TukTown1
-
Is it best practice to have canonical tags on all pages?
The website I'm working on has no canonical tags. There is duplicate content, so rel=canonical tags need adding to certain pages, but is it best practice to have a tag on every page?
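For clarity, a tag on every page would mean a self-referencing canonical like this sketch (example.com and the path are placeholders):

```
<!-- On https://www.example.com/some-page/ itself (self-referencing canonical) -->
<link rel="canonical" href="https://www.example.com/some-page/">
```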
Intermediate & Advanced SEO | ColesNathan0
-
Page Authority
Hi
We have a large number of pages, all sitting within various categories. I am struggling to rank a level-3 page, for example, or to increase the authority of such a page. Apart from putting it in the main menu or trying to build quality links to it, are there any other methods I can try? We have so many pages that I find it hard to work out the best way to internally link these pages for authority. At the moment they're classified in their relevant categories, but these go from level 1 down to level 4. Is this too many classification levels?
Intermediate & Advanced SEO | BeckyKey1
-
We are switching our CMS local pages from a subdomain approach to a subfolder approach. What's the best way to handle this? Should we redirect every local subdomain page to its new subfolder page?
We are looking to create a new subfolder approach within our website versus our current subdomain approach. How should we go about handling this properly, so as not to lose everything we've worked on up to this point using the subdomain approach? Do we need to redirect every subdomain URL to the new subfolder page? Our current local pages subdomain set-up: stores.websitename.com. How we plan on adding our new local subfolder set-up: websitename.com/stores/state/city/storelocation. Any and all help is appreciated.
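For example, would a blanket 301 like this Apache sketch be the right idea, assuming our old paths map one-to-one onto the new /stores/ structure? (A placeholder rule, not our live config.)

```
# In the configuration serving stores.websitename.com:
# 301 every subdomain URL to its new subfolder equivalent
RewriteEngine On
RewriteCond %{HTTP_HOST} ^stores\.websitename\.com$ [NC]
RewriteRule ^(.*)$ https://websitename.com/stores/$1 [R=301,L]
```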
Intermediate & Advanced SEO | SEO.CIC0
-
Is it bad practice to create pages that 404?
We have member pages on our site that are initially empty, until the member does some activity. Currently, since all of these pages are soft 404s, we return a 404 for all these pages and all internal links to them are js links (not links as far as bots are concerned). As soon as the page has content, we switch it to 200 and make the links into regular hrefs. After doing some research, I started thinking that this is not the best way to handle this situation. A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be real links. I'd love to hear input and feedback from fellow Mozzers. What are your thoughts?
Intermediate & Advanced SEO | YairSpolter0
-
Best practices with recurring event listings
On our client's events page there are a few recurring events that each have their own detail page. I'm trying to figure out the best practice for minimising duplicate content. For example, for the Bribie Island Markets that repeat weekly there are 2 (+ more) detailed event pages:
http://www.ourbribie.com/e/bribie-island-markets/1869/2013-12-07/2013-12-07
http://www.ourbribie.com/e/bribie-island-markets/1869/2013-12-14/2013-12-14
While they both contain duplicated content, they're unique in that they display the specific event's date/time. My thinking is that the future events (e.g. 2013-12-14) should have a canonical link to the upcoming/next event (i.e. 2013-12-07). However, this would require constantly updating/changing the canonical links. What's the best way to deal with this from a duplicate content perspective? Any better recommendations?
Intermediate & Advanced SEO | michaelp85
-
Do 404 pages pass link juice? And best practices...
Last year Google said bad links to 404 pages wouldn't hurt your site. Could that still be the case in light of recent Google updates to try and combat spammy links and negative SEO? Can links to 404 pages benefit a website and pass link juice? I'd assume at the very least that any link juice will pass through links FROM the 404 page? Many websites have great 404 pages that get linked to: http://www.opensiteexplorer.org/links?site=http%3A%2F%2Fretardzone.com%2F404 - that was the first of four I checked from the "60 Really Cool...404 Pages" list that actually returned the 404 HTTP status! So apologies if you find the word 'retard' offensive. According to Open Site Explorer it has a decent Page Authority and number of backlinks, but it doesn't show in Google's SERPs. I'd never do it, but if you have a particularly well-linked-to 404 page, is there an argument for giving it a 200 OK status? Finally, what are the best practices regarding 404s and address bar links? For example, if www.examplesite.com/3rwdfs returns a 404 error, should I make that redirect to www.examplesite.com/404 or leave it as is? Redirecting to www.examplesite.com/404 might not be user-friendly, as people won't be able to correct the URL in the address bar. But if I have a great 404 page that people link to, I don't want links going to loads of random pages, do I? Is either way considered best practice? If I did a 301 redirect I guess it would send the wrong signal to the crawlers? Should I use a 302 redirect, or even a 304 Not Modified redirect?
Intermediate & Advanced SEO | Alex-Harford
-
I try to apply best duplicate content practices, but my rankings drop!
Hey,
An audit of a client's site revealed that, due to their shopping cart, all their product pages were being duplicated: http://www.domain.com.au/digital-inverter-generator-3300w/ and http://www.domain.com.au/shop/digital-inverter-generator-3300w/. The easiest solution was to just block all /shop/ pages in Google Webmaster Tools (redirects were not an easy option). This was about 3 months ago, and in months 1 and 2 we undertook some great marketing (soft social bookmarking, updating the page content, Flickr profiles with product images, product manuals onto SlideShare, etc.). Rankings went up and so did traffic. In month 3, the changes in robots.txt finally hit and rankings decreased quite steadily over the last 3 weeks. I'm so tempted to take off the robots restriction on the duplicate content... I know I shouldn't, but it was working so well without it. Ideas, suggestions?
Intermediate & Advanced SEO | LukeyJamo0