Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Best Practice for Deleting Pages
-
What is the best SEO practice for deleting pages? We have a section in our website with Employee bios, and when the employee leaves we need to remove their page.
How should we do this?
-
What we decided on was a process for each deletion, along with a 301 redirect that sends any missing or incorrect bio URL to our bios home page.
First, we remove the bio from the XML sitemap and resubmit the sitemap.
Second, we wait a couple of days and then delete the actual bio page from our site.
So far this seems to be going alright.
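The first step of that process can be sketched in a few lines. This is a minimal illustration only (the sitemap structure follows the standard sitemaps.org namespace; the file contents and URLs are hypothetical):

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def remove_from_sitemap(sitemap_xml, url_to_remove):
    """Return the sitemap XML with the matching <url> entry dropped."""
    ET.register_namespace("", NS)  # keep the default namespace on output
    root = ET.fromstring(sitemap_xml)
    for url in list(root.findall(f"{{{NS}}}url")):
        loc = url.find(f"{{{NS}}}loc")
        if loc is not None and loc.text.strip() == url_to_remove:
            root.remove(url)
    return ET.tostring(root, encoding="unicode")
```

After resubmitting the trimmed sitemap and waiting a few days, the bio page itself can be deleted and the 301 put in place.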
-
Brent, we'd be interested to hear what you chose to do in the case of the employee bios, and whether you encountered anything unexpected.
-
I'd just 301 it to your homepage; I seriously doubt it would be worth the effort to do anything else unless the employee was famous and attracting links from all around the web.
If you must, you could always do what others have suggested: write a nice, content-rich "no longer working with us" page and 301 all past employees' pages to it.
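As a sketch, the "former employees" approach boils down to a lookup in whatever layer serves your redirects. The paths and page names here are made up purely for illustration:

```python
ACTIVE_BIOS = {"/bios/jane-doe", "/bios/john-smith"}  # hypothetical current staff
FORMER_PAGE = "/bios/former-team/"                    # hypothetical shared landing page

def resolve_bio(path):
    """Serve live bios normally; 301 any removed bio to the shared page."""
    if path in ACTIVE_BIOS:
        return (200, path)
    if path.startswith("/bios/"):
        return (301, FORMER_PAGE)
    return (404, None)
```

Any inbound links to departed employees then land somewhere useful instead of a dead end.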
-
I agree that it would bring little or no gain in terms of SEO; however, for usability and customer friendliness it should be a win-win.
Without our principles, where would our industry be?

-
I think that's a great idea! Having a custom 404 for deleted employees would be great for branding purposes and general web 2.0 friendliness - I'm sure SEOmoz would agree.
From a strictly SEO point of view, though, removing the content and replacing it with 404-esque material wouldn't help. My comment is pretty much a moot point anyway, given that there is almost certainly no SEO value on this page. But I guess I'm just a principles kind of guy.
-
Why don't you 301 to either the main bio entry page or a page for deleted employees (kinda like a custom 404) and update your sitemap? That way no benefit is lost, and anyone landing on the page from, say, an external link won't get frustrated.
-
This is why I suggested Google Webmaster Tools.
Bing has a similar tool as well.
-
Any negatives to using a 301 on something like this?
-
Hey Brent,
Bing and Google won't see a 404 if you redirect. There also wouldn't be an issue with duplicate content - what exactly are you referring to here?
Speaking of 404s... your avatar is doing one.
-
I would rather delete the page, but I just hate having Google/Bing see 404s for a while. I would redirect, but I don't want duplicate content pages.
-
There's always a way. Perhaps I would unlink it from the employee bios and whack a noindex,follow meta tag on it, so it still passes rank if it was being linked to. That way users would never find it.
But more often than not I would just 301 unless for some reason there was a bunch of PageRank that would get lost in a redirect to an irrelevant(ish) page.
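For reference, the tag being described above is the standard robots meta tag, placed in the page's `<head>`:

```html
<!-- keep the page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex,follow">
```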
-
Normally I would agree, Nick, but he already stated the employees have left the company; leaving content about them on the site is not proper business.
-
It sounds like it's just a simple case of deletion? In that case, set up a 301 redirect pointing to the employee bios 'home' page. That way any links that were pointing to the removed page will have their 'juice' passed to a page that does exist. Granted, with the content not being the same, the amount of PageRank passed is dubious, but it's still worth doing.
If you do a 301 then you wouldn't have to worry about updating HTML sitemaps. But Bing does openly admit that it hates untidy XML sitemaps (i.e. URLs that return 301s, 302s, 404s, etc.), so I would clean that up, and probably do the same for Google while I'm at it.
Personally, as an SEO (with varying degrees of tunnel vision) I wouldn't want to ever delete content.
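A sketch of the sitemap clean-up mentioned above, assuming you already have each sitemap URL's current HTTP status from a crawl (the statuses shown are illustrative, not real data):

```python
def clean_sitemap_urls(url_statuses):
    """Keep only URLs that resolve 200; drop redirects and errors,
    which the engines prefer not to see in a sitemap."""
    return [url for url, status in url_statuses.items() if status == 200]
```

The surviving list is what you'd regenerate the XML sitemap from before resubmitting.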
-
Remove it and also update your sitemap to reflect the change. I know Google Webmaster Tools will allow you to block certain pages from being crawled.