Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
What to do with removed pages and 404 errors
-
I recently removed about 600 'thin' pages from my site, which are now showing as 404 errors in WMT as expected. As I understand it, I should just let these pages 404 and eventually they'll be dropped from the index. There are no inbound links pointing at them, so I don't need to 301 them. They keep appearing in WMT as 404s, though, so should I just 'mark as fixed' until they stop appearing? Is there any other action I need to take?
-
If they are truly gone, then a 410 would be the best option for you. Since they are indexed, even if there are no links pointing at them, people can still find them based on what they are searching for. You never know when one of these URLs will show up in results, because you don't know how long Google will take to drop them.
http://www.checkupdown.com/status/E410.html
"The 410 error is primarily intended to assist the task of Web maintenance by notifying the client system that the resource is intentionally unavailable and that the Web server wants remote links to the URL to be removed. Such an event is common for URLs which are effectively dead i.e. were deliberately time-limited or simply orphaned. The Web server has complete discretion as to how long it provides the 410 error before switching to another error such as 404"
We did this for a client that needed old, defunct pages removed. Once you set the pages to return a 410 and use Google's URL removal tool, you should see them dropping off really quickly (all of ours were gone within a month). Having that many pages return a 404 may be hurting your users' experience: when they see a 404, they go right for the back button.
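The 404-vs-410 decision boils down to a simple lookup against the set of deliberately removed URLs. Here's a minimal sketch in Python of that logic (the paths in `removed` are hypothetical examples, not from the original question):

```python
def status_for(path, removed_paths):
    """Return 410 Gone for deliberately removed pages, 200 otherwise.

    410 signals the page is intentionally and permanently gone,
    a stronger hint to crawlers than a generic 404.
    """
    if path in removed_paths:
        return 410
    return 200

# Hypothetical examples of removed thin-page paths
removed = {"/old-thin-page", "/another-thin-page"}

print(status_for("/old-thin-page", removed))  # 410
print(status_for("/contact", removed))        # 200
```

In practice you'd express the same rule in your web server config (e.g. a "gone" directive per removed path) rather than in application code, but the lookup is the same.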
-
410 is the recommended way to tell search engines the page is gone. All of the things mentioned above are facets of how you should deal with this issue. Sorry for the brevity and terrible punctuation; the Moz forum is a pretty iffy thing via mobile. My eggs are getting cold.
-
Hi!
The reason these pages keep popping up in WMT is that they have already been indexed. You could try to remove them from Google's index by using the removal tool in WMT (https://www.google.com/webmasters/tools/url-removal) or by setting up 301 redirects from them to more suitable pages.
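If you do go the 301 route, it's essentially a mapping from each old URL to its replacement; anything not in the map falls through to your 404/410 handling. A small sketch, with hypothetical paths:

```python
# Hypothetical mapping of removed thin pages to their best replacements
REDIRECTS = {
    "/thin-page-a": "/better-page",
    "/thin-page-b": "/",  # no good match; fall back to the homepage
}

def redirect_target(path):
    """Return the 301 destination for a removed page, or None to let it 404/410."""
    return REDIRECTS.get(path)

print(redirect_target("/thin-page-a"))  # /better-page
print(redirect_target("/unrelated"))    # None
```

Note the original poster says there are no inbound links, so 301s are optional here; this only matters if you want to preserve any residual traffic to the old URLs.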
Hope this helps
Anders -
Hi,
I would look at this from two perspectives.
1. These thin pages could have been beefed up with some unique content, or at least the content could have been rewritten to make them unique. Personally, I prefer to make duplicate pages unique instead of deleting them. This, of course, depends on the number of pages and the level of duplication.
2. Now that these pages have been removed from the website, you should remove all links to them from places like sitemaps and internal navigation, so that search engines do not follow a link that ends in a 404. You should also check whether any references to these pages remain on third-party web properties.
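The sitemap part of that cleanup is easy to check mechanically: parse the sitemap and intersect its `<loc>` entries with the list of removed URLs. A sketch, assuming a standard XML sitemap and a hypothetical removed-URL set:

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace from the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def stale_sitemap_entries(sitemap_xml, removed_urls):
    """Return removed URLs that are still listed in the sitemap XML string."""
    root = ET.fromstring(sitemap_xml)
    listed = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}
    return listed & removed_urls

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/thin-page-a</loc></url>
  <url><loc>https://example.com/keep-this-page</loc></url>
</urlset>"""

print(stale_sitemap_entries(sitemap, {"https://example.com/thin-page-a"}))
```

Any URL this reports should be dropped from the sitemap so crawlers stop being invited to the dead pages.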
Best regards,
Devanur Rafi