Thoughts about stub pages - 200 & noindex ok, or 404?
-
With large database/template-driven websites it is often possible to get a lot of pages with no content on them.
What are the current thoughts regarding these pages with no content? Options:
- Return a 200 header code with a noindex meta tag
- Return a 404 page & header code
- Something else?
Thanks
-
-
I would agree with all the comments on how to technically deal with the random pages, but it is a losing battle until you get your website database/templates under control. I once had a similar issue and spent months getting a solution in place, as the website would create all kinds of problems like this.
We had to implement a system to minimize the creation of these pages. The key is to make sure that any request for a random, non-existent page gets a 404 from the start, so the URL never gets indexed in the first place.
That said, for the random URLs that are already indexed, I like the 200 option with the noindex meta tag. My reasons: with 404s you get a pile of meaningless error messages in Google Webmaster Tools, whereas the noindex still gets the page out of the index. I have also seen Google retry 404s on one of our sites, which is crazy. And ever since Google started reporting soft 404s for 301s that redirect many pages to a single URL, I only use 301s on more of a one-to-one basis.
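For the 200-plus-noindex option, something along these lines in the page-serving layer would do it. This is only a rough sketch assuming a Java servlet stack; the filter name and the isStubPage() check are placeholders for whatever your platform uses, not code from any real site:

```java
import java.io.IOException;
import javax.servlet.*;
import javax.servlet.http.*;

// Hypothetical filter: stub pages still return 200, but crawlers are told
// not to index them via the X-Robots-Tag response header (equivalent to a
// <meta name="robots" content="noindex, follow"> tag in the page head).
public class StubPageNoindexFilter implements Filter {

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest req = (HttpServletRequest) request;
        HttpServletResponse resp = (HttpServletResponse) response;
        if (isStubPage(req)) {
            resp.setHeader("X-Robots-Tag", "noindex, follow");
        }
        chain.doFilter(request, response);
    }

    // Placeholder: real logic would check whether the requested slug
    // actually exists in the product database.
    private boolean isStubPage(HttpServletRequest req) {
        return false;
    }

    @Override public void init(FilterConfig config) {}
    @Override public void destroy() {}
}
```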
Good luck.
- (6 months earlier)
-
OK, I understand better now. I have the same problem with a site in Drupal; I think it is better to use robots.txt to block the empty pages.
That is because the link juice those pages pass is minimal, and they use extra resources on the server.
If you can't block them with robots.txt, the noindex, follow meta tag is OK. But if you see in Analytics that some landing pages look like www.example.com/product/{random_text_here}, it is better to use a 404 with a 301 redirect to the sitemap, for the sake of user experience.
-
Thanks for the info.
For more information, let me try and explain the scenario a little better.
When using a template to generate all the product pages on a site, these are often designed so that any URL of the form "www.example.com/product/{something}" maps to a script such as "GenerateProductPage.java", usually via a rule that anything in the /product/ directory is handled there (or .asp etc., depending on the language being used).
On the site there are only ever links to the actual products stored in the DB, so for users there are no issues there.
But Google manages to find all manner of strange URLs, and since they take the form "www.example.com/product/{random_text_here}", each one will also 'try' to generate a product page. Since there is no actual product in the database called 'random_text_here', the result is an empty product page with nothing on it except the template navigation, footer links, menus, etc.
We currently are doing as you mentioned, by "noindex, follow" the pages for the same reasons you listed.
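To make the two options concrete, the handler looks roughly like the sketch below. This is simplified and only illustrative: GenerateProductPage, Product and ProductDao are stand-in names for whatever the platform actually uses, not our real code:

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.*;

// Handles every URL of the form /product/{something}.
@WebServlet("/product/*")
public class GenerateProductPage extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String slug = (req.getPathInfo() == null) ? "" : req.getPathInfo().substring(1);
        Product product = ProductDao.findBySlug(slug); // hypothetical DB lookup

        if (product == null) {
            // Option A (what we do now): serve the empty template with a 200
            // and a noindex, follow signal so the stub stays out of the index.
            resp.setHeader("X-Robots-Tag", "noindex, follow");
            req.getRequestDispatcher("/WEB-INF/templates/emptyProduct.jsp").forward(req, resp);

            // Option B (the alternative in question): return a real 404 instead.
            // resp.sendError(HttpServletResponse.SC_NOT_FOUND);
            return;
        }

        req.setAttribute("product", product);
        req.getRequestDispatcher("/WEB-INF/templates/product.jsp").forward(req, resp);
    }
}
```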
So the question was: is this OK to do? Is it bad to do (and if so, why)? Is there any harm in doing things the current way? Should we be 404'ing the pages instead (and what value would that have over the other methods)? Etc.
Thanks for your input, Carlo; it shows your thoughts are along the same lines as ours.
Has anyone else got anything to add to the information provided?
Thanks
-
Hi, hmm, I'm not really sure I understand why you have invalid pages. Options:
- Products that are out of stock
- Pages built from another database
If you have a product page without content, a noindex, follow meta tag is better, because it still passes link juice.
But like I say, I don't know why these products exist. If you have more info I can help more.
-
Thanks for the response.
I guess what I was getting at with the question is that when websites are built on flexible platforms, they can easily create these pages automatically.
For example, if there were flexible URLs in place whereby URLs such as www.example.com/product/{product_name} all map to one script that generates a product page.
So www.example.com/product/{invalid_product_name} would also work and essentially show a blank product page.
The question being: what is the best way to handle these for Google, and is there any benefit or harm from either of the methods outlined in the original question?
Has anyone else any thoughts on best ways to handle these scenarios?
Thanks
-
If you know that a page doesn't have content, I recommend:
- A page without content should respond with a 404.
- If the page returns a 404, make a 301 to the sitemap.
- On the sitemap, use a noindex, follow meta tag to transfer the link juice.
- Eventually you need to clean up these pages, because they are bad for users and SEO.
Regards