Is it bad practice to create pages that 404?
-
We have member pages on our site that start out empty until the member does some activity. Since these pages would otherwise be soft 404s, we currently return a hard 404 for all of them, and all internal links to them are JS links (i.e., not links as far as bots are concerned). As soon as a page has content, we switch it to a 200 and turn the links into regular hrefs.
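To illustrate, here is a minimal sketch of the switching logic we use (all names are hypothetical, not our actual code, and a real app would do this inside its web framework):

```python
# Hypothetical profile store; the real site would query a database.
PROFILES = {
    "alice": "Joined in 2013. Active in three forums.",
    "bob": "",  # signed up but never did anything
}

def member_response(username):
    """Return an (http_status, body) pair for a member profile URL."""
    content = PROFILES.get(username, "")
    if not content:
        # Empty or unknown profile: serve a hard 404 rather than a
        # thin page that Google would flag as a soft 404.
        return 404, "Not Found"
    # Populated profile: serve it normally with a 200.
    return 200, content
```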
After doing some research, I started thinking that this is not the best way to handle this situation. A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be real links.
I'd love to hear input and feedback from fellow Mozzers. What are your thoughts?
-
Yair,
See the infographic on this page regarding rel="nofollow" tags in links and when you may want to consider using them. Specifically, see the part about User Generated Content:
http://searchengineland.com/infographic-nofollow-tag-172157
However, Google can decide to crawl whatever it wants to crawl, whether that means nofollowed links, links on a page with a nofollow meta tag, or JavaScript links. If you really want to keep Google out of those portions of the site, you should use a robots.txt disallow statement, as I mentioned in your other thread, or use the X-Robots-Tag as described here.
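For what it's worth, here is a rough sketch of attaching an X-Robots-Tag header to a response, assuming a framework that exposes response headers as a dict (the function name and setup are made up for illustration):

```python
def add_robots_headers(headers, indexable):
    """Attach crawler directives to a response's header dict."""
    if not indexable:
        # "noindex, follow": keep the page out of the index but let
        # crawlers follow its links. This is the header-level
        # equivalent of the robots meta tag, and it also works for
        # non-HTML resources where a meta tag isn't possible.
        headers["X-Robots-Tag"] = "noindex, follow"
    return headers

headers = add_robots_headers({"Content-Type": "text/html"}, indexable=False)
```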
-
Thanks Everett,
As far as I know, nofollows don't conserve crawl budget. The bots will still crawl the link; they just won't pass any PageRank.
-
I'm sure Jane meant that it would block indexation of the page.
In my opinion you should probably noindex,follow (robots meta tag) the pages and make the internal links just normal links, possibly with a rel="nofollow" link attribute, until the user fills out the profile.
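As a sketch of what I mean, the page template could pick its robots meta tag based on whether the profile has content yet, and links could be rendered as plain hrefs (function and variable names here are illustrative, not anyone's real code):

```python
def robots_meta(has_content):
    # Empty profiles: keep them out of the index but let bots
    # follow the links on them.
    directive = "index, follow" if has_content else "noindex, follow"
    return '<meta name="robots" content="{}">'.format(directive)

def profile_link(url, vetted=True):
    # Plain internal link; optionally add rel="nofollow" for
    # unvetted user-generated pages.
    rel = '' if vetted else ' rel="nofollow"'
    return '<a href="{}"{}>profile</a>'.format(url, rel)
```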
I will go look into your other question as well.
PS: The drawback of this solution is that bots will still be spending crawl budget crawling those URLs if you are linking to them internally. [Edited]
-
Thanks for the clear and concise answer, Jane. You hit the nail right on the head! I appreciate your input.
One question, though. You say that noindex will block bot access to these pages. I'm pretty sure the bots will still crawl the pages (if they find them); they just won't be indexed, and presumably they won't be "counted against us" like 404 pages. Is that what you meant?
If you have a minute, maybe you can help me out with this question next: http://moz.com/community/q/internal-nofollows
(Side note: Er_Maqul was referring to the original version of the question (before I edited it) where I had mistakenly written that we nofollow the links.)
-
Hi there,
A 404 certainly isn't the best way to handle a new URL / page before it is populated with content. It is good that Google isn't finding these pages yet (as you state in a later comment), but keep in mind that it could. The pages aren't linked to, but there is never any particular guarantee about what Google will and won't find. It's highly unlikely if you don't link to them, but it's still not worth taking the risk. As you also say, there's no stopping anyone else from linking to the pages, or Google from going on an exploratory mission of its own.
As a note, internal 404s / 410s ("Gone") are perfectly okay if they're appropriate for the situation, i.e. a page has been removed. Not every removed resource has to be 301ed elsewhere. That isn't the case here, though.
To my mind, blocking bot access to these pages while they are empty is a better option, and noindex / follow would achieve this. I believe Er_Maqul has misunderstood what you were saying here - there is no "nofollow" in this situation.
-
Got it, I see.
Well, let's see here. I will state up front that I am no expert in this realm. This is much more of a job for the likes of EGOL or RobertFischer, EGOL in particular with his intimate knowledge of NOINDEX:
http://www.mattcutts.com/blog/google-noindex-behavior/
(Scroll down and look at the first post on Matt Cutts's blog there.)
That being said, I still have a few thoughts.
I think you certainly could continue to do what you are doing. I also think that Er_Maqul brought up a point that I touched on as well: obviously the best scenario is to avoid a 404 and to have original content. Unfortunately, if you don't have any content to write, or the person doesn't in this case, that sets you up for thin content and a lot of duplicate content.
It seems to me there is no way to avoid having the extra pages without some sort of script or coding that houses the profiles somewhere other than on a separate page (or on a flux capacitor somewhere). So going off that, you could create a generic "no profile" page that gets published and use the rel=canonical tag.
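In case it helps, a rough sketch of that canonical idea: empty profiles all point at one shared placeholder page, while real profiles canonicalize to themselves (the URLs and names here are made up for illustration):

```python
# Hypothetical shared placeholder for members with no profile yet.
GENERIC_PROFILE_URL = "https://example.com/members/no-profile-yet"

def canonical_tag(has_content, page_url):
    # Empty profiles canonicalize to the generic placeholder so they
    # consolidate rather than counting as thousands of thin pages;
    # populated profiles canonicalize to their own URL.
    target = page_url if has_content else GENERIC_PROFILE_URL
    return '<link rel="canonical" href="{}">'.format(target)
```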
I take back my prior statement about "ANY" content. Thin, pointless content is thin and pointless and won't benefit you at all. I hope it wasn't interpreted that way.
Again, I think this one is somewhat out of my scope of help, and it might even be worth calling in an SEO professional who specializes in forums for a second opinion. It's like having surgery: you have to go to another doctor to verify your diagnosis.
Sorry I couldn't give a better answer!
-
Wow - thanks for the thorough response, HashtagHustler!
Let me explain a little better...
We get hundreds of signups a day. Each new member has a profile page, which is empty until they do something. Sometimes they never do. So we don't link to the empty pages and they return a 404. As soon as the page has some content, we do link to it and it returns a 200.
Google is not reporting 404s for these pages because they are not linked to. In the past, when we did link to them, Google reported them as soft 404s.
The current system is working fine.
My question is simply whether it makes more sense to let Google find these pages (by linking to them) but noindex them while they have no content (and are considered soft 404s by Google), or to continue doing it the way we do today, which makes me a little uncomfortable since we are creating thousands of pages that return 404s and that theoretically may be linked to by other sites.
-
Yes... any 404 can hurt your SEO campaign. Even if they don't count against you directly (I don't know for sure), at the very least the spider will skip a bit more of your site or crawl it more slowly. Because of this, you should get rid of every 404 error you can.
Think about it: even an empty page can have something on it. You can use default text, use a noindex tag while the page is empty, or simply make a standard profile page and point all the empty pages at it with rel=canonical (I think that's the best option). Having more pages, even with a small amount of text, is better than not having the pages at all, and much better than having 404 errors.
Also, consider one more thing: the more changes your site has, the more it gets indexed and the shorter the time between crawls. That's another reason to have the pages working even when they don't have any custom info yet. Creating the page is one change, and customizing it later is another.
-
Good Morning Yair!
A 404 is a 404, plain and simple. And if Google was able to report it, then it was able to get to it, which basically means someone else could too. I'm not sure what platform you are currently using, but there are plenty of easy options for a quick fix.
If you wanted a simple option, you could just throw up a 302 if you are planning on putting content up soon. A 302 differs from a 301 because it signals a temporary move: Google has to decide whether to keep the old page indexed, so when you are planning on launching the new page, you might be able to use that to your advantage.
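Sketched out, that might look like the following (the "coming soon" URL and function names are made up for illustration; a real app would use its framework's redirect helpers):

```python
# Hypothetical shared "coming soon" page for not-yet-ready profiles.
COMING_SOON_URL = "https://example.com/members/coming-soon"

def profile_redirect(has_content, profile_url):
    """Return (status, headers) for a profile URL."""
    if has_content:
        # Profile is ready: serve it normally.
        return 200, {}
    # 302 = temporary: search engines keep the profile URL on file
    # and re-check it later, rather than permanently transferring
    # signals the way a 301 would.
    return 302, {"Location": COMING_SOON_URL}
```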
A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be follow links.
I was unsure exactly what you meant by this. The only difference between follow and nofollow links is that nofollow tells Google not to let your link juice carry over to the page you are linking to; you are linking to the page without vouching for it. My apologies if you already knew this, but I was slightly confused by your sentence. Google will still visit that page to check it out.
Another option: change your schedule and don't put up pages that far in advance of when you plan to publish them. I edit everything offline. Google likes new stuff, especially content. Of course they take longevity into account as well, but as far as making a big splash goes, putting a website up piecemeal is like having people show up at your birthday and say "Surprise!" when you answer the door, when you were the one who invited them. The notion is nice, but it just doesn't have the same effect. That goes not only for Google, but even more so for social media.
My favorite option, and my personal recommendation, would be to play the cards you were dealt. Get rid of the 404/200 switching and embrace that you have a new page! Go on and write a profile piece for the member: some sort of biographical data that can act as a placeholder. At this point it doesn't even need to be stuffed with keywords and amazing SEO phrases.
On second thought: I'm not sure exactly what your forum is for, and whether this issue is specific to a few members or you are referring to bulk membership, but I have a few ideas on how you might be able to extrapolate some of the signup data into even a simple post to avoid getting a 404! Even if you parsed some of the fields from signup and created a simple little one-page profile to display, I think that could help.
At the end of the day, Google loves forums that are strong and authoritative. They also understand that every single person on a forum isn't going to post and isn't going to interact. So depending on what kind of forum you have, and what exactly you are doing, some of the forum issues will just have to be accepted. I think it would be more valuable to clean up negative linking, analyze internal issues, build forum recognition, etc. than to fix soft 404s coming from a handful of users. Again, all of this depends on the size of your pool.
Also, you could just track their usage. If someone is logging in all the time and not posting, then fine, I would live with it, because who knows what they are doing in the real world. If someone made an account in 1999 and then threw their computer out the window in Y2K and never bought another one because they still believe everything crashed, well, then maybe it's time to fix that 404.
Hope that helps!
Sorry if my train of thought is a little off this morning.... not enough coffee!
-
Thanks for your quick response, Er.
You are correct about the 404s and I realized that what I wrote in the question was a mistake. We don't have any internal links to these pages (not even nofollow). Until there is content on the page, we make all links to the page into js links. I corrected this in the question now.
Concerning what you said about the pages being useful for SEO even without content: I don't think this is correct. Before we started 404ing the empty profile pages, Webmaster Tools was reporting them as soft 404s. Doesn't this mean that they were hurting us (especially since we have many of them)?
-
Any 404 is bad for SEO. Make a little page for the profile even without data; you can gain SEO value even if your pages don't have much content.
Even if a link has a nofollow on it, Google will follow it to see what is on the other side. For this reason, ALWAYS avoid linking to a 404 page whenever you can.