Why would changing 404 pages increase traffic by 9%?
-
Neil Patel claimed in this article that by creating a custom 404 page that links out to 25 to 50 random internal pages on the website, he was able to increase TechCrunch's traffic by 9%.
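As described, the widget boils down to sampling a few dozen URLs from the site's full set of internal pages each time the 404 template renders. A minimal Python sketch of that idea, assuming the site can supply a list of its internal URLs (the `site_urls` list below is a made-up stand-in, e.g. something you'd load from a sitemap):

```python
import random

def random_internal_links(all_urls, n_min=25, n_max=50):
    """Pick a random sample of internal URLs to render as links on a 404 page."""
    # Never ask for more links than the site actually has.
    n = min(len(all_urls), random.randint(n_min, n_max))
    return random.sample(all_urls, n)  # sample() returns unique picks

# Hypothetical URL inventory, e.g. parsed from the sitemap.
site_urls = [f"/articles/{i}" for i in range(1000)]

links = random_internal_links(site_urls)
```

Note that because the sample changes on every request, each crawler visit to a 404 URL sees a different set of internal links, which is presumably the mechanism behind the claimed discovery/indexing boost.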
I'm a bit skeptical about this claim. A couple of questions:
- Is this theory sound? If you've personally tried this or have read other articles supporting Neil, I'd love to learn more.
- Would a big site like TechCrunch really have problems with Google not indexing all of its pages?
- Also, does getting more pages crawled help you get more traffic? Specifically, would it help a site like mine? For reference, my site has averaged 12,040 pages crawled per day over the last 90 days. Currently 28,922 pages are indexed.
- Are there any possible downsides to trying this?
Thanks!
-
I agree with you.
It was a chest-thumping article. It simply crows about rigging a fix for a historic bad practice. It doesn't explain what caused the problem, and the random-links solution isn't the best way to handle 404 traffic and probably not the best way to repair site structure problems.
-
Hi Robert! Nice seeing you again.
Yeah, it looks like Neil is pushing his Quicksprout Traffic University very aggressively. Have you heard either good or bad things about that course? It has a money-back guarantee, so there's no financial risk. But if the advice is bad, the damage could still be severe.
-
The article claims that the traffic bump came from increased indexing of TechCrunch after Neil added a widget that links out to 25-50 random internal pages on the 404 error page.
But your explanation makes more sense, and that means this article is kinda misleading. Most small-to-mid-sized sites don't have problems with systematically deleted pages, so including this tip in an article aimed at webmasters of small-to-mid-sized sites seems out of place.
-
I feel the same way about the ads. They are like a bad dog that rushes at you when you approach the property line and chases you down the street. Worse even than Forbes - and they are really bad IMO.
-
Egol's answer is really well thought out. One thing that really surprised me about Neil's article was the huge slap-in-the-face ads. I say no to the first one and it's as if I am assaulted over and over. I can't stand that kind of experience, and I actually think Neil is quite bright. Bummer.
Best
-
You can do this and it might be helpful. But I am betting that TechCrunch had big problems from tons of systematically deleted pages.
So, Neil Patel did not really "create" this traffic, he simply "salvaged" it... grabbed it before it went down the drain as a result of sloppy work by TechCrunch... and like most "salvaged goods" it was probably low-quality traffic after being 404ed and disappointed.