Webmaster tools 404
-
Hey,
I'm getting a soft 404 error on a webpage that has content and is definitely not a 404.
We've redirected a load of URLs to the webpage. The URL has parameters that were used before the redirect but are no longer used by the new URL; these parameters have been carried over in the redirect.
Is this what's causing the soft 404 error, or is there another problem that may need addressing?
A canonical has also been set on the webpage.
Thanks, Luke.
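If the legacy parameters serve no purpose on the new URL, one option is to strip them when building the redirect target, so every redirected request resolves to a single clean URL. A minimal sketch of that idea in Python; the parameter names (`sessionid`, `ref`, `old_id`) and the URLs are hypothetical stand-ins, not the actual ones in question:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical legacy parameters carried over by the redirect;
# the real names depend on the old site's URL scheme.
LEGACY_PARAMS = {"sessionid", "ref", "old_id"}

def redirect_target(old_url: str, new_base: str) -> str:
    """Build a 301 target on new_base, dropping legacy query parameters
    so all redirected URLs collapse onto one clean destination."""
    parts = urlsplit(old_url)
    # Keep only query pairs whose key is not a known legacy parameter.
    kept = [
        pair for pair in parts.query.split("&")
        if pair and pair.split("=", 1)[0] not in LEGACY_PARAMS
    ]
    new_parts = urlsplit(new_base)
    return urlunsplit((new_parts.scheme, new_parts.netloc,
                       new_parts.path, "&".join(kept), ""))
```

Whether this explains the soft 404 depends on what Google sees at the final URL; the sketch only illustrates how carried-over parameters could be dropped at redirect time.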
-
Hi Luke,
I think I'm going to need a little more information to fully understand your question. Could you give us the URL in question?
Best,
Kristina
Related Questions
-
404 Error Complications
Hello Moz World! I am receiving a 404 error on one of my webpages. When I directly input the URL into my address bar I receive a 404 error. However, when I am on my website and link over to the broken webpage from my website, I do not receive an error. The page will show up with no issues, and the address in the URL bar is the address that is receiving the 404 error, i.e. www.mywebsite.com/services. Does anyone know how I should go about troubleshooting this issue? Any suggestions on how I can resolve this? To me, I would think that if the link is not broken when being directed from the website, it shouldn't be broken when entering the URL directly into the address bar. Right? Any info/advice is appreciated. B/R Will
Intermediate & Advanced SEO | | MarketingChimp100 -
Is it bad practice to create pages that 404?
We have member pages on our site that are initially empty, until the member does some activity. Currently, since all of these pages would be soft 404s, we return a 404 for them, and all internal links to them are JS links (not links as far as bots are concerned). As soon as the page has content, we switch it to 200 and make the links into regular hrefs. After doing some research, I started thinking that this is not the best way to handle this situation. A better idea would be to noindex/follow the pages (before they have content) and let the links to these pages be real links. I'd love to hear input and feedback from fellow Mozzers. What are your thoughts?
Intermediate & Advanced SEO | | YairSpolter0 -
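The noindex/follow alternative described in the question above can be sketched as a small response-selection helper. This is an illustrative sketch, not the poster's actual implementation; the `X-Robots-Tag` response header is one standard way to express noindex without editing the page template:

```python
def member_page_response(has_content: bool) -> tuple[int, dict]:
    """Choose an HTTP status and extra headers for a member page.
    Empty profiles get 200 + noindex,follow instead of a 404,
    so internal links to them can stay plain <a href> links."""
    if has_content:
        # Indexable page: no robots restriction needed.
        return 200, {}
    # Page exists but is thin: keep it out of the index without
    # hiding the links that point through it.
    return 200, {"X-Robots-Tag": "noindex, follow"}
```

Once the member adds content, the same helper naturally drops the header, which mirrors the "switch it to 200" step in the question.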
Using Webmaster Tools to Redirect Domain to Specific Page on Another Domain
Hey Everyone, we redirected an entire domain to a specific URL on another domain (not the homepage). We used a 301 Redirect, but I'm also wondering if I should use the Google Webmaster Tools "Change of Address" section to redirect. There is no option to redirect the old domain to the specific URL on the new domain within the "Change of Address" section. Thoughts?
Intermediate & Advanced SEO | | M_D_Golden_Peak0 -
Limit on Google Removal Tool?
I'm dealing with thousands of duplicate URLs caused by the CMS... So I am using some automation to get through them - What is the daily limit? Weekly? Monthly? Any ideas? Thanks, Ben
Intermediate & Advanced SEO | | bjs20100 -
Disavow Tool - WWW or Not?
Hi All, Just a quick question ... A shady domain linking to my website is indexed in Google for both example.com and www.example.com. If I want to disavow the entire domain, do I need to submit both: domain:www.example.com domain:example.com or just: domain:example.com Cheers!
Intermediate & Advanced SEO | | Carlos-R0 -
Why is Google Webmaster Tools reporting a massive increase in 404s?
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. With the site transition, we broke some of the old URLs, but it didn't seem to be of too much concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much duplicate data and used canonical tags as far as I could (which is still an ongoing process), and simply returned 404 for any others that should never really have been there. For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with any updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 404s per day (making it impossible for me to keep up). The sitemap.xml has new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of 404s (in WMT) don't exist any longer. They are all coming from the old site. I attached a screenshot showing the drastic increase in 404s. What could possibly cause this problem? wmt-massive-404s.png
Intermediate & Advanced SEO | | sonetseo0 -
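When the reported 404s all trace back to the pre-launch site, one way to keep on top of them at scale is to diff the old and new sitemaps and see exactly which URLs still need a 301 (or an intentional 404/410). A rough sketch, assuming both sitemap files are available as XML strings; the example URLs are made up:

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> set[str]:
    """Extract all <loc> values from a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")}

def stale_urls(old_sitemap: str, new_sitemap: str) -> set[str]:
    """URLs present in the old sitemap but missing from the new one:
    the candidates Google may still be crawling and reporting as 404s."""
    return sitemap_urls(old_sitemap) - sitemap_urls(new_sitemap)
```

Feeding the resulting set into a redirect map (or confirming each URL is an intentional 404) makes a 25,000-per-day report tractable without triaging entries one at a time in WMT.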
Google Disavow Tool - Waste of Time
My humble opinion is that Google's disavow tool is an utter waste of your time! My site, http://goo.gl/pdsHs was penalized over a year ago after the SEO we hired used black hat techniques to increase ranking. Ironically, while we had visibility, Google itself had become a customer. (I guess the site was pretty high quality, trustworthy and user friendly enough for Google employees to purchase from.) Soon enough, the message about detecting unnatural links showed up in Webmaster Tools and, as expected, our rankings sank out of view. For a year we contacted webmasters, asking them to remove links pointing back to us (90% didn't respond; the other 10% complied). Work on our site continued, adding high quality, highly relevant unique content.
Intermediate & Advanced SEO | | Prime85
Rankings never recovered, and neither did our traffic or business… Earlier this month, we learned about Google's "link disavow tool" and were excited! We had hoped that by following the cleanup instructions and using the "link disavow tool", we would get a chance at recovery!
We watched Matt Cutts' video, read the various forums/blogs/topics online that were written about it, and then we felt comfortable enough to use it. We went through our backlink profile, determining which links were spammy, seemed to be a result of black hat practices, or were added by a third party possibly interested in our demise, and added them to a .txt file. We submitted the file via the disavow tool and followed with another reconsideration request. The result came a couple of weeks later: the same cookie cutter email in WMT suggesting that there are "unnatural links" to the site. Hope turned to disappointment and frustration. Looks like the big box companies will continue to populate the top 100 results of ANY search; the rest will help Google's shareholders. If your site has gotten into the algorithm's crosshairs, you have a better chance of recovering by changing your URL than by messing around with this useless tool.0 -
Is Google Webmaster tools Accurate?
Is Google Webmaster Tools data completely inaccurate, or am I just missing something? I noticed a recent surge in 404 errors detected 3 days ago (3/6/11) from pages that have not existed since November 2011. They are links to tag and author archives from pages initially indexed in August 2011. We switched to a new site in December 2011 and created 301 redirects from categories that no longer exist to new categories. I am a little perplexed, since the Google sitemap test shows no 404 errors, and neither does the SEOmoz crawl test, yet under GWT site diagnostics these errors, all 125 of them, just showed up. Any thoughts/insights? We've worked hard to ensure a smooth site migration and now we are concerned. -Jason
Intermediate & Advanced SEO | | jimmyjohnson0