Webmaster Tools reporting spurious errors?
-
For the past three or so months, Webmaster Tools has been reporting 404 errors on my pages... The odd thing is that I can't figure out what it is seeing.
Here is an example of a link they claim is a 404
antiquebanknotes/nationalcurrency/rare/1895-Ten-Dollar-Bill.aspx
This is strange because it's a malformed URL. It says it's linked from this page:
http://www.antiquebanknotes.com/antiquebanknotes/rare/1882-twenty-dollar-bill.aspx
Which is a URL that doesn't exist. The repeated "antiquebanknotes" path segment in this URL shouldn't be there.
Can anyone give me an idea what is happening here?
Kind regards,
Greg
-
Hi Dana,
First, thank you for your response! I will change my image tags and add the canonical tag. I seem to be getting 40 to 50 of these 404s a day, and my traffic and rankings have fallen dramatically. Scary stuff.
Kind regards,
Greg
-
Hi Greg,
I have seen this problem rear its ugly head before and most often it was connected to the use of relative instead of absolute URLs. I notice that you are using relative URLs for images, which is very common. I also notice that there is no self-referencing canonical tag on the page.
First, I would try adding a self-referencing canonical tag to this page if possible. Then, resubmit your sitemap to GWT and wait. GWT isn't very quick, so it could take some time to see if anything changes.
Also, I would experiment with changing one page's image URLs to absolute URLs instead of relative. Again, submit to Google and wait. See if the 404 errors being reported (specifically the ones for the page you updated) go away.
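To see why relative links can send a crawler to paths that never existed, here is a small illustration with hypothetical link values, using Python's `urllib.parse.urljoin`, which follows the same RFC 3986 resolution rules that browsers and crawlers use:

```python
from urllib.parse import urljoin

base = "http://www.antiquebanknotes.com/rare/1882-twenty-dollar-bill.aspx"

# A root-relative link (leading slash) always resolves against the domain root:
print(urljoin(base, "/nationalcurrency/rare/1895-ten-dollar-bill.aspx"))
# http://www.antiquebanknotes.com/nationalcurrency/rare/1895-ten-dollar-bill.aspx

# The same link without the leading slash resolves against the current
# *directory*, so the crawler requests a path that never existed:
print(urljoin(base, "antiquebanknotes/nationalcurrency/rare/1895-ten-dollar-bill.aspx"))
# http://www.antiquebanknotes.com/rare/antiquebanknotes/nationalcurrency/rare/1895-ten-dollar-bill.aspx
```

If a page is ever reached or linked under a slightly wrong URL, every relative link on it compounds the error, which is how duplicated path segments like the one Greg is seeing can accumulate.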
How many 404 errors are there? Is it a small handful or thousands? If it's a relatively small number (in proportion to the overall number of pages on your site), I really wouldn't worry about it at all. 404s aren't inherently bad: Google encounters them all the time, and they are the natural result of either a URL with a typo or content that has been removed from a site. I would suggest making a custom 404 page, though, because the ones served up by default by servers are usually pretty scary and unhelpful to your end users.
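Since Greg's URLs end in .aspx, the site presumably runs ASP.NET, where a custom 404 page can be wired up in web.config. A minimal sketch, with hypothetical page names; the NotFound.aspx page should also set Response.StatusCode = 404 itself, so crawlers see a true 404 rather than a soft 200:

```xml
<configuration>
  <system.web>
    <!-- Hypothetical page names; ResponseRewrite serves the friendly page
         in place, without a 302 redirect to a different URL -->
    <customErrors mode="On" redirectMode="ResponseRewrite" defaultRedirect="~/Error.aspx">
      <error statusCode="404" redirect="~/NotFound.aspx" />
    </customErrors>
  </system.web>
</configuration>
```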
Hope that helps a little. These may or may not end up being solutions, but it gives you a place to start. Cheers!
Dana
Related Questions
-
Getting error in webmasters
My site had been running perfectly for the past year... I don't know what happened; now Google is showing an error when I try to use the Fetch option in Webmaster Tools. http://prntscr.com/6mtud5
-
Sitemap issue - Tons of 404 errors
We've recreated a client site in a subdirectory (mysite.com/newsite) of his domain and, when it was ready to go live, added code to the .htaccess file in order to display the revamped website on the main URL. These are the directions that were followed to do this: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory and http://codex.wordpress.org/Moving_WordPress#When_Your_Domain_Name_or_URLs_Change. This has worked perfectly, except that we are now receiving a lot of 404 errors, and I'm wondering if this isn't the root of our evil. This is a self-hosted WordPress website, and we are actively using the WordPress SEO plugin, which creates multiple sitemap files with only 50 links in each. The sitemap_index.xml file tests well in Google Analytics but is pulling a number of links from the subdirectory folder. I'm wondering if it really is the manner in which we made the site live that is our issue, or if there is another problem that I cannot see yet. What is the best way to attack this issue? Any clues? The site in question is www.atozqualityfencing.com https://wordpress.org/plugins/wordpress-seo/
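For what it's worth, the Codex method described above usually leaves a rewrite block like this in the web root's .htaccess (a sketch, assuming a standard Apache setup). If the 404s point at /newsite/... URLs, it is also worth checking that both the WordPress Address and Site Address settings were updated, since the SEO plugin builds its sitemaps from those values:

```apache
# Standard WordPress front-controller rewrite in the web-root .htaccess;
# real files and directories are served as-is, everything else goes to index.php
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
```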
-
How to Remove a website from your Bing Webmaster Tools account
I have a site in Bing Webmaster Tools that I no longer work on. I can't seem to find where to delete this website from my Webmaster Tools account. Does anyone know how? (There doesn't seem to be anything obvious under Bing Help or in a Google search.)
-
Difference between SEOMOZ and Webmaster Tools information
Hello, There is an issue that confuses me, and I thought perhaps you would be able to help me shed some light on it. I have a website that shows 2,549 crawled pages in SEOmoz and 24,542 pages in Webmaster Tools! Obviously there is some technical issue with the site, but my question is: why the vast difference between what the SEOmoz crawl report and the Webmaster Tools report show? Thanks! Guy Cizner
-
What are the best tools for back links?
I am new to SEO; please help me choose the right tools for backlinks. I am thinking of buying Ultimate Demon. Should I buy it or not? I have a range of YouTube videos to rank.
-
Duplicate Page Content Report
In the Crawl Diagnostics Summary, I have 2,000 duplicate page content errors. When I click the link, my WordPress site returns "page not found"; I see the page is not indexed by Google, and I could not find the issue in Google Webmaster Tools. So where does this link come from?
-
What can I do if Google Webmaster Tools doesn't recognize the robots.txt file?
I'm working on a recently hacked site for a client, and in trying to identify exactly how the hack is running, I need to use the Fetch as Googlebot feature in GWT. I'd love to use this, but it thinks the robots.txt is blocking its access, even though the only thing in the robots.txt file is a link to the sitemap. Under the Blocked URLs section of GWT it shows that the robots.txt was last downloaded yesterday, but the information shown is incorrect. Is there a way to force Google to look again?
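For reference, a robots.txt that blocks nothing looks like this (an empty Disallow means "crawl everything"; example.com stands in for the real domain). If GWT still reports it as blocking, waiting for a re-fetch is usually the only option, since Google caches robots.txt and refreshes it on its own schedule:

```txt
# Allow all crawlers everywhere; only the sitemap is declared
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```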
-
404 errors on non-existent URLs
Hey guys and gals, First Moz Q&A for me and really looking forward to being part of the community. I hope this isn't a stupid first question, but I was struggling to find any resource that dealt with the issue and am just looking for some general advice. Basically, a client has raised a problem with 404 error pages, or the lack thereof, on non-existent URLs on their site; let's say for example: 'greatbeachtowels.com/beach-towels/asdfas'. Obviously content never existed on this page, so it's not like you're saying 'hey, sorry, this isn't here anymore'; it's more like 'there was never anything here in the first place'. Currently, in this fictitious example, typing in 'greatbeachtowels.com/beach-towels/asdfas' returns the same content as the 'greatbeachtowels.com/beach-towels' page, which I appreciate isn't ideal. What I was wondering is how far you take this issue. I've seen examples here on the SEOmoz site where you can edit the URI in a similar manner and it returns the same content as the parent page but with the alternate address. Should 404s be added across all folders on a site in a similar way? How often would this scenario be an issue, particularly for internal pages two or three clicks down? I suppose unless someone linked to a page with a misspelled URL... Also, would it be worth placing 301 redirects on a small number of common misspellings or typos, e.g. 'greatbeachtowels.com/beach-towles' to the correct URL, as opposed to just 404s? Many thanks in advance.
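A sketch of that last idea in Apache terms, using the hypothetical paths from the question: a permanent redirect for each known misspelling, with a custom 404 document catching everything else:

```apache
# Permanently redirect a known common misspelling to the real page...
Redirect 301 /beach-towles /beach-towels
# ...and serve a friendly custom page (still with a true 404 status) for the rest
ErrorDocument 404 /custom-404.html
```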