404 (Not Found) errors seem spurious, caused by a "\" added to the URL
-
Hi SEOmoz folks,
We're getting a lot of 404 (not found) errors in our weekly crawl.
However, the weird thing is that the URLs in question all have the same issue.
They are all valid URLs with a backslash ("\") appended. In URL encoding, this shows up as an extra %5C at the end of the URL.
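Just to spell it out with a made-up URL (the real ones look the same): that %5C is simply the percent-encoding of the backslash character, e.g. in JavaScript:
encodeURIComponent('\\');                                    // "%5C"
'http://example.com/some-page/' + encodeURIComponent('\\');  // "http://example.com/some-page/%5C"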
Even weirder, we do not have any such URLs anywhere on our (WordPress-based) site.
Any insight on how to get rid of this issue?
Thanks
-
No, Google Webmaster Tools does not list any errors here.
It's indeed an SEOmoz bug. Ryan, thanks for trying though!
-
My request is for a real link that I can click on to view the page.
In most cases where someone has described an issue to me, a key piece of information was either left out or overlooked. If you cannot share that information, I understand; in the interest of being helpful, I wanted to ask.
It is entirely possible this is a crawler issue, but it is also possible the crawler is functioning perfectly and Google's crawler will produce the same result. That is my concern.
-
Well, actually, I did already. The example I gave above is exactly that, only I replaced the real URL with "URL".
In a bit more detail, the referring page is actually URL1, and this page contains the JavaScript
item = '<a href=\'URL2\'>text</a>';
which produces 404 errors for URL2 in the SEOmoz crawl report.
-
It is entirely possible the issue is with the SEOmoz crawler. I would like to see it improved as well.
I am concerned the root issue may actually be with your site. Would you be willing to share an example of a link that is flagged in your report, along with the referring page?
-
Thanks for the tips. After drilling down on the referrer, this looks like an SEOmoz bug.
We are using a WordPress plugin called "collapsing archives" which creates perfectly legal archive links with a JavaScript snippet like this:
item = '<a href=\'URL\'>text</a>';
As you can see, this is totally legal JavaScript. But it seems SEOmoz is scanning the JavaScript without interpreting it, picking up the backslash that escapes the quotation mark (\') after the URL, and treating it as an additional \ at the end of the URL.
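Here is a rough sketch of what I suspect is happening (made-up URL, and obviously I don't know how the SEOmoz crawler is actually implemented; this is just a guess at the mechanism). A link extractor that scans the raw source, inline JavaScript included, without executing it could keep the escaping backslash as part of the URL:

// A stand-in for the kind of string the plugin writes into the page source (placeholder URL):
const pageSource =
  "item = '<a href=\\'http://example.com/2011/03/\\'>March 2011</a>';";

// A naive href extractor: it tolerates a backslash before the opening quote,
// but happily captures the escaping backslash after the URL.
const hrefPattern = /href=\\?'([^']+)'/g;

let match;
while ((match = hrefPattern.exec(pageSource)) !== null) {
  console.log(match[1]);
  // -> http://example.com/2011/03/\
  // A URL ending in "\" (%5C once percent-encoded), which of course 404s,
  // even though the JavaScript itself is perfectly valid.
}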
Since the plugin is behaving legally and works well, we want to keep using it. What's the chance that SEOmoz will fix the bug?
-
Many people do not realize that when you add the backslash character, you change the URL. You can actually serve a different web page for the URL with the trailing backslash.
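A quick illustration (made-up URL, just to show the two addresses are genuinely different resources as far as HTTP is concerned):

const clean  = new URL('http://example.com/about/');
const broken = new URL('http://example.com/about/%5C');  // same path plus the encoded "\"
console.log(clean.href === broken.href);  // false — a server is free to answer each one differently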
A common cause of the problem is a link pointing to the malformed URL. If you check your weekly crawl report, there will be a column called Referrer. That is the source of the link. Check the referring page and find the link. Fix the link (i.e. remove the trailing backslash) and the problem will go away on the next crawl. Of course, you want to determine how the link appeared and ensure it doesn't happen again.
-
If I had to guess, I'd look into any JavaScript on the page that might be adding, or pointing to, the URL with the backslash.