4XX Errors - Adding %5c%5c to Links
-
Hi all
Hope someone can help me with this.
The internal links on my hubby's business site occasionally break and add %5c%5c%5c endlessly to the end of the URL - like this:
site.com/about/hours-of-operation/\\\\\\\\%
I cannot for the life of me figure out why it is doing this, and while it has happened to me from time to time, I can't recreate it.
My crawl diagnostics here in my SEOmoz campaign show 19-20 URLs doing this - it's nuts.
Any insight?
Thank you!!
Jennifer
~PotPieGirl
-
Both Shane and I looked and neither of us saw any / vs \ issues.
Then, I just took a peek at my source code and look what I saw:
http://screencast.com/t/y02R4RS2L
Think that is it?
Thanks for replying!
Jennifer
-
Sent you a PM, Shane - thanks so much for the offer!
By "occasionally break", I mean that every now and again, any link on the site will freak out and add the %5c jibberish to the end.
Jennifer
-
It would really have to be....
The only reason for a %5C is the use of a backslash - that is actually what it encodes (there's a quick sketch at the end of this reply if you want to check it yourself).
**business site occasionally break**
How do you mean "break"?
My crawl diagnostics here in my SEOmoz campaign show 19-20 URLs doing this - it's nuts.
Is the only issue from this in SEOmoz reports?
If you would like to PM me the site, I can attempt to profile it.
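If you want to see that mapping for yourself, here is a minimal sketch using nothing but Python's standard library - no assumptions about the site itself - showing that a backslash percent-encodes to %5C and that the suffix from the crawl report decodes back to raw backslashes:

```python
from urllib.parse import quote, unquote

# A backslash is a special character in URLs, so it percent-encodes to %5C.
print(quote("\\", safe=""))   # -> %5C

# Decoding the suffix the crawler keeps reporting recovers raw backslashes.
print(unquote("%5c%5c%5c"))   # -> \\\
```

Either direction points at the same culprit: a literal \ has ended up in a link somewhere.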
-
Thanks for replying so quickly, Shane!
I don't believe it's a / vs \ issue. It's a WordPress site. Key pages are in the top nav bar (all URLs correct) and all the sidebar links are 'widgets' created by WordPress.
For some reason I suspect a theme issue, but if I can't recreate the error, I'll have no way of knowing whether changing the theme solves the problem.
The site has been online since 2010 with no issues... this is a new issue (it started in the past couple of months, according to crawl diagnostics).
Thanks!!!
-
Somewhere in your URL coding you have added a backslash instead of a slash, which is not valid.
-
It appears from research that it is actually \ (backslash), not / (slash).
So possibly somewhere in your site you have used \ instead of / - but of course this is just a possibility. %5C is simply the URL-encoded (percent-encoded) form of \, because the backslash is treated as a special character in URLs.
Hope this helps
PS: a quick check at http://www.w3schools.com/tags/ref_urlencode.asp confirms that %5C is the URL encoding for a backslash - it is treated as a special character, so somewhere in your CMS or theme code a backslash has been used instead of a slash. The sketch below is one rough way to track down where.
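Since this came up without access to the site, here is a minimal sketch (Python standard library only; site.com/about/hours-of-operation/ is just the placeholder from the example earlier in the thread) that fetches a page and flags any href or src value containing a raw \ or its encoded form %5C:

```python
import re
import urllib.request

# Placeholder URL - swap in one of the 19-20 pages flagged in the crawl diagnostics.
PAGE = "http://site.com/about/hours-of-operation/"

# Fetch the rendered HTML of the page.
html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="replace")

# Pull every href/src attribute value and flag any that contain a raw backslash
# or its encoded form %5C - either one will show up as a %5c URL to a crawler.
for match in re.finditer(r'(?:href|src)\s*=\s*["\']([^"\']*)["\']', html, re.IGNORECASE):
    link = match.group(1)
    if "\\" in link or "%5c" in link.lower():
        print("suspicious link:", link)
```

Running it over the handful of URLs in the crawl report should point at the template, widget, or plugin that is emitting the bad markup.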