URL Encoding
-
Hi,
SEOmoz has finished crawling the site and surprised me with nearly 4,000 301s, all on my deal pages.
Example of the 301:
As you can see from the URL above, it returns a 404, but the URL is actually sent as below.
For some reason the SEOmoz crawler is converting the = to %3D and reporting it as a 301, even though it returns a 404.
Is this an error on SEOmoz's part, or is there an error on my site?
When I do a Fetch as Googlebot, everything comes back fine with the = sign, and every other tool I have tried is OK too,
so I'm not sure why SEOmoz sees it differently and then records the URL as a 301.
I'm hoping this is just a glitch in the reporting tool, as I've been struggling since a recent site-wide 301.
-
I've kind of answered my own question, to a point.
The URL encoding of an = sign is %3D (no idea why SEOmoz is picking this up on its crawl). The 301 is being done by my .htaccess: because the encoding contains an uppercase D, my lowercasing rule 301s the uppercase version to lowercase, which then lands on a 404 page, which is not good. I have fixed this, but I still wonder why SEOmoz is seeing %3D instead of an = sign.
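For anyone else who runs into this, the equivalence is easy to check with Python's standard library (a quick illustration, not anything specific to Moz's crawler):

```python
from urllib.parse import quote, unquote

# "=" percent-encodes to "%3D"; RFC 3986 recommends uppercase hex digits,
# which is why crawlers tend to emit %3D rather than %3d.
encoded = quote("=", safe="")

# Decoding is case-insensitive: "%3D" and "%3d" represent the same octet,
# so a rewrite rule that lowercases the whole URL and 301s ...%3D to
# ...%3d is redirecting between two equivalent spellings of one URL.
upper_decoded = unquote("%3D")
lower_decoded = unquote("%3d")
```

In other words, the chained 301-to-404 really does come from the blanket lowercasing rule, not from the crawler inventing a different URL.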
Related Questions
-
Unsolved: Ooops. Our crawlers are unable to access that URL
Hello, I have entered my site faroush.com but I got an error:
"Ooops. Our crawlers are unable to access that URL - please check to make sure it is correct."
What is the problem?
Moz Pro | | ssblawton2533
-
Moz shows duplicate content, but URLs are tagged with campaign tags
Crawl Diagnostics shows a lot of pages with duplicate content, but when I check the details, I see that it lists the same page with a campaign tag in the URL, so it's not really another page serving identical content. Is there a way to remove these pages from Crawl Diagnostics?
Moz Pro | | jorisbrabants0
-
Duplicate URLs
A campaign that I ran said that my client's site had some 47,000+ duplicate pages and titles. I was wondering how I could possibly set that many 301 redirects, but a Moz help engineer said it has a lot to do with session IDs. See this set of duplicate URLs:
http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring (clearly the main URL for the page)
http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring?PIPELINE_SESSION_ID=0ac00a2e0ad53eb90cb0b0304d178fc1
http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring?PIPELINE_SESSION_ID=0ac3039d0ad4af2720b3ccd2238547ab
http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring?PIPELINE_SESSION_ID=0ac071ed0ad4af292684b0746931158f
To a crawler, that looks like four different pages, when it's clear that they're actually all different URLs for the same page. I was wondering if some of you, maybe with experience in site architecture, would have insight into how to address this issue? Thanks, Alan
Moz Pro | | AlanJacob
-
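For what it's worth, the usual fix for session-ID duplicates is a rel="canonical" tag on the page (or configuring the parameter in the search engines' webmaster tools), but the underlying idea of normalizing the session parameter away can be sketched in Python; `canonicalize` here is a hypothetical helper, not part of any Moz tooling:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url, drop_params=("PIPELINE_SESSION_ID",)):
    """Remove session/tracking query parameters so that equivalent
    URLs compare equal."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k not in drop_params]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

urls = [
    "http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring",
    "http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring?PIPELINE_SESSION_ID=0ac00a2e0ad53eb90cb0b0304d178fc1",
    "http://www.lumberliquidators.com/ll/c/engineered-hardwood-flooring?PIPELINE_SESSION_ID=0ac3039d0ad4af2720b3ccd2238547ab",
]
# All three normalize to the same canonical URL.
canonical = {canonicalize(u) for u in urls}
```

The same normalization is what a rel="canonical" tag communicates to crawlers declaratively, without needing 47,000 redirects.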
How to fix overly dynamic URLs for a Volusion site?
We're currently getting over 5,439 pages with an 'overly dynamic URL' warning in our Moz scan. The site is run on Volusion. Is there a way to fix this seemingly Volusion-specific error?
Moz Pro | | Brandon_Clay0
-
Noob question: while adding competitors, why can't I add a subdirectory or specific URL?
I tried to add my competitors but failed. It seems only subdomains can be added as competitors, so how can I add a subdirectory or a specific URL as a competitor?
Moz Pro | | lfproseo0
-
Batch lookup of domain authority on a list of URLs?
I found a site that describes how to use Excel to batch-look-up URLs using the SEOmoz API. The only problem is the SEOmoz API times out and returns 1 if I try dragging the formula down the cells, which leaves me copying, waiting 5 seconds, and copying again. This is basically as slow as manually looking up each URL. Does anyone know a workaround?
Moz Pro | | SirSud1
-
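A common workaround for this kind of per-cell timeout is to move the lookups out of Excel into a small script that batches requests and sleeps between calls. Here is a minimal Python sketch; `fetch` is a stand-in for whatever API call you actually make (the names here are illustrative, not the SEOmoz API itself):

```python
import time

def batch_lookup(urls, fetch, batch_size=10, delay=5.0):
    """Chunk `urls`, call `fetch` once per chunk, and pause between
    calls so the API's rate limit is never tripped."""
    results = {}
    for i in range(0, len(urls), batch_size):
        chunk = urls[i:i + batch_size]
        results.update(fetch(chunk))   # fetch returns {url: metric}
        if i + batch_size < len(urls):
            time.sleep(delay)          # wait only between chunks
    return results

# Demo with a stand-in fetch function (no real API call):
fake_fetch = lambda chunk: {u: len(u) for u in chunk}
metrics = batch_lookup(["a.com", "bb.com", "ccc.com"], fake_fetch,
                       batch_size=2, delay=0)
```

The win over the spreadsheet approach is that one call can cover a whole batch, and the script keeps running unattended instead of requiring copy-wait-copy by hand.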
Any tools for scraping blogroll URLs from sites?
This question is entirely in the white-hat realm... Let's say you've encountered a great blog with a strong blogroll of 40 sites. The 40-site blogroll is interesting to you for any number of reasons, from link-building targets to simply subscribing in your feed reader. Right now, it's tedious to extract the URLs from the site. There are some "save all links" tools, but they are also messy. Are there any good tools that will:
a) allow you to grab the blogroll (only) of any site as a list of URLs (yeah, OK, it might not be perfect, since some sites call it "sites I like", etc.)?
b) do the same, but export as OPML so you can subscribe?
Thanks! Scott
Moz Pro | | scottclark0
-
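In the absence of a dedicated tool, a small scraper is not much work. Here is a rough Python sketch using only the standard library; it grabs hrefs inside any element whose class mentions "blogroll", which is a heuristic (as noted above, markup varies from site to site, and it assumes reasonably well-formed HTML):

```python
from html.parser import HTMLParser

class BlogrollParser(HTMLParser):
    """Collect href values that appear inside an element whose
    class attribute mentions 'blogroll'."""
    def __init__(self):
        super().__init__()
        self.depth = 0    # >0 while inside a blogroll container
        self.links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if self.depth > 0:
            self.depth += 1            # track nesting inside the container
            if tag == "a" and "href" in attrs:
                self.links.append(attrs["href"])
        elif "blogroll" in attrs.get("class", "").lower():
            self.depth = 1             # entered the blogroll container

    def handle_endtag(self, tag):
        if self.depth > 0:
            self.depth -= 1            # leave the container at depth 0

# Demo on a fabricated snippet:
demo = ('<ul class="blogroll">'
        '<li><a href="http://a.example/">A</a></li>'
        '<li><a href="http://b.example/">B</a></li>'
        '</ul>'
        '<a href="http://elsewhere.example/">not in the blogroll</a>')
parser = BlogrollParser()
parser.feed(demo)
```

From the resulting list, emitting OPML for a feed reader is a short extra step, since OPML is just XML with one `outline` element per feed.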
Does anyone know what the %5C at the end of a URL is?
I've just had a look at the crawl diagnostics, and my site comes up with duplicate page content and duplicate titles. I noticed that the URLs all have %5C at the end, which I've never seen before. Does anybody know what that means?
Moz Pro | | Greg800
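%5C is the percent-encoding of the backslash character (\); a trailing %5C on crawled URLs commonly comes from links in the page markup that end in a stray backslash. It's easy to confirm with Python's standard library:

```python
from urllib.parse import quote, unquote

decoded = unquote("%5C")        # what %5C stands for: a backslash
encoded = quote("\\", safe="")  # and the reverse direction
```

Finding and fixing the link (or template) that appends the backslash removes the duplicate-URL variants at the source.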