Use of the tilde in URLs
-
I just signed up for SEOMoz and sent my site through the first crawl. I use the tilde in my rewritten URLs. This threw my entire site into the Notices section with 301 (permanent redirect) warnings, since each page is reported as redirecting to the same URL with the literal ~ rather than the encoded %7e.
I find conflicting information on the web: more recent URL guidelines allow the tilde, where older ones didn't.
It would be a huge job to change every page on my site to use an underscore instead of a tilde in the URL. If Google is like SEOMoz and is 301 redirecting every page on the site, then I'll do it, but is it just an SEOMoz thing?
I ran my site through Firebug and all my pages show a 200 response header, not a 301 redirect.
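(For anyone who wants to double-check this outside Firebug, here is a minimal sketch using Python's standard library; the domain and path are made up. urlopen follows redirects automatically, so a changed final URL is what reveals that a redirect happened.)
import urllib.error
import urllib.request

# Both spellings of the same (made-up) address: the literal tilde and
# the %7E-encoded form.
urls = [
    "http://www.example.com/~widgets/page.html",
    "http://www.example.com/%7Ewidgets/page.html",
]

for url in urls:
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            # Redirects are followed automatically, so geturl() shows the
            # final address after any 301/302 hops.
            print(url, "->", resp.status, resp.geturl())
    except urllib.error.HTTPError as err:
        print(url, "->", err.code)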
Thanks for any help you can provide.
-
Thanks for all the advice! I realized that Google doesn't care about the tilde -- or at least is not doing the same redirect as SEOMoz. Recently one of my older sitemaps was flagged by Google with errors because too many of the files were redirecting. All of my sitemaps would be flagged if pages were redirecting on a wide scale.
My pages generally rank in the top 5 in Google and maybe losing the tilde would get me to #1, so I'll keep it in mind for the future. Thanks again for the help.
-
We use tildes pretty heavily on our new site. They seem to be okay with Google. However, I did not want to use them because some foreign keyboards do not include the character... keyboards in Mexico, for example.
So... do folks in Mexico type in our URLs by hand? Probably not common... but it is a potential problem. It is missing from other keyboards as well.
We use the tilde because we think it helps break up words we do not want to be seen as "together" in a string. All of our product URLs have the product name separated by dashes, then a tilde, then the product number. We think it may help Google see the product title as a complete string that does not include the product number. Not sure if it works or not.
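(As an illustration of that pattern, here is a small sketch; the product name, number, and the slug function are made up for the example, not taken from the poster's site.)
import re

def product_slug(name, number):
    # Hyphenate the product name, then append the product number after a
    # tilde so the name and the number read as two separate strings.
    name_part = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    return name_part + "~" + number

# Hypothetical product, purely for illustration:
print(product_slug("Deluxe Widget Pro", "WX-1000"))  # deluxe-widget-pro~WX-1000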
-
Tildes are okay these days, although they used to be classed as 'unsafe'.
http://www.cs.tut.fi/~jkorpela/rfc/2396/full.html#2.3
The tilde was (begrudgingly) added to the unreserved character list a while ago, so Google should treat it fine without encoding.
However, I would still avoid them where you can: leave the old addresses as they are, but from now on use a hyphen (in preference to an underscore, still) instead of a tilde.
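(To illustrate the unreserved-character point, a small sketch; it assumes Python 3.7+, whose urllib.parse follows RFC 3986, and the path is made up.)
from urllib.parse import quote, unquote

# The tilde is an unreserved character, so quote() leaves it alone,
# and %7E decodes back to ~, meaning both spellings identify the same resource.
print(quote("/~widgets/page.html"))      # /~widgets/page.html
print(unquote("/%7Ewidgets/page.html"))  # /~widgets/page.html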
-
Hi,
Since this is really about the way that the tool works, the quickest and most accurate way of getting the correct answer would be to email help@seomoz.org.
That being said, avoiding special characters would be our company's preferred option. This thread from Google Webmaster Central would be worth a read.
Sha
Related Questions
-
Moz-Specific 404 Errors Jumped with URLs that don't exist
Hello, I'm going to try to be as specific as possible concerning this weird issue, but I'd rather not give specific info about the site unless you think it's pertinent. To summarize, we have a website that's owned by a company that is a division of another company. For reference, we'll say that OURSITE.com is owned by COMPANY1, which is owned by AGENCY1.
This morning we got about 7,000 new errors in Moz only (these errors are not in Search Console) for URLs with the company name or the agency name at the end of the URL. So, say one post is OURSITE.com/the-article/. This morning we have an error in Moz for the URLs OURSITE.com/the-article/COMPANY1 and OURSITE.com/the-article/AGENCY1, multiplied across the 7,000+ articles we have created. Every single post ever created is now an error in Moz because of these two URL additions that seem to come out of nowhere.
These URLs are not in our sitemaps and they are not in Google... they simply don't exist, and yet Moz created an error for them. Unless they exist and I don't see them. Obviously there's a link to each company and agency site in the about us section, but that's it.
Moz Pro | CJolicoeur0 -
URL Length Issue
Moz is telling me my URLs are too long. I did a little research and found that URL length is not really a serious problem; in fact, others recommend ignoring it. Even on the Moz blog I found this explanation: "Shorter URLs are generally preferable. You do not need to take this to the extreme, and if your URL is already less than 50-60 characters, do not worry about it at all. But if you have URLs pushing 100+ characters, there's probably an opportunity to rewrite them and gain value. This is not a direct problem with Google or Bing - the search engines can process long URLs without much trouble. The issue, instead, lies with usability and user experience. Shorter URLs are easier to parse, copy and paste, share on social media, and embed, and while these may all add up to a fractional improvement in sharing or amplification, every tweet, like, share, pin, email, and link matters (either directly or, often, indirectly)." And yet, I have these questions: in this case, why do I get this warning telling me the URLs are too long, and what are the best practices for fixing it? Thank you
Moz Pro | Cart_generation1 -
What is the best way to treat URLs ending in /?s=
Hi community, I'm going through the list of crawl errors visible in my Moz dashboard and there are a few URLs ending in /?s= How should I treat these URLs? Redirects? Thanks for any help
Moz Pro | Easigrass0 -
Block Moz (or any other robot) from crawling pages with specific URLs
Hello! Moz reports that my site has around 380 duplicate page content issues. Most of them come from dynamically generated URLs that have specific parameters. I have sorted this out for Google in Webmaster Tools (the new Google Search Console) by blocking the pages with these parameters. However, Moz is still reporting the same number of duplicate content pages and, to stop it, I know I must use robots.txt. The trick is that I don't want to block every page, just the pages with specific parameters. I want to do this because among these 380 pages there are some other pages with no parameters (or different parameters) that I need to take care of. Basically, I need to clean this list to be able to use the feature properly in the future. I have read through the Moz forums and found a few topics related to this, but there is no clear answer on how to block only pages with specific URLs. Therefore, I have done my research and come up with these lines for robots.txt:
User-agent: dotbot
Disallow: /*numberOfStars=0
User-agent: rogerbot
Disallow: /*numberOfStars=0
My questions: 1. Are the above lines correct, and would they block Moz (dotbot and rogerbot) from crawling only pages that have the numberOfStars=0 parameter in their URLs, leaving other pages intact? 2. Do I need an empty line between the two groups (I mean between "Disallow: /*numberOfStars=0" and "User-agent: rogerbot"), or does it even matter? I think this would help many people, as there is no clear answer on how to block crawling only pages with specific URLs. Moreover, this should be valid for any robot out there. Thank you for your help!
Moz Pro | Blacktie -
Add to cart redirect using 302
I am getting a list of crawl errors in Moz because I am using a 302 redirect when people click on an item using the quickview add to cart, e.g. http://copyfaxes.com/cart/quickadd?partno=4061 will redirect them to the viewshoppingcart page. Is this wrong - should this be a 301 redirect? There is no link juice to pass. Thanks
Moz Pro | copyfaxes10 -
What do you use for site audit
What tools do you use for conducting a site audit? I need to do an audit on a site, and the SEOmoz web crawler and on-page optimization will take days, if not a full week, to return any results. In the past I've used other tools that I could run on the fly, and they would return broken links, missing h tags, keyword density, server information, and more. Curious as to what you all use and what you may recommend to use in conjunction with the Moz tools.
Moz Pro | anthonytjm0 -
Batch lookup domain authority on a list of URLs?
I found this site that describes how to use Excel to batch look up URLs using the SEOmoz API. The only problem is that the SEOmoz API times out and returns 1 if I try dragging the formula down the cells, which leaves me copying, waiting 5 seconds, and copying again. This is basically as slow as manually looking up each URL. Does anyone know a workaround?
Moz Pro | SirSud1 -
Does anyone know what the %5C at the end of a URL is?
I've just had a look at the crawl diagnostics and my site comes up with duplicate page content and duplicate titles. I noticed that the URLs all have %5C at the end, which I've never seen before. Does anybody know what that means?
Moz Pro | Greg800