Is using dots in URL path really a problem?
-
We have a couple of pages displaying a dot in the URL path, like
domain.com/mr.smith/widget-mr.smith
It displays fine in Chrome, Firefox, and IE, and for the user it may actually look better than replacing it with _ or -.
Has this ever caused problems for anybody?
Is there any statement from Google about it?
Should I change existing URLs? If so, which other characters can I use in the URL instead of underscore and dash, since in our system dash and underscore are already used for rewriting other characters?
Thanks
-
Hi Andrews,
While the difference between dashes and underscores used to be a big issue a few years back, it's something that seems to hold minimal merit now. The two can be used rather interchangeably without any major impact. This was phased out around the same time as exact-match-domain value, as far too many people were abusing the long-tail dashed-URL method.
-
While I've never come across this exact problem before, I can share with you one of my mantras that applies here:
"If a system (browser, search engine, etc) needs to perform a data re-write, you aren't accessible enough."
Google loves accessibility. It always wants the user to be able to easily access information, and it wants its spiders to be able to easily index and categorize that information. If accessibility factors such as JavaScript-only content or a site's use of Flash can have an impact, then it logically follows that more obvious structural access issues come into play as well.
From a technology standpoint, I can tell you that "." is not traditionally used within a URL path segment: the bare "." and ".." segments carry special meaning in file systems and URL resolution, so your structure is effectively being rewritten in order to display those URLs. It's much like internationalized domain names, such as the Chinese extension .中国, which is basically a visual presentation of the underlying ASCII (Punycode) form xn--fiqs8s. For the sake of accessibility, proper structure formatting, and system practicality, you should avoid non-standard characters such as the "." in your URLs.
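Two quick sketches of what's being described here, using nothing but Python's standard library (my own illustration, not something from Moz or Google): the idna codec shows the re-encoding behind .中国, and urljoin shows why a bare "." carries special meaning inside a URL path.

```python
from urllib.parse import urljoin

# 1. Internationalized domain labels are re-encoded to an ASCII
#    (Punycode) form for the DNS, here via the built-in IDNA codec.
label = "中国"
ascii_form = label.encode("idna")    # what resolvers actually see
print(ascii_form)                    # b'xn--fiqs8s'
print(ascii_form.decode("idna"))     # back to the display form: 中国

# 2. Bare "." and ".." path segments are resolved away when a
#    relative reference is applied to a URL, which is why they are
#    special; dots *inside* a segment (mr.smith) are left alone.
base = "http://domain.com/mr.smith/widget-mr.smith"
print(urljoin(base, "."))            # http://domain.com/mr.smith/
print(urljoin(base, ".."))           # http://domain.com/
```

Note the flip side in the second example: a dot embedded inside a segment like mr.smith survives untouched, which is why browsers display the URLs in the question without complaint.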
-
Hi!
As far as I know, this really isn't a huge problem (I could be mistaken). I guess it depends...
In regards to readability, I prefer using dashes (-), as they tend to be easier to read (underscores may be mistaken for a space). Here's what Matt Cutts had to say about this some years ago: http://www.mattcutts.com/blog/whitehat-seo-tips-for-bloggers/ (and http://www.mattcutts.com/blog/dashes-vs-underscores/)
I believe I have read that Google and other search engines parse URLs like this when looking for semantic meaning:
- /this-is-part-of-a-website-address = this is part of a website address
- /this_is_part_of_a_website_address = thisispartofawebsiteaddress
At least that used to be the case... it may have changed by now (see the sketch below).
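A rough, hypothetical sketch of that historical behavior (the tokenize helper and its separator set are my own illustration, not Google's actual parser):

```python
import re

def tokenize(path):
    # Hypothetical illustration: treat dashes (and dots/slashes) as
    # word separators but leave underscores inside tokens, mirroring
    # the historical behavior described above.
    return [t for t in re.split(r"[-./]", path) if t]

print(tokenize("/this-is-part-of-a-website-address"))
# ['this', 'is', 'part', 'of', 'a', 'website', 'address']

print(tokenize("/this_is_part_of_a_website_address"))
# ['this_is_part_of_a_website_address']
```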
In your example, I would not obsess too much about it, as it gives perfect semantic meaning. Have you considered removing special characters instead of replacing them with a "-"?
Hope this helps.
Best regards,
Anders