Capital Letters in URLs?
-
Having capital letters in URLs is not bad for SEO in itself: Google does not treat it as negative SEO, and it will not affect your rankings on its own. But I recommend using lowercase in URLs because it is user-friendly and search-engine-friendly, and you may run into a duplicate content issue if search engines see upper- and lowercase variations of URLs that all evidently point to the same content. Read Matt Cutts's advice on URLs: http://www.seosean.com/blog/matt-cutts-advice-on-urls-page-names
-
I agree with Neil. It's not bad, just good practice to keep them lowercase so that there's no confusion. Your best bet would be to use a consistent format and mirror it in your canonical URLs so that only that variation gets crawled and indexed.
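To illustrate what "a consistent format" means in practice, here is a small sketch (the function name and example URLs are illustrative) that normalises a URL by lowercasing the parts where case shouldn't matter, while leaving the query string alone, since parameter values can be case-sensitive:

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_url(url: str) -> str:
    """Lowercase the scheme, host, and path of a URL for a consistent canonical form."""
    parts = urlsplit(url)
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),   # hostnames are case-insensitive anyway
        parts.path.lower(),     # paths are case-sensitive, so pick one form
        parts.query,            # left untouched: parameter values may be case-sensitive
        parts.fragment,
    ))

print(lowercase_url("HTTP://Example.com/My-Page?Ref=ABC"))
# -> http://example.com/my-page?Ref=ABC
```

Running a rule like this consistently over internal links and canonical tags is what keeps a single variation in the index.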
-
Whilst it's not necessarily "bad" per se, the implications can be, so this kind of canonicalisation issue needs to be taken care of using URL rewrites and permanent 301 redirects.
Typically, on a Windows-based server (without any URL rewriting), a 200 (OK) status code will be returned for each version regardless of the combination of upper/lower-case letters used - giving search engines duplicate content to index, and others duplicate content to link to. This naturally dilutes rankings and link equity across the two (or more) identical pages.
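On IIS, one common fix is a rewrite rule (via the URL Rewrite module) that 301-redirects any URL containing an uppercase letter to its lowercase equivalent. A minimal web.config sketch, with the rule name being illustrative:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Match any URL containing an upper-case letter
             and 301-redirect it to the lower-cased version -->
        <rule name="LowerCaseRedirect" stopProcessing="true">
          <match url="[A-Z]" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

With this in place, both case variants return a single 200 page, so link equity is consolidated rather than split.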
There is an excellent section on solving canonicalisation issues on Windows IIS servers in this SEOmoz article by Dave Sottimano.
On a Linux server (without any URL rewriting) you will usually get a 200 for the lower-case version and a 404 (Not Found) for versions with upper-case characters. Whilst search engines won't index the 404, you are potentially wasting link equity passed to non-existent pages, and it can be really confusing for users, too.
There is a lot of info around the web about solving Linux canonicalisation issues (here is an article from YouMoz). If your site uses a CMS like Joomla or WordPress, most of these issues are solved by the default .htaccess file, and completely eliminated when you combine this with a well-chosen extension or two.
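On Apache, one way to do the lowercase redirect (a sketch, not a drop-in snippet) uses a RewriteMap. Note that RewriteMap itself must be defined in the main server configuration; only the rule that references it can live in a vhost or .htaccess:

```apache
# In the main server configuration (httpd.conf / vhost) --
# RewriteMap is not permitted inside .htaccess files:
RewriteMap lowercase int:tolower

# In the vhost (or .htaccess, referencing the map defined above):
RewriteEngine On
# If the requested path contains any upper-case letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the lower-cased version of the same path
RewriteRule (.*) ${lowercase:$1} [R=301,L]
```

This turns the 404s for upper-case variants into permanent redirects, so link equity pointed at them is passed to the real page.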
You can help the search engines figure out which version of a page you regard as the original by using the rel="canonical" link element in the HTML head. This consolidates link equity and rankings from duplicate versions onto the main, canonical version.
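For example, every case variant of the page would carry the same element pointing at the lowercase URL (the domain and path here are illustrative):

```html
<!-- Served identically on /My-Page, /MY-PAGE, /my-page, etc. -->
<head>
  <link rel="canonical" href="https://www.example.com/my-page" />
</head>
```

Search engines then treat the lowercase URL as the one to index, even if other variants get crawled or linked to.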