CamelCase vs lowernodash
-
I'm in the process of reviewing on-site URL structure on a few sites, and I've run into something I can't decide between.
I'm forced to choose between two existing formats:
MediaRoom/CaseStudies.aspx (camel case)
mediaroom/casestudies (all lower case, mashed, no dashes)
I would personally rather see:
media-room/case-studies/
However, implementing the dashes would require manually rewriting roughly 10,000 URLs. Implementing 301s from the existing structure to whichever format I choose would be trivial, so there's no concern there.
Given the choice between CamelCase and lower-mashed, which would you choose? Why?
-
I agree with Alan 100%. There are some great options available through IIS rewrites.
-
Lower case.
You will run into canonical problems if you start using caps.
Since the pages are .aspx, the site is running on an IIS server. I would install the IIS URL Rewrite module if it's not already installed; it can force lowercase through a very easy interface.
See here: http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-issues-involving-the-upper-and-lower-case
The module also includes really easy URL rewrite solutions, which are a lot easier than rewriting the actual URLs.
A tip: take a copy of the web.config file afterwards, just in case you overwrite it later and lose your rewrites.
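For reference, the "force lowercase" approach with the URL Rewrite module comes down to a rule like the following in web.config (the rule name here is illustrative; the exact markup the module's wizard emits may differ slightly):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301-redirect any request whose path contains an uppercase
             letter to the all-lowercase version of the same URL -->
        <rule name="ForceLowercase" stopProcessing="true">
          <match url="[A-Z]" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

Because the redirect is permanent (301), any mixed-case links already out in the wild get consolidated onto the lowercase version, which resolves the canonical/duplicate-URL problem without touching the 10,000 existing URLs by hand.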