Are there any benefits to having dashes in file names?
-
Through searching, I can find lots of discussion regarding "dash vs underscore", but am having trouble with an even simpler question:
Is there any SEO difference between using
http://www.broadway.com/shows/milliondollarquartet.php
vs.
http://www.broadway.com/shows/million-dollar-quartet.php
A lot of coders prefer underscores to dashes in filenames, because when you double-click a filename_like_this the whole name is selected, whereas a filename-like-this may only have part of it highlighted. So to me, an underscore means 'treat this like a word' and a dash means 'treat this like a space' - and either is better than %20!
For SEO though, the dash = space behaviour is worth it where:
- the URL string is long (thisisnotaseasytoread in the address bar; this-is-much-easier-to-comprehend). I think that helps users, which in turn signals to Google that you're being helpful
- the concatenation of the words would be confusing to a stemming programme - see http://independentsources.com/2006/07/12/worst-company-urls/ for examples
- someone wants (or has) to manually type a URL - a dash is quicker and easier, as you don't need the shift key (which you do for an underscore)
- someone shares (pastes) a raw, long URL - with dashes in it, you have a chance it will wrap in a blog or wherever. An unbroken string that can't wrap looks dreadful, whereas you might get away with high-on-the-hill-stood-a-lonely-goatherd
- it also helps with link-naming consistency - it's easier to spot your own typos and linking errors (so fewer 404s to hunt down)
Sorry, must dash
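The bullet points above amount to a simple slug recipe: lowercase everything and collapse spaces and punctuation into single dashes. A minimal sketch in Python (a hypothetical helper for illustration, not anything Moz or Google provides):

```python
import re

def slugify(title):
    """Convert a page title into a lowercase, dash-separated URL slug.

    The dash acts as the 'space' of the URL: readable for humans,
    word-delimiting for crawlers, and no shift key needed to type it.
    """
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse runs of spaces/punctuation into one dash
    return slug.strip("-")                   # drop any leading/trailing dashes

print(slugify("Million Dollar Quartet"))  # million-dollar-quartet
```

Feeding the show title through a helper like this yields the dash-separated form the answers below recommend.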
-
-
From what I've come to understand, most search engines treat dashes as spaces, so they help delineate your phrase. milliondollarquartet would be seen as one word, while million-dollar-quartet would be seen as three, which is more readable for both users and search engines; it would also help you get some SEO juice for the phrase, whereas milliondollarquartet would probably never be searched on.
That having been said, I'm still fairly new to SEO, so hopefully you'll get more answers. But that's been my understanding thus far.
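A toy illustration of the one-word-vs-three-words point above (real crawlers are far more sophisticated; this just shows what a naive word-delimiting split sees):

```python
import re

def tokens(path_segment):
    """Split a URL file name on dashes and underscores, the way a
    simple word-delimiting crawler might tokenize it."""
    return [t for t in re.split(r"[-_]", path_segment) if t]

print(tokens("milliondollarquartet"))    # one opaque token
print(tokens("million-dollar-quartet"))  # three recognizable words
```

Without separators the whole segment is a single unrecognizable token; with dashes it breaks cleanly into the phrase you actually want to rank for.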
-
A smart robot can parse that out using algorithms called stemmers... but what about less sophisticated robots? Will Ask.com or lesser-known search engines get your keywords right without the dashes? And what about a combination of words that can be separated in different ways to give different meanings?
And what about human readers? Which is easier for you to read?
Dashes and underscores are the URL equivalent of spaces. They help human readers and robot crawlers parse out what your URL is supposed to be.
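To see why the first answer says "either is better than %20": a literal space in a URL has to be percent-encoded, which is exactly the ugliness a dash or underscore avoids. A quick sketch using Python's standard library:

```python
from urllib.parse import quote

name = "million dollar quartet"

print(quote(name))             # spaces become %20 escapes - ugly in the address bar
print(name.replace(" ", "-"))  # dash form: readable, no shift key to type
print(name.replace(" ", "_"))  # underscore form: double-click selects it as one word
```

The encoded form is what you're stuck with if you put real spaces in file names, which is why either separator beats %20.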