Is it worth switching from underscores to hyphens in the URL?
-
I work for a website that recently did a redesign, and switched from hyphens to underscores. We have seen some drop in traffic, although that may be attributed to the migration.
I have read that while Google prefers hyphens, the underscore problem is not as much of an issue as it used to be.
Is it worth 301'ing the page to a version of itself with hyphens instead of underscores in the URL?
-
Darren is absolutely right. However, it's unclear whether you set up 301 redirects from your old URLs to your new ones. If you did, the drop should only be temporary while Google figures out where the new URLs are and passes the old authority (or at least most of it) to the new pages. If you didn't set up any 301 redirects, and you still have access to a spreadsheet or sitemap of your old URLs, I'd encourage you to set up 301 redirects as soon as possible. Otherwise, it will indeed be like starting completely over with a new site, regardless of hyphens or underscores.
-
The drop in traffic isn't because Google prefers hyphens; it's because all your page URLs changed. You had a set of web pages that Google knew about and had in its index, and now they're gone. You now have a bunch of new pages that have to start building trust and authority from scratch.
I'd change them back to dashes, then set up 301 redirects for all the underscore versions.
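The mapping from each underscore URL to its hyphenated twin is mechanical, so the redirect list can be generated rather than written by hand. A minimal sketch in Python (illustrative only — on a real server this would end up as a 301 rewrite rule or a redirect map, and the example URL is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def hyphenate_path(url: str) -> str:
    """Return the same URL with underscores in the path replaced by hyphens.

    Only the path is touched; the domain and any query string are left
    alone, since only the slug separators changed in the migration.
    """
    parts = urlsplit(url)
    return urlunsplit(parts._replace(path=parts.path.replace("_", "-")))

# Each old underscore URL should 301 to its hyphenated equivalent:
old = "https://example.com/blog/plastic_cups_guide"
print(hyphenate_path(old))  # https://example.com/blog/plastic-cups-guide
```

Run that over the old sitemap and you have the complete old-to-new pairs for the redirect rules.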
-
Sorry, let me clarify. By "used to" I mean before the domain was migrated over. So it was hyphens on the old site, but that was changed to underscores on the new site.
-
Hyphens vs underscores doesn't make a difference any more. If you're purely thinking of making the change for SEO purposes, I wouldn't do it. I agree that the drop in traffic you are seeing has more to do with replatforming and rewriting all of your URLs. This is pretty normal. Hope that helps!
-
You said it used to be hyphens and then was changed to underscores. If this was done recently, then change it back and don't bother 301'ing unless you got a decent amount of links in that time period. I am assuming you 301'd the hyphens to the underscores; if so, you need to undo that as well.
Related Questions
-
Competing URLs
Hi, we have a number of blogs that compete with our homepage for some keywords/phrases. The URLs of the blogs contain the keywords/phrases. I would like to rework the blogs so that they target different keywords that don't compete and are more relevant. Should I change the URLs, as I think this is what is mainly causing the issue? If so, should I 301 the old URLs to the homepage? For example, say we were a site that specialised in selling plastic cups. Currently there is a blog at www.mysite.com/plastic-cups that outranks the homepage for "plastic cups". The blog isn't particularly relevant to plastic cups, and the homepage should rank for this term. How should I let Google know that the homepage is most relevant for this term? Thanks
Intermediate & Advanced SEO | Buffalo_71
Internal Links - Different URLs
Hey, so on my product page I have recommended products at the bottom. The issue is that those recommended product links carry long parameters, such as sitename.com/product-xy-z/https%3A%2F%2Fwww.google.co&srcType=dp_recs. The long parameter is there for tracking purposes (internally with the dev and UX team). My question is: should I replace it with the clean URL, or is it okay to keep the long parameter as long as the page has a canonical tag? I would think the clean URL would help with internal links, but does the canonical tag make that moot? Another issue is that the URL differs beyond just the parameter. The canonical URL is sitename.com/productname-xyz/, but the internal link used on the product page (same exact page, just a different URL with a parameter) is sitename.com/xyz/https%3A%2F%2Fwww.google.co&srcType=dp_recs (missing the product name), BUT it still has the canonical tag!
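For the conventional case where tracking lives in a query string, "use the clean URL for internal links" just means stripping that query string before it goes into the href. A sketch in Python (the URL below is hypothetical; note the question's links actually embed the payload in the path itself, which this simple approach would not fix):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_tracking(url: str) -> str:
    """Drop the query string and fragment, keeping scheme, host, and path,
    so the internal link's href matches the canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

tracked = "https://sitename.com/productname-xyz/?srcType=dp_recs"
print(strip_tracking(tracked))  # https://sitename.com/productname-xyz/
```

The canonical tag consolidates indexing signals, but the cleaner practice is still for the link itself to point at the canonical path.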
Intermediate & Advanced SEO | ggpaul5620
URL Injection Hack - What to do with spammy URLs that keep appearing in Google's index?
A website was hacked (URL injection), but the malicious code has been cleaned up and removed from all pages. However, whenever we run a site:domain.com search in Google, we keep finding more spammy URLs from the hack. They all lead to a 404 error page, since the hack was cleaned up in the code. We have been using the Google WMT Remove URLs tool to have these spammy URLs removed from Google's index, but new URLs keep appearing every day. We looked at the cache dates on these URLs; they vary, but none are recent and most are from a month ago when the initial hack occurred. My question is: should we continue to check the index every day and keep submitting these URLs to be removed manually? Or, since they all lead to a 404 page, will Google eventually remove these spammy URLs from the index automatically? Thanks in advance, Moz community, for your feedback.
Intermediate & Advanced SEO | peteboyd0
Changing the spellings of titles and URl changes
Hi, we identified 500+ titles with issues such as misspellings, bad punctuation, and being too short or too long. We want to change them, but the titles are tied to the URLs, so when we change a title its URL changes as well. My questions are: 1. Is it better to change them all in one shot, or to do a few daily? 2. As the URLs change, will Google drop the old pages from its index (since they will 404) and index the new ones? 3. Is there a chance of a drop in traffic because of this? 4. Is there any way to redirect, given that we have a Drupal website? Thanks
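On question 4: Drupal's Redirect module can register old-path to new-path pairs as 301s, and those pairs can be generated from the corrected titles in bulk. A sketch in Python of the idea (the title, paths, and misspelling are hypothetical examples, not the asker's data):

```python
import re

def slugify(title: str) -> str:
    """Turn a corrected title into a URL slug: lowercase,
    non-alphanumeric runs collapsed to single hyphens."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# Hypothetical map of old (misspelled) paths to their corrected titles:
fixes = {"/blog/plasic_cups": "Plastic Cups"}

# Each pair below would become one 301 redirect entry:
redirects = {old: "/blog/" + slugify(title) for old, title in fixes.items()}
print(redirects)  # {'/blog/plasic_cups': '/blog/plastic-cups'}
```

Generating the full map first also lets you answer question 1 deliberately: the redirects go live together with the renames, whether you roll them out in one shot or in batches.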
Intermediate & Advanced SEO | mtthompsons0
Blocking out specific URLs with robots.txt
I've been trying to block out a few URLs using robots.txt, but I can't seem to block only the specific one I'm after. Here is an example: I'm trying to block something.com/cats but not block something.com/cats-and-dogs. It seems if I set up my robots.txt like so... Disallow: /cats ...it blocks both URLs. When I crawl the site with Screaming Frog, that Disallow causes both URLs to be blocked. How can I set up my robots.txt to specifically block /cats? I thought my setup would do it, but it doesn't seem to. Any help is much appreciated; thanks in advance.
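Plain robots.txt rules are prefix matches, which is why `Disallow: /cats` catches both URLs. Googlebot (and most major crawlers) support a `$` end-of-URL anchor, so `Disallow: /cats$` blocks exactly /cats while leaving /cats-and-dogs crawlable. A sketch of that matching logic in Python, to show why the two rules behave differently:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Match a Disallow rule roughly the way Googlebot does: plain rules
    are prefix matches, '*' matches any run of characters, and a
    trailing '$' anchors the rule to the end of the URL path."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"  # turn the literal '$' back into an anchor
    return re.match(pattern, path) is not None

# 'Disallow: /cats' is a prefix match, so it blocks both URLs:
print(rule_matches("/cats", "/cats"))            # True
print(rule_matches("/cats", "/cats-and-dogs"))   # True
# 'Disallow: /cats$' blocks only the exact path:
print(rule_matches("/cats$", "/cats"))           # True
print(rule_matches("/cats$", "/cats-and-dogs"))  # False
```

One caveat: `$` is an extension to the original robots.txt spec, so not every crawler or testing tool honors it; it's worth verifying the rule in Google's robots.txt tester.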
Intermediate & Advanced SEO | Whebb0
Robots.txt: Syntax URL to disallow
Has anyone ever experienced "collateral damage" when disallowing URLs? Some old URLs are still present on our website, and while we are cleaning them off the site (which takes time), I would like to prevent their indexation through the robots.txt file. The old URL syntax is "/brand//13" while the new one is "/brand/samsung/13" (note the double slash after the word "brand" in the old URLs). Do I risk erasing the good new URLs from the SERPs if I add the line "Disallow: /brand//" to the robots.txt file? I don't think so, but thank you to everyone who can help me clear this up 🙂
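Your instinct is right, and prefix matching is the reason: a plain Disallow rule only blocks paths that literally start with it, so "/brand//" can never match "/brand/samsung/13", which has a segment between the slashes. A tiny Python sketch of that prefix semantics (a simplification of full robots.txt handling, but it's the part that matters here):

```python
def disallowed(rule: str, path: str) -> bool:
    """Plain robots.txt Disallow rules are simple prefix matches."""
    return path.startswith(rule)

print(disallowed("/brand//", "/brand//13"))         # True  (old URL blocked)
print(disallowed("/brand//", "/brand/samsung/13"))  # False (new URL still crawlable)
```

So the new URLs are safe; the only thing the rule catches is the empty-segment pattern you want gone.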
Intermediate & Advanced SEO | Kuantokusta0
Rewriting URL
I'm doing a major URL rewrite on our site to make the URLs more SEO-friendly, as well as more comfortable and intuitive for our users. Our site has a lot of indexed pages, over 250k, so it will take Google a while to reindex everything. I was thinking that when Googlebot encounters the new URLs, it will probably treat them as duplicate content of the old URLs, at least until it recrawls the old URLs and gets the 301s directing it to the new ones. This would probably lower the ranking of every page being crawled. Am I right to assume this is what will happen? Or is it fine as long as the old URLs get 301 redirects? If it is indeed a problem, what's the best solution? rel="canonical" on every single page, maybe? Another approach? Thank you.
Intermediate & Advanced SEO | corwin0
Googlebot crawling partial URLs
Hi guys, I checked my email this morning and I've got a number of 404 errors from over the weekend where Google has tried to crawl some of my existing pages but not found the full URL. Instead of hitting 'domain.com/folder/complete-pagename.php', it's hit 'domain.com/folder/comp'. This is definitely Googlebot/2.1; http://www.google.com/bot.html (66.249.72.53), but I can't find where it would have picked up only the partial URL. It certainly wasn't on the domain it's crawling, and I can't find any links from external sites pointing to us with the incorrect URL. Googlebot is doing the same thing across a single domain but in different sub-folders. Having checked Webmaster Tools, there aren't any hard 404s, and the soft ones aren't related and haven't occurred since August. I'm really confused as to how this is happening. Thanks!
Intermediate & Advanced SEO | panini0