The impact of using directories without target keywords on our rankings
-
Hello all,
I have a question regarding a website I am working on. I’ve read a lot of Q&As but couldn’t really find a definitive answer.
For one of our new websites, we are thinking about the structure of the site and the corresponding URL structure. Basically, we have a main product (and a few main keywords) which should drive the most traffic to our website, and for which we want to optimize our homepage.
Besides those main keywords, we have an enormous base of long-tail keywords from which we would like to generate traffic. This means we want to create a lot of specific pages, each optimized for one of those long-tail terms.
My main question is the following:
We are thinking of two options:
- Option 1: www.example.com/example-keyword-one
- Option 2: www.example.com/directory/example-keyword-one
With option 1 we will link directly from our homepage to the most important pages (which represent our most important keywords). All the pages with long-tail content will be linked from another section of our website, one click away from the homepage (specifically a /solutions page linked from the footer). All the long-tail pages will use the structure www.example.com/example-keyword-one, so the URLs will not contain the directory /solutions.
With option 2 we will use more subdirectories in our URLs. Specifically, for all the long tail content we would use URLs like this: www.example.com/solutions/example-keyword-one
The directories we want to use wouldn't really have added value in terms of SEO, since they don’t represent important keywords. So what is the best way to go? Option 1: straightforward, short URLs which don’t really reflect the linking structure of our website, but only contain important keywords. Or option 2: more directories in our URLs, which reflect the linking structure of our website but contain directory names that aren’t important keywords.
- Would the keyword ‘solutions’ in the directory (which doesn’t really relate to the content on the page) have a negative impact on our rankings for that URL?
-
Hi Rob,
Thanks for the helpful answer! I did a lot of research and also concluded that both options can work. I just haven't found any supporting case studies which clearly show which of the two alternatives works best. So if anyone knows a good article on URL structure related to my specific question, that would be very welcome!
Thanks!
Regards,
Jorg
-
It all depends on what you want (or are going to do):
1. Short URLs usually work best with regard to indexing and product correlation (if a URL is too long, characters get truncated when Google indexes it). Keeping URLs short also helps Google index the full URL and get its full value - use your keywords to reinforce the URL's relevance.
-
Also - having these URLs linked to from the main page will help flow 'link juice' through the site, provided you keep the number of links on the homepage to a minimum and mix in other links that are nofollow. Links beyond roughly the first 100 on a page usually won't be crawled by Googlebot.
-
Also - if your URLs contain query strings, make sure 301 redirects are set up for any URL that includes one (?id=123456 or something along those lines), pointing it to a keyword-rich URL such as www.domains.com/keyword-rich-content. This might be a non-issue for the site/domain you are working on, or it might be a step that needs to be included in the site's overhaul project.
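As a rough illustration of that redirect step, here's a minimal mod_rewrite sketch, assuming an Apache server; the parameter name `id`, the `/product` path, and the target slug are hypothetical placeholders, not anything from the site being discussed:

```apache
# Hypothetical sketch: 301-redirect an old query-string URL
# (e.g. /product?id=123456) to a keyword-rich path.
# Assumes Apache with mod_rewrite enabled.

RewriteEngine On

# Match requests whose query string is exactly id=123456
RewriteCond %{QUERY_STRING} ^id=123456$
# Permanently redirect to the keyword-rich URL; the trailing "?"
# drops the old query string from the destination
RewriteRule ^product$ /keyword-rich-content? [R=301,L]
```

For thousands of products, a RewriteMap (or redirects handled inside the CMS itself) would be the usual approach rather than one rule per ID.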
2. Longer URLs (like adding directories or sub-folders) can be good too, depending on the product breakdown in your site architecture. It might not be needed, though. If you have hundreds of thousands of products, directories will most likely be needed to sort the data and organize the database working alongside the CMS. In that case you would want to go this route rather than having an unorganized root directory with thousands of pages in it (even if dynamically generated).
Each option works in its own way, and each has supporting documentation and methods behind it. Just something to consider as you navigate the SEO seas.
Cheers!
-