Flat vs. subdomain web structure
-
I am building a site that sells a product in all 50 states, and in each state we will have independent partners. From an SEO perspective, what are the tradeoffs of using a single domain vs. giving each state its own subdomain? Each state also has varying regulatory issues that are specific to that state.
-
I agree that with 50 subdomains I can't see you having enough content; I was speaking in general.
I was referring to that link: Rand said it is his personal belief that, most of the time, it is better to keep to one subdomain.
-
I agree. When I use subdomains, I start thinking about FTP. I also think about giving the user the best experience. If he wants to make one site that markets to 50 states, then using a CMS would be the answer, and creating 50 subdomains would be repetitive. In his case I would use folders, and if an independent partner needs access to the site, add them as a user with limited site access.
http://www.seomoz.org/blog/understanding-root-domains-subdomains-vs-subfolders-microsites
-
This is an old argument: subdomains vs. subfolders.
Matt Cutts said there is no difference (see the comments): http://www.mattcutts.com/blog/subdomains-and-subdirectories/
The Google Webmaster Central blog also said there is no difference: http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
Rand recommends subfolders, but said it was his personal choice.
I have seen SERPs with sitelinks, and among those sitelinks are links to subdomains, so I would say Google sees them as the same site.
If you register a root domain in WMT, links from subdomains are reported as internal links. If someone verifies the subdomain under another account, then you will no longer see stats for the subdomain.
I have never seen any evidence that they are treated differently.
-
Use craigslist.org as an example: every city has its own subdomain, not a subfolder where link juice is passed. Using a subdomain is almost like having a different domain.
Your choices will be state.example.com or example.com/state. I personally would use subfolders instead of subdomains to keep link juice. Now, if I were going to geotarget each state and I did NOT want to rank in other states, then I would use subdomains the way Craigslist is set up.
A better question is this: you state that you want to sell "a" product in 50 states. The way I read that, you are going to have 50 pages of duplicate content (whether it's one product or 1,000 products). What do you mean by independent partners? You have to explain that a little further. Do you mean affiliates? Do you mean independent contractors, like MLMs (network marketing)? Your website should be structured around your business objectives. What if you have two partners within one state?
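To make the two structures under discussion concrete, here is a minimal sketch that generates both URL schemes for a few states. This is purely illustrative: example.com and the state slugs are placeholders, not the asker's actual site.

```python
# Hypothetical illustration of the two URL schemes discussed above:
# subfolders (example.com/state) vs. subdomains (state.example.com).
states = ["texas", "ohio", "nevada"]
domain = "example.com"

# Subfolder scheme: all states live on one root domain,
# so inbound links to any state page strengthen the whole site.
subfolder_urls = [f"https://{domain}/{s}/" for s in states]

# Subdomain scheme: each state is closer to a standalone site,
# which suits strict per-state geotargeting (the Craigslist model).
subdomain_urls = [f"https://{s}.{domain}/" for s in states]

print(subfolder_urls[0])  # https://example.com/texas/
print(subdomain_urls[0])  # https://texas.example.com/
```

Either scheme can be geotargeted, but only the subdomain scheme lets each state be verified and managed as a separate property.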