Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies.
Using subdomains for related landing pages?
-
Seeking subdomain usage and related SEO advice... I'd like to use multiple subdomains for multiple landing pages, all with content related to the main root domain. Why? Cost: I only have to register one domain. One root domain for better 'branding'. And multiple subdomains that each focus on one specific reason, and the specific set of keywords people would search when looking for a solution to that reason, to hire us (or our competition).
-
Thanks very much Jane! I think subdirectories are how I'll go.
Effective organic SEO is HUGE for my initial online success. We market only with direct mail so far, but mailing lists don't address human situations, i.e. people who've inherited a property AND with it a second mortgage payment AND they're stressed because they can't afford the second payment AND their realtor hasn't sold the inherited property. One last question for all:
With effective landing page SEO and SERP visibility being my primary goal, is the URL structure term "siloing" familiar to anyone, and is it applicable/adaptable to my multiple landing pages? (I found the term and explanation here: http://www.bruceclay.com/seo/silo.htm) Or is some other method more advisable in order to "pool" my subdirectories for better SEO in the SERPs? Peter
-
Hi Peter,
In some ways, subdirectories seem even more sensible when you're dealing with single landing pages, as they'll work together somewhat to look like a fuller site from Google's perspective, rather than just a collection of subdomains that happen to exist on the same domain.
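To make the "siloing" idea concrete, here is a minimal sketch of how motivation-focused landing pages could be grouped into subdirectory silos on one root domain. All domain names, silo names, and page slugs below are hypothetical examples, not recommendations for specific keywords:

```python
# Hypothetical example: organizing motivation-focused landing pages as
# subdirectory "silos" on a single root domain, instead of subdomains.
ROOT = "https://www.example.com"

# Each silo groups one selling motivation with its keyword-focused pages.
SILOS = {
    "inherited-property": ["sell-inherited-house", "second-mortgage-relief"],
    "divorce": ["sell-house-during-divorce"],
    "job-transfer": ["sell-house-fast-relocation"],
}

def silo_urls(root, silos):
    """Return the full URL for every landing page, grouped by silo."""
    return {
        silo: [f"{root}/{silo}/{page}/" for page in pages]
        for silo, pages in silos.items()
    }

for silo, urls in silo_urls(ROOT, SILOS).items():
    print(silo, urls)
```

The point of the structure is that every page in a silo shares a common parent path, so internal links between related pages stay within one themed section of the site.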
-
Hello again; after looking at your feedback, and then taking a fresh look at our marketing needs and budget... After viewing each of our competitors' sites, with their keyword 'semi'-stuffing, empty tags, horrible SEO structure, very light traffic, and way too much info, we're now thinking that we do not need a main site, AND will JUST HAVE multiple landing pages, each very focused on a single financial or situational motivation causing a property owner to want to sell quickly, where we'll explain how we are an alternative to a realtor. Does using subdirectories still seem best when we'll only have single-page landing pages? Does anyone have a few informative links regarding setting up and using subdirectories? Thx, Peter
-
Hi Peter,
I understand that the platform only allows for subdomains. From a purely SEO point of view, subfolders or pages are preferable to subdomains, because authority does not appear to pass between a parent domain and its subdomains in the same way as it does between subfolders and the parent domain. If your landing page sites are only one-pagers, they may be seen as quite thin as well.
However, there is no reason why you can't build quality content like this - it just may take more link building to establish the authority for the subdomains than it would for pages on the same site. You will need to ensure that as much unique content as possible is placed on the landing pages to increase their 'worth' in Google's eyes, given that they are separate from each other on subdomains.
-
Thanks for both responses. Alan - these landing pages would be single-page sites. Thompson Paul - the reason I thought of subdomains IS TO SAVE money with Lander (which charges per domain) and on the cost of registering many domains.
Here are the specifics of my search. The targeted property-owner mailing lists are based on data: mortgage, taxes, and assessors. They give NO CLUES as to the human situations that we look for when our mailers get responses. We have a list of motivations (or reasons for distress to sell a house) that are financial or circumstantial: divorce, inheritance, job loss, job transfer, can't sell the house, bankruptcy, tenant trashed the apartment, etc. These motivations are not apparent, obviously, on a mailing list. We want to learn the best way to specifically find people who own property in CT, who aren't searching to sell, but are looking for a solution to divorce or whatever, NOT realizing a cash buyer (us) is a real and UPRIGHT solution. ** We have a list of motivations that we want to translate into the phrases people ask in Google to find answers, then into the keywords that get found for those queries, and limit it the best we can to CT. ** Thanks, Peter
PS: Just as Squarespace is drag-and-drop creation for websites, plus hosting, ecommerce, and stats, so is www.landerapp.com for landing pages -- they offer customizable templates that can be SEO-optimized, have great stats, and offer drag-and-drop opt-in forms to integrate with my email service. Comments/advice?
-
Fully agree with Alan - subdomains would be a major waste of effort and SEO value.
Are you thinking you want subdomains perhaps so you can track them differently? There are many ways to do the necessary tracking with pages in subdirectories of the main site, so it's not necessary to use subdomains for this reason either.
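One common way to track subdirectory landing pages separately, as mentioned above, is to tag inbound links (for example, the URLs printed on each direct-mail piece) with campaign parameters rather than splitting the pages onto subdomains. A minimal sketch using Python's standard library; the URLs and campaign names are hypothetical:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm(url, source, medium, campaign):
    """Append standard UTM campaign parameters to a landing page URL so
    each subdirectory page can be tracked separately in analytics."""
    parts = urlsplit(url)
    utm = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    # Preserve any query string the URL already carries.
    new_query = f"{parts.query}&{utm}" if parts.query else utm
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       new_query, parts.fragment))

print(add_utm("https://www.example.com/divorce/",
              "directmail", "postcard", "spring-mailer"))
```

Each mailer variant gets its own tagged URL, so responses from different lists show up as distinct campaigns in analytics without needing separate subdomains.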
Unless there's something missing in what you need here, integrating the landing pages into the main site is the vastly superior solution.
Can you give us an idea what it is about subdomains that you feel you need?
Paul
-
Those subdomains for single-page sites may look spammy to Google. You can put those pages on your own site; there is nothing to gain from using subdomains.