User Created Subdomain Help
-
Have I searched FAQ: Yes
My issue is unique because of the way our website works, and I hope that someone can provide some guidance on this. Our website, http://breezi.com, is a website builder where users can build their own website. When users build their site, it is created on a subdomain, for example: http://mike.breezi.com. Now that I have explained how our site works, here is the problem: Google Webmaster Tools and Bing Webmaster Tools are indexing ALL the user-created websites under our root domain, and so our impression is that any content created on those subdomains could lead the search engines to think that the user-created websites and content are relevant to OUR main site, http://breezi.com. So, what we would like to know is whether there is a way to let search engines know that the user-created sites and content are not related to our main site. Thanks for any help and advice.
-
Subdomains generally don't pass any authority, link juice, etc. to the root domain. Rand did a Whiteboard Friday that briefly covered this a while ago (see http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday).
I am curious: if you didn't want user-created sites to be associated with your root domain, why didn't you set up a different domain for user-created sites?
I personally think it is morally wrong to try to stop Google from indexing them. So, if you don't want these associated with you or your root domain, I would set up a new domain, e.g. yourbreezi.com, 301 redirect any user sites that have already been set up to the new domain, and make sure that any new user sites are created under the new domain.
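For what it's worth, here is a minimal sketch of what that wildcard redirect could look like, assuming an Apache server with mod_rewrite and assuming yourbreezi.com as the (hypothetical) new user-site domain; treat it as an illustration rather than a drop-in config:

```
# Hypothetical .htaccess rules for the server handling *.breezi.com user sites.
# Assumes Apache with mod_rewrite enabled and yourbreezi.com registered as the
# new home for user-created sites.
RewriteEngine On

# Leave breezi.com and www.breezi.com alone...
RewriteCond %{HTTP_HOST} !^(www\.)?breezi\.com$ [NC]
# ...capture the user subdomain, e.g. "mike" from mike.breezi.com...
RewriteCond %{HTTP_HOST} ^([^.]+)\.breezi\.com$ [NC]
# ...and permanently (301) redirect the request to the same path on the new domain.
RewriteRule ^(.*)$ http://%1.yourbreezi.com/$1 [R=301,L]
```

The [R=301] flag is what tells search engines the move is permanent, so any equity the user sites have built up should follow them to the new domain.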
In truth, I'm not sure it is too much to worry about; after all, WordPress.com uses subdomains for most of its hosted blogs and it doesn't seem to have done them too much harm!
Hope that helps
-
Robert,
The suggestion you make is not an option. I don't want to move or remove any of the subdomain URLs, because these are user-generated sites that could earn their own rankings.
-
Navid,
Using robots.txt to block the subdomains might not be the best route.
The only way I can think of to do that is by telling Google Webmaster Tools (GWT) to remove the URL (in this case, your subdomains).
In Webmaster Tools, click on "Site Configuration", then "Crawler access", then "Remove URL". There, click on "New Removal request". You will then see an option to remove the whole site. You can use this option to remove "subdomain.domain.com" from the SERPs.
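As an aside (my own assumption, not something covered in this thread): a removal request generally only sticks if the content is also blocked or gone, so it is usually paired with a robots.txt served at the root of the subdomain itself. A minimal sketch, using mike.breezi.com purely as a hypothetical example:

```
# Hypothetical robots.txt served at http://mike.breezi.com/robots.txt
# Each subdomain is treated as its own site, so each one needs its own file.
User-agent: *
Disallow: /
```

Keep in mind that robots.txt blocks crawling rather than indexing as such, which is part of why it may not be the best route here.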
-
Hmmm... that is a tricky one. One place to look for answers might be to talk to SEO people who have worked on a similar service, such as Ning or WordPress.com.
I'll be curious to hear of your findings.
-
Related Questions
-
User Agent -teracent-feed-processing
Does anyone know anything about the "teracent-feed-processing" user agent? The IPs the user agent requests come from: 74.125.113.145, 74.125.113.148, 74.125.187.84... In our logs, 2 out of 3 requests are made by it, and it is causing our server to crash.
Technical SEO | propertyshark0
-
Does a subdomain benefit from being on a high authority domain?
I think the title sums up the question, but does a new subdomain get any ranking benefit from being on a pre-existing, high-authority domain? Or does the new subdomain have to fend for itself in the SERPs?
Technical SEO | RG_SEO0
-
Subdomain question for law firm in Indiana, Michigan, and New Mexico.
Hi Gang, Our law firm has offices in the states of Indiana, Michigan, and New Mexico. Each state is governed by unique laws, and each state has its own "flavor," etc. We are currently set up with the main site as http://www.2keller.com (Indiana) and subdomains as http://michigan.2keller.com (Michigan) and http://newmexico.2keller.com (New Mexico). My client questions this strategy from time to time, and I want to see if anyone can offer some reassurance of which I haven't thought. Our reason for setting up the sites in this manner is to ensure that each site speaks to state-specific practice areas (for instance, New Mexico does nursing home abuse, whereas the other states don't) and state-specific ethics law (for instance, in some states you can advertise your dollar amount recoveries, and in others you can't). There are so many differences among the states that the content would seem to warrant it. Local citations and listings are another reason these sites are set up in such a fashion. The firm is a member of several local state directories and memberships, and by having those links go directly to the subdomain they reference, I can see this being another advantage. Also, inside each state there are separate pages set up for specific cities. We geo-target major cities in each state, and trying to do all of this under one domain for three different states would seemingly get very confusing, very quickly. I had thought of setting up the various state pages as folders on the main domain, but again, there is too much state-specific info for that to seem like a logical approach. Granted, the linking and content creation would be easier for one site, but I don't think we can accomplish this in a clean way with the offices being in such different locales. I guess I'm wondering if there are some things I'm overlooking here? Thanks guys/gals!
Technical SEO | puck991
-
Help! Same page with multiple URLs. How is this handled?
I'm using DotNetNuke for many of our sites. DotNetNuke's home page can have multiple VALID URLs that go to the same home page. Example: http://aviation-sms.com http://www.aviation-sms.com http://aviation-sms.com/default.aspx http://www.aviation-sms.com/default.aspx and http://aviation-sms.com/aviationSMS.aspx http://www.aviation-sms.com/aviationSMS.aspx All of the above URLs have the same content. In the page header tag, I have a rel=canonical link tag. Should I be doing something else, such as removing the "default.aspx"? I also have a blog with a boatload of pages. I tried this canonical approach, but I'm not sure SEOmoz likes it, and the tool offers me little guidance on this issue.
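For reference, the kind of canonical tag being described generally looks like the snippet below; the choice of http://www.aviation-sms.com/ as the preferred URL is only an assumption for the example, not something stated in the question:

```
<!-- Hypothetical canonical tag placed in the <head> of every variant of the
     home page, assuming http://www.aviation-sms.com/ is the preferred URL. -->
<link rel="canonical" href="http://www.aviation-sms.com/" />
```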
Technical SEO | manintights280
-
Creating a Blog of Rodent Removal Companies?
I am helping a small company. Let's say rodent removal is their service. But local SEO for rodent removal is very, very competitive in my town and across America. Would a website/blog dedicated to highlighting rodent removers across America be good for my company? We have had nice success with wordpress.com blogs. Suppose I gave 6 other rodent removal companies a free guest post (always 300 words or more) to publish on my blog; of course, none of these companies would be in my market. Would that help my local SEO? I am thinking long term here.
Technical SEO | greenhornet770
-
How do ping services help your site?
Hi, I am trying to understand how services such as pingler.com help your site. I think I understand the Google ping service, which tells Google that you have updated a page, but how does Pingler work? Pingler claims that it sends traffic to your site, but I do not understand this. Any help would be great.
Technical SEO | ClaireH-1848861
-
Help with rel=canonical on WordPress?
Crawl Diagnostics is showing a lot of rel=canonical warnings. I've installed the WordPress SEO plugin by Joost de Valk and the Home Canonical URL plugin without success. Any ideas? I'm getting a lot of URLs that I thought I had blocked from being indexed, such as author pages, category pages, etc. I'm also getting stuff like "recessionitis.com/?homeq=recent" and "recessionitis.com/page/2/"; those pages are similar to my homepage. I thought those plugins were supposed to automatically clean things up... has anyone used these plugins and have any helpful hints?
Technical SEO | 10JQKAs0
-
Subdomain Robots.txt
I have a subdomain (a blog) whose tag and category pages are being indexed when they should not be, because they create duplicate content. Can I block them using a robots.txt file? Can I, or do I need to, have a separate robots.txt file for my subdomain? If so, how would I format it? Do I need to specify that it is a subdomain robots.txt file, or will the search engines automatically pick this up? Thanks!
Technical SEO | JohnECF0