Best Way To Clean Up Unruly SubDomain?
-
Hi,
I have several subdomains that present no real SEO value, but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up?
I was thinking the following:
1. Verify them all in Webmaster Tools.
2. Remove all URLs from the index via the Removal Tool in WMT
3. Add a site-wide noindex, follow directive.
Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt.
If I'd like to keep Google crawling the subdomains while still removing their URLs from the index, is there a way to do so?
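For context, a noindex, follow directive can be applied either per page in the HTML head or site-wide via an HTTP header; a minimal sketch, assuming an Apache server with mod_headers for the header variant:

    <!-- In the <head> of every page on the subdomain -->
    <meta name="robots" content="noindex, follow">

    # Or site-wide as an HTTP header (Apache, e.g. in the subdomain's .htaccess)
    Header set X-Robots-Tag "noindex, follow"

Note that for either directive to be seen, the pages must remain crawlable, i.e. not disallowed in robots.txt, and the WMT removal tool generally only hides URLs temporarily rather than deindexing them permanently.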
-
PS
Do NOT do "2. Remove all URLs from the index via the Removal Tool in WMT."
This is my opinion, and I believe it is shared by many other people in the search engine optimization community: by using the Google disavow link tool or revoke links tool you are essentially consenting to doing something wrong. Do not do it.
Also, do not unblock 100% of your robots.txt; only allow the URLs that you wish to have seen by Google. Think of a subdomain much like a parameter on the end of a URL: it will be indexed and picked up in the sitemap accordingly.
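As an illustration of selectively allowing crawling in robots.txt (a hypothetical sketch; the hostname and paths are placeholders, and each subdomain serves its own file):

    # http://sub.example.com/robots.txt
    User-agent: *
    Allow: /pages-you-want-crawled/
    Disallow: /

Googlebot generally resolves conflicts like this in favour of the more specific (longer) matching rule, so the allowed path stays crawlable while everything else is blocked.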
Sincerely,
Thomas
-
You can continue to use subdomains, and they can be ranked just as high as non-subdomain pages.
However, you have to do exactly what you stated but in the opposite direction: remove the robots.txt rules that are blocking the subdomain from being crawled and indexed by search engines.
Once there is a sitemap and the robots.txt is fixed to allow search engines to crawl the subdomain, it will be ranked appropriately and you should not have to take any further action.
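For example, a subdomain that is meant to be crawled might serve a permissive robots.txt that also points to its sitemap (hostname and path here are placeholders, not the poster's actual setup):

    # http://sub.example.com/robots.txt
    User-agent: *
    Disallow:

    Sitemap: http://sub.example.com/sitemap.xml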
I hope I've been of help.
Sincerely,
Thomas
Related Questions
-
Best Practice for www and non www
What is the best way to handle all the different variations of a website in terms of www | non www | http | https? In Google Search Console, I have all 4 versions and I have selected a preference. In Open Site Explorer I can see that the www and non www versions are treated differently, with one group of links pointing to each version of the same page. This gives a different PA score, e.g. http://mydomain.com DA 25 PA 35 and http://www.mydomain.com DA 19 PA 21, each version of the home page having its own set of links and scores. Should I try and "consolidate" all the scores into one page? Should I set up redirects to my preferred version of the website? Thanks in advance
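One common way to consolidate is a site-wide 301 to a single preferred version; a rough sketch, assuming Apache with mod_rewrite and that https + www is the preferred variant (swap in whichever version you have chosen):

    # .htaccess: force one canonical version of every URL
    RewriteEngine On
    RewriteCond %{HTTPS} off [OR]
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^(.*)$ https://www.mydomain.com/$1 [R=301,L]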
Technical SEO | I.AM.Strategist
-
Which URL structure holds the best SEO value?
Hello Community! We are rewriting URLs to better rank and provide better visual usability to our visitors. Would it be better to serve a URL that looks like this: www.domain.com/category-subcategory or www.domain.com/category/subcategory Please note the slight difference--the 2nd URL calls out a category that has a subcategory under it. Would it give us more value? Does it make a difference? Thanks in advance!
Technical SEO | JCorp
-
Best Way To Handle Expired Content
Hi, I have a client's site that posts job openings. There is a main list of available jobs and each job has an individual page linked to from that main list. However, at some point the job is no longer available. Currently, the job page goes away and returns a status 404 after the job is no longer available. The good thing is that the job pages get links coming into the site. The bad thing is that as soon as the job is no longer available, those links point to a 404 page. Ouch. Currently Google Webmaster Tools shows 100+ 404 job URLs that have links (maybe 1-3 external links per). The question is what to do with the job page instead of returning a 404. For business purposes, the client cannot display the content after the job is no longer available. To avoid duplicate content issues, the old job page should have some kind of unique content saying the job is longer available. Any thoughts on what to do with those old job pages? Or would you argue that it is appropriate to return 404 header plus error page since this job is truly no longer a valid page on the site? Thanks for any insights you can offer.
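If the pages really must go away, two options often discussed for this situation are returning 410 Gone (a stronger "permanently removed" signal than 404) or 301-redirecting each expired job to the main jobs list so its inbound links still point somewhere useful; a hypothetical Apache sketch with placeholder paths:

    # Option A: mark an expired job URL as permanently gone
    Redirect gone /jobs/expired-job-slug

    # Option B: send the expired job URL (and its links) to the live listings page
    Redirect 301 /jobs/expired-job-slug /jobs/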
Technical SEO | Matthew_Edgar
-
Best way to redirect 3 sites to 1 new one.
Hi All. We currently have 3 old sites that have tons of content. Due to brand/business consolidation we have merged all 3 to produce 1 website. The new site contains all the old content from the old 3. So, I know I need to 301 redirect all the old content from the previous sites to the equivalent content on the new site, but am confused how you do this with 3 domains. One of the domains is being replaced with the new site. So I have:
www.domain1.co.uk
www.domain2.co.uk
www.domain3.co.uk
All the content for all the sites has been imported into the new site and any duplicate content issues have been resolved. Can anyone point me in the right direction? Thanks
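The usual pattern is a separate set of 301 rules answering for each old hostname, mapping old URLs to their new equivalents; a rough sketch assuming Apache, assuming the new site lives on www.domain1.co.uk, and using placeholder page paths:

    # Rules answering for www.domain2.co.uk (repeat the block for each old domain)
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.co\.uk$ [NC]
    # Map pages whose URL structure changed, one rule per page
    RewriteRule ^old-section/old-page\.html$ https://www.domain1.co.uk/new-section/new-page/ [R=301,L]
    # Fallback: send anything unmapped to the same path on the new domain
    RewriteCond %{HTTP_HOST} ^(www\.)?domain2\.co\.uk$ [NC]
    RewriteRule ^(.*)$ https://www.domain1.co.uk/$1 [R=301,L]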
Technical SEO | EclipseLegal
-
Bandcamp subdomain
I have a website - www.weddingmusicproject.com that is doing quite well. However, we have all of the actual music listed on weddingmusicproject.bandcamp.com and have a number of very powerful pages there (bandcamp is a great product by the way). Is there a strong benefit to our domain's authority if we move our "bandcamp" site onto our own subdomain using their custom domain option - http://bandcamp.com/faq_custom_domains - Something like music.weddingmusicproject.com. I seem to think that it would increase our overall domain authority, but it wouldn't increase the number of inbound links or anything. If anything it would decrease the number of linking domains and bandcamp is quite a powerful site so those links would just turn into internal links. Thoughts? I know this is probably a basic concept, but I've thought it over a number of times and can't come to a conclusion.
Technical SEO | deyobr
-
What is the best way to optimize a page for a magazine
Hi, I have a serious problem with a website that I am building, http://www.cheapflightsgatwick.com/, with reference to letting the search engines know what the magazine is about. I am building a holiday magazine which will focus on holiday news, cheap deals and holiday reviews. I want the home page to rank for the following keywords: holiday news, holiday magazine, holiday ideas, best holiday deals. The problem I have is that I have tried putting an introduction on the home page, but it looks out of place. So what is the best way for me to let Google know what the site is about and to get it ranking well in the search engines? Any help and advice would be great.
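One illustrative option (purely a sketch; the wording is invented and would need to reflect the actual page copy) is to state the topic in the home page's head, even if the visible introduction stays short:

    <title>Holiday Magazine | Holiday News, Holiday Ideas and the Best Holiday Deals</title>
    <meta name="description" content="A holiday magazine covering holiday news, holiday ideas, reviews and the best holiday deals.">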
Technical SEO | ClaireH-184886
-
Subdomain and Domain Rankings
I have read here that domain names with keywords might add a boost to your search rank. For instance, using a completely inane example, monkey-fights.com might get a boost compared to mfl.com (monkey fighting league) when searching for "monkey fights". There seems to be a hot debate as to how much bonus the first domain might get over the second, but leaving that aside for the moment:
Question 1. Would monkey-fights.mfl.com get the same kind of bonus as the root domain bonus?
Question 2. If the answer to 1 above was yes, would a 301 redirect from the subdomain URL to the root domain URL retain that bonus?
I was just thinking about how hard it is to get root domains these days that are not being squatted on, etc., and whether this might be a way to get the same bonus, or maybe subdomains are less bonus prone and so it would be a waste of time. Thanks
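For question 2, the mechanical part is straightforward; a sketch assuming Apache and the hypothetical domains from the question:

    # Served for monkey-fights.mfl.com
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^monkey-fights\.mfl\.com$ [NC]
    RewriteRule ^(.*)$ http://mfl.com/$1 [R=301,L]

Whether such a redirect would preserve any exact-match bonus is exactly the open question above, so this only covers the how, not the whether.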
Technical SEO | bThere
-
What is the most effective way of indexing a localised website?
Hi all, I have a website, www.acrylicimage.com, which provides products in three different currencies: $, £ and Euro. Currently a user can click on a flag to indicate which region they are in, or, if the user has not manually selected one, the website looks at the user's locale setting and sets the region for them. The website also has a very simple content management system which provides ever so slightly different content depending on which region the user is in. The difference in content might literally be a few words per page, like contact details, or measurements, i.e. imperial to metric. I don't believe that GoogleBot, or any other bot for that matter, sets a locale, and therefore it will only ever be indexing the content on our default region, the UK. So, my question really is: if I need to be able to index different versions of content on the same page, is the best route to provide alternate URLs, i.e.:
/en/about-us
/us/about-us
/eu/about-us
The only potential downside I see to this is there are currently a couple of pages that do have exactly the same content regardless of whether you have selected the UK or USA regions - could this be considered content duplication? Thanks for your help.
Al
Technical SEO | dotcentric
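On the duplicate-content worry, one common pattern (a sketch only; the region-to-language mapping here is an assumption and the URLs simply reuse the structure proposed above) is to cross-reference the regional versions with hreflang annotations in each page's head:

    <link rel="alternate" hreflang="en-GB" href="http://www.acrylicimage.com/en/about-us" />
    <link rel="alternate" hreflang="en-US" href="http://www.acrylicimage.com/us/about-us" />
    <link rel="alternate" hreflang="en" href="http://www.acrylicimage.com/eu/about-us" />
    <link rel="alternate" hreflang="x-default" href="http://www.acrylicimage.com/en/about-us" />

This tells search engines the pages are intended regional alternates of each other rather than straight duplicates.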