How to Submit an XML Sitemap with more than 300 Subdomains?
-
Hi,
I am creating sitemaps for a site which has more than 500 subdomains. Page counts vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, which they reference in a separate robots.txt file: http://windows7.iyogi.com/robots.txt
Example XML sitemap for a subdomain: http://windows7.iyogi.com/sitemap.xml.gz
Currently my website has only one robots.txt file for the main domain and all subdomains.
Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or one file? Creating a separate XML sitemap for each subdomain doesn't seem feasible, as we would have to verify each one in GWT separately.
Is there an automatic way to do this, and do I have to ping the search engines separately when I add new pages to a subdomain?
Please advise me.
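One detail worth noting while you decide: crawlers fetch robots.txt per host, so http://gmat.abc.com/robots.txt is a different file from the main domain's, and the main domain's file does not cover the subdomains. Each subdomain's robots.txt can also declare that subdomain's sitemap. A minimal sketch (the hostname is a placeholder):

```text
# Served at http://gmat.example.com/robots.txt
User-agent: *
Disallow:

Sitemap: http://gmat.example.com/sitemap.xml.gz
```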
-
Let me know how it goes. I'm sure it can be done. It just needs the right team.
-
Yes, in WordPress that option is available, but we are using the Ruby on Rails platform, so I am not sure whether we can do it or not.
For example, http://windows7.iyogi.com/sitemap.xml.gz uses the WordPress CMS, and it's mentioned on the page that
"It was generated using the Blogging-Software WordPress and the Google Sitemap Generator Plugin by Arne Brachhold."
Anyway, thanks for your help. I will speak to my smart developers; let's see what they can do.
-
Okay, with this little bit of information it does sound like it might in fact be legitimate.
If it is, then the best solution is to work with the development team to automate the creation of each sitemap.xml file and have them submitted to Google automatically. I know this is possible because I use the Google Sitemaps plug-in for WordPress, and it automatically submits to Google and Bing.
How it does that I do not know. That's up to smart web developers to figure out and replicate.
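As a rough illustration of that automation (this is my own sketch, not the WordPress plugin's implementation; the hostnames, paths, and the write/ping steps are placeholders, and a Rails app would do the equivalent in Ruby against its own models):

```python
import xml.etree.ElementTree as ET

def build_sitemap(subdomain, paths):
    """Build a sitemap.xml string for one subdomain.

    `subdomain` and `paths` are illustrative; a real implementation
    would pull them from the application's database or routing table.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for path in paths:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = f"http://{subdomain}/{path.lstrip('/')}"
    return ET.tostring(urlset, encoding="unicode")

# One sitemap per subdomain; this dict is made up for the example.
subdomains = {
    "gmat.example.com": ["", "practice-tests", "scores"],
    "gre.example.com": ["", "study-guide"],
}
for host, paths in subdomains.items():
    xml = build_sitemap(host, paths)
    # A scheduled job would then write this to that subdomain's document
    # root (e.g. /var/www/{host}/sitemap.xml) and notify the search
    # engines, e.g. by requesting Google's sitemap ping URL for that file.
```

Run on a schedule (cron, or a Rails background job), this removes the need to regenerate and resubmit by hand whenever pages are added.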
-
Hi Alan, I recently joined this company and I can't change the whole structure.
I believe they have created virtual subdomains. Moreover, site traffic is growing at a great rate, so they can't consider changing the structure.
Last month it was ranked as the 20th most visited website in India, so things are pretty fine. Moreover, it's an education website, and students can easily remember a subdomain URL, e.g. http://gmat.abc.com. Direct traffic to these subdomains is also very high. So now, how should I solve the XML sitemap problem?
-
The more important, and URGENT, issue is why there are so many subdomains, and why there are going to be more. That's got to be one of the most serious and potentially harmful things you could do to your SEO efforts, unless it's an extremely rare situation that justifies the tactic.
Related Questions
-
Migrating to new subdomain with new site and new content.
Our marketing department has decided that a new site with new content is needed to launch new products and support our existing ones. We cannot use the same subdomain (www = old subdomain and ww1 = new subdomain), as there is a technical clash between the Windows server currently used and the LAMP stack required to run the new WordPress-based CMS and site. We also have an aging piece of SaaS software on the www subdomain which makes moving it to its own subdomain far too risky. 301s have been floated as a way of managing the transition. I'm not too keen on that idea due to the double effect of a new subdomain and new content, and the SEO impact it might have. I've suggested uploading the new site to the new subdomain while leaving the old site in place, then gradually migrating sections over before turning parts of the old site off and using a 301 at that point to finalise the move. The old site would inform users there is a new version and would then convert them to the new site (along with a cookie to auto-redirect them in future), while still leaving the old content in place for existing search traffic, bookmarks and visitors via static URLs. Before turning off sections on the old site we would create rel canonicals pointing to the new pages based on a mapped set of URLs (this in itself concerns me, as the rel canonical is essentially linking to different content). I would be grateful for any advice on whether this strategy is flawed or whether another strategy might be more suitable.
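One note on the rel canonical worry at the end: rel=canonical is intended to point at a page with substantially the same content, and Google may ignore it otherwise, so a mapped 301 is usually the safer tool once a section is turned off. For reference, a cross-subdomain canonical (the URLs here are hypothetical) looks like:

```html
<!-- in the <head> of the old page on the www subdomain -->
<link rel="canonical" href="http://ww1.example.com/new-page" />
```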
Technical SEO | Rezza0
-
When will all of Google Maps be the same again?
As many of you are aware, the Pigeon update was only applied to the new Google Maps, resulting in very different search results for Google local business listings. When you search for a business on the old Google Maps you get totally different results vs. the new Google Maps. Some businesses have disappeared from the search results completely. I have done my research and found out that it's because the new algo was only applied to the new Maps. The new algo also does not apply to other countries. The reason I posted this topic is that I have noticed that all the new Google Business listings I am verifying for my clients are being put under the old Google Maps and not the new one. They come up fine when searching from the old Maps but not the new one. I understand Google has not rolled out Pigeon on all data centers, but why? Will Google eventually roll out the update to the old Maps? Since Google is adding businesses to the old Google Maps, what's the point of even adding new listings?
Technical SEO | bajaseo0
-
Will Links to one Sub-Domain on a Site hurt a different Sub-Domain on the same site by affecting the Quality of the Root Domain?
Hi, I work for a SaaS company which uses two different subdomains on our site: a public one for our main site (which we want to rank in SERPs), and a secure subdomain which is the portal for our customers to access our services (which we don't want to rank). Recently I realized that, by using our product, our customers are creating large amounts of low-quality links to our secure subdomain, and I'm concerned that this might affect our public subdomain by bringing down the overall authority of our root domain. Is this a legitimate concern? Has anyone ever worked through a similar situation? Any help is appreciated!
Technical SEO | ifbyphone0
-
301 redirecting old content from one site to updated content on a different site
I have a client with two websites. Here are some details; sorry I can't be more specific! Their older site, specific to one product, has a very high DA and about 75K visits per month, 80% of which come from search engines. Their newer site, focused generally on the brand, is their top priority. The content here is much better. The vast majority of visits are from referrals (mainly social channels and an email newsletter) and direct traffic; search traffic is relatively low, though. I really want to boost search traffic to site #2, and I'd like to piggyback off some of the search traffic from site #1. Here's my question: if a particular article on site #1 (that ranks very well) needs to be updated, what's the risk/reward of updating the content on site #2 instead and 301 redirecting the original post to the newer post on site #2? Part 2: there are dozens of posts on site #1 that can be improved and updated. Is there an extra risk (or diminishing returns) associated with doing this across many posts? Hope this makes sense. Thanks for your help!
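If the 301 option is chosen, the per-article redirect is a one-liner on site #1. Assuming Apache (the server type and URLs here are placeholders, not the client's actual setup):

```apache
# .htaccess on site #1: permanently redirect one article
# to its updated version on site #2
Redirect 301 /old-article http://www.site2-example.com/updated-article
```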
Technical SEO | djreich0
-
Cross links between sites
Hi, we have several ecommerce sites, and we cross-linked three of them by mistake. We realized the sites were linked through WMT. We shut down two of the sites about two months ago, but WMT still shows the links coming from those two sites. How do we make sure that Google sees the sites are shut down? Is there a better way of resolving this issue? We are no longer using those sites, so we do not need them to be active. What's the best solution to show Google that the links are no longer there? The crawler shows that it was able to crawl a site 45 days after it was shut down. Thanks, Nick
Technical SEO | orion680
-
Removing links from another site
Hello, a site that I have never been able to access, as it is always down, has over 3,000 links to my website. They disappeared the other week and our search queries dramatically improved, but now they are back again in Google Webmaster and we have dropped again. I have contacted the site owner and got no response, and I have also put in a removal form (though I am not sure this fits for that) and asked Google to remove the links, as they have been duplicating our content as well. It was in my pending section but has now disappeared. These links are really damaging our search, and the site isn't even there. Do I have to list all 3,000 links in the link removal request to Google, or is there another way I can go about telling them about the issue? I'd appreciate any help on this.
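One option worth knowing about: rather than listing all 3,000 URLs individually, Google's disavow file format accepts a single domain: line that covers every link from that host (the domain below is a placeholder):

```text
# disavow.txt, uploaded through Google's disavow links tool
domain:always-down-example.com
```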
Technical SEO | luwhosjack0
-
What to do if my site was De-indexed?
Hello fellow SEOs, I have been doing SEO for about a year now. I'm no expert, but I know enough to get the job done, and I'm learning every day about better techniques. So, enough about that... Tonight I noticed that my site has, I believe, been de-indexed. It's a fairly new site, as we just launched it a few days ago, and I went in and did all the title tags and meta. I still have to go in to do the h1 and h2 tags, plus add some alt tags and anchor text. Anyway, a couple of days after the title tags were implemented, I was propagating all over the place. Using my keyword tool here, I was number one on the first page in Google for 71 of the 88 keywords. My new site was just indexed yesterday, and that's when I noticed all my keywords. Well, today I noticed that I am nowhere to be found, even if I type in my company's name. PLEASE help me out; any advice would be appreciated. Thank you. P.S. Could my competitors have done something to my site? Just wondering... The website is www.eggheadconsultants.com
Technical SEO | Jegghead1
-
On-Site Sitemaps - Guidance Required
Hi, I am looking for good examples of on-site sitemaps. We already submit our XML sitemap regularly through GWT, but I now wonder whether we still need an on-site sitemap, as we have about 30 static pages and 300+ WordPress blog posts, which in a sense makes it a spammy page, as it has too many links and a higher-than-average keyword density. The reason I am looking for good examples is that I want to create a basic on-site sitemap that aids navigation but is styled to look OK as well. The solution I have in mind:
mydomain.com/link-example-one.php
mydomain.com/link-example-two.php
mydomain.com/link-example-ten.php
mydomain.com/blog then links to my 300 WP blog posts, broken down into chunks navigated using breadcrumbs. Will Google crawl this OK, or should I stick to the current format listing ALL posts on one page? Thanks
Technical SEO | tdsnet
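On the chunking idea: a common rule of thumb is to keep each sitemap-style page to well under a few hundred links. A minimal sketch of splitting the 300 posts into pages (the slugs and URLs are made up):

```python
def chunk_posts(posts, per_page=50):
    """Split a flat list of post slugs into pages of `per_page` links each."""
    return [posts[i:i + per_page] for i in range(0, len(posts), per_page)]

# 300 blog posts -> 6 chunk pages of 50 links each,
# e.g. rendered at mydomain.com/blog/sitemap-1 ... /blog/sitemap-6
posts = [f"post-{n}" for n in range(1, 301)]
pages = chunk_posts(posts)
```

Each chunk page stays well within crawlable limits, and breadcrumbs (or simple previous/next links) tie the pages together for both users and crawlers.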