How should a blog subdomain be handled in the main sitemap and robots.txt file?
-
Hi,
I have some confusion about how our blog subdomain is handled in our sitemap. We have our main website, example.com, and our blog, blog.example.com.
-
Should we list the blog subdomain URL in our main sitemap? In other words, is listing a subdomain allowed in the root sitemap?
-
What does the final structure look like in terms of the sitemap and robots file? Specifically:
Would I include a link to our blog subdomain (blog.example.com)?
Would I include links to BOTH our main sitemap and blog sitemap?
Would I include a link to our main website URL (even though it's not a subdomain)?
Does a subdomain need its own robots.txt file?
I'm a technical SEO and understand much of the mechanics of on-page SEO, but for some reason I've never found an answer to this specific question, and I'm wondering how the pros do it. I appreciate your help with this.
-
It's my understanding that you treat each subdomain as a unique site. Each subdomain should have its own XML sitemap and its own robots.txt file, and each should be submitted separately to Google Webmaster Tools (now Google Search Console). To answer your inter-linking question: I would avoid including the other domain's URLs in those files. Each XML sitemap and robots.txt file should only list URLs for its own domain or subdomain. That said, I would inter-link the sites on the pages themselves, perhaps in the HTML sitemap, the navigation, the footer, or naturally throughout your body content where appropriate.
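To make the structure concrete, here is a minimal sketch of the two robots.txt files, assuming each sitemap lives at the root of its own host (the exact paths are illustrative):

```text
# https://www.example.com/robots.txt
User-agent: *
Sitemap: https://www.example.com/sitemap.xml

# https://blog.example.com/robots.txt
User-agent: *
Sitemap: https://blog.example.com/sitemap.xml
```

Each file references only the sitemap on its own host; neither file points at the other domain's sitemap or URLs.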
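The "only include URLs for that particular domain" rule is easy to check automatically. The sketch below is a hypothetical helper (not from the original answer) that parses a sitemap with Python's standard library and flags any URL whose host doesn't match the host the sitemap belongs to:

```python
from urllib.parse import urlparse
from xml.etree import ElementTree

# Namespace used by the sitemaps.org protocol.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def foreign_urls(sitemap_xml: str, expected_host: str) -> list[str]:
    """Return any <loc> URLs whose host differs from expected_host."""
    root = ElementTree.fromstring(sitemap_xml)
    locs = [el.text.strip() for el in root.iter(f"{SITEMAP_NS}loc") if el.text]
    return [url for url in locs if urlparse(url).netloc != expected_host]

# Example: a blog sitemap that accidentally includes a main-site URL.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://blog.example.com/post-1</loc></url>
  <url><loc>https://www.example.com/about</loc></url>
</urlset>"""

print(foreign_urls(sitemap, "blog.example.com"))
# → ['https://www.example.com/about']
```

Running this against each sitemap before submission catches cross-domain URLs that should live on the other host's sitemap instead.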