Subdomain Robots.txt
-
I have a subdomain (a blog) whose tag and category pages are being indexed when they should not be, because they create duplicate content. Can I block them using a robots.txt file? Can I, or do I need to, have a separate robots.txt file for my subdomain?
If so, how would I format it? Do I need to specify that it is a subdomain robots.txt file, or will the search engines automatically pick this up?
Thanks!
-
Thanks Wissam. I was thinking this was the way to go, and I appreciate your input.
I do use the Yoast SEO plugin for WordPress on another site, but the blog in question runs on BlogEngine. I will do what you have suggested.
Cheers!
-
If the URL is http://blog.website.com,
then the robots.txt should be accessible at http://blog.website.com/robots.txt.
I would suggest these steps:
- Verify your blog in Google Webmaster Tools.
- Generate a robots.txt file with Google Webmaster Tools.
- Upload it to the root of the subdomain.
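As a sketch of what such a file might contain (the `/tag/` and `/category/` paths below are assumptions; adjust them to match the blog's actual archive URLs):

```
User-agent: *
Disallow: /tag/
Disallow: /category/
```

Note that search engines treat each subdomain as a separate host, so a file uploaded to http://blog.website.com/robots.txt applies only to the blog, not to www.website.com.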
There is another way if you are using WordPress.
You can use the All in One SEO plugin or WordPress SEO by Yoast. Through the settings you can add NOINDEX to all category, tag, author, and other archive pages. It's faster and error-free.
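Whichever route you take, you can sanity-check the finished robots.txt rules before relying on them. A minimal sketch using Python's standard-library `urllib.robotparser`, with assumed `/tag/` and `/category/` rules rather than the asker's actual file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical subdomain robots.txt content: block tag and
# category archives for all user agents.
rules = """User-agent: *
Disallow: /tag/
Disallow: /category/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Tag/category URLs should be blocked; regular posts stay crawlable.
print(parser.can_fetch("*", "http://blog.website.com/tag/seo"))        # False
print(parser.can_fetch("*", "http://blog.website.com/my-first-post"))  # True
```

In production you would point the parser at the live file with `set_url("http://blog.website.com/robots.txt")` and `read()` instead of parsing an inline string.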