Robots.txt on pages with a 301 redirect
-
We currently have a series of help pages that we would like to disallow in our robots.txt.
The thing is that these help pages are located on our old website, which now has a 301 redirect to our current site.
What is the proper way to go about this?
1- Add the pages we want to disallow to the robots.txt of the new website?
2- Break the redirect momentarily and add the pages to the robots.txt of the old one?
Thanks
-
In that case, you'd need to add the robots meta tag at the page level, before the closing </head> tag:
<meta name="robots" content="noindex">
or
<meta name="robots" content="noindex, nofollow">
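Alternatively, if you'd rather keep a robots.txt live on the old domain without dropping the redirect for everything else, the redirect rule can usually be written to exempt that one file. A minimal sketch, assuming the old site runs on Apache with mod_rewrite (adjust for your actual server):

RewriteEngine On
# Keep robots.txt reachable on the old domain so crawlers can read its directives
RewriteCond %{REQUEST_URI} !^/robots\.txt$
# 301 everything else to the matching URL on the new domain
RewriteRule ^(.*)$ https://www.memoq.com/$1 [R=301,L]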
-
Hey, for some time we will keep the files on the old domain. Should we break the redirect and add the disallows to the robots.txt of the old site?
-
So, the problem is that the robots.txt file can't be accessed because of the 301 redirect to the new domain?
Do you plan to keep the help files on the old domain, or will they be removed completely?
-
Hi Laura,
Thanks for your reply. I don't want to disallow the URLs these pages are being redirected to. Actually, these URLs are on the old version of the site but can still be accessed. So, to put it simply, this is my case:
1- This was our previous website: www.kilgray.com (it now has a 301 redirect)
2- This is our new website: www.memoq.com
3- I would like to disallow the following links on the old website that are still visible (haven't been redirected):
http://kilgray.com/memoq/2015-100/help-en/index.html
http://kilgray.com/memoq/2014/help-en/
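If I'm thinking about this correctly, the robots.txt on kilgray.com would then need something like the following (assuming both help directories should be blocked for all crawlers), though I'm not sure how crawlers could ever fetch that file while the domain-wide redirect is in place:

User-agent: *
Disallow: /memoq/2015-100/help-en/
Disallow: /memoq/2014/help-en/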
-
Do you want to disallow the URLs that these pages are being redirected to? If not, there's no need to add anything to the robots.txt file.
If you do want to disallow the URLs that these pages are being redirected to, use relative URLs in your robots.txt file. For example, let's say olddomain.com/old-help-page/ is being redirected to newdomain.com/new-help-page/. If that's the case, add the following to your robots.txt file.
Disallow: /new-help-page/
There's no need to disallow the specific URLs that are being redirected to something else. Are you trying to get them removed from Google's index or something? If so, Google will update their index eventually based on your 301 redirects.
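One detail worth adding: a Disallow line only takes effect inside a user-agent group, so the complete file would look something like this (assuming the rule should apply to all crawlers):

User-agent: *
Disallow: /new-help-page/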
Related Questions
-
I have two robots.txt pages for www and non-www version. Will that be a problem?
There are two robots.txt files: one for the www version and another for the non-www version, though I have moved to the non-www version.
Technical SEO | ramb
-
Robots.txt Tester - syntax not understood
I've looked in the robots.txt Tester and I can see 3 warnings: there is a 'syntax not understood' warning for each of these XML sitemaps:
https://www.pkeducation.co.uk/post-sitemap.xml
https://www.pkeducation.co.uk/sitemap_index.xml
How do I fix or reformat these to remove the warnings? Many thanks in advance.
Jim
Technical SEO | JamesHancocks1
-
Is sitemap required on my robots.txt?
Hi, I know that linking your sitemap from your robots.txt file is good practice. OK, but... may I just send my sitemap to Search Console and forget about adding it to my robots.txt? That's my situation: one multilang platform, which means two sets of pages, one for each language. But my CMS (Magento) only allows me to have one robots.txt file. So, again: may I have a robots.txt file with no sitemap AND not suffer any potential SEO loss? Thanks in advance, Juan Vicente Mañanas Abad
Technical SEO | Webicultors
-
301 Redirects in subfolders
Hi, we're converting our site into a static site, but I would like to transfer the Google juice. Most of the links and database content live in subfolders, though. Could I simply do 301 redirects on the subfolders and retain the value, or does it have to be on the full domain?
Technical SEO | Therealmattyd
-
Best Practice on 301 Redirect - Images
We have two sites that sell the same products. We have decided to retire one of the sites as we'd like to focus on one property. I know best practice is to redirect apples to apples, which in our case is easily done since the sites sold the same thing. www.SiteABC.com/ProductA can be redirected to www.SiteXYZ.com/ProductA. My question is how far does that thinking go regarding images? Each product has a main product page, of course, and then up to 6 images in some cases. Is it necessary to redirect www.SiteABC.com/ProductA-Image1.jpg to www.SiteXYZ.com/ProductA-Image1.jpg? Or can they all be redirected to just the product page?
Technical SEO | Natitude
-
Block Domain in robots.txt
Hi. We had some URLs from a www1 subdomain that were indexed in Google. We have now disabled the URLs (returning a 404; for other reasons we cannot do a redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot install Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be and whether it's normal? I can send you more domain info by personal message if you want to have a look at it.
Technical SEO | zeepartner
-
Allow or Disallow First in Robots.txt
If I want to override a Disallow directive in robots.txt with an Allow command, do I have the Allow command before or after the Disallow command? Example:
Allow: /models/ford/*/*/page*
Disallow: /models/*/*/*/page*
Technical SEO | irvingw
Technical SEO | | irvingw0