Correct use of robots.txt
-
I'm in the process of building a website and am experimenting with some new pages. I don't want search engines to begin crawling the site yet. I would like to use robots.txt to block the pages that I don't want them to crawl. If I do this, can I remove it later and get them to crawl those pages?
-
Lewis,
Thank you for the clarification!
-
Hi Eric
The guidance above means that when Google looks to crawl your site, it won't; it's not a message to Google telling it never to come back.
Once everything is sorted, remove whichever approach you took to block the search engines and submit a sitemap to Google via Webmaster Tools. Your site should be crawled in no time after that.
Hope this helps.
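In case it's useful: a sitemap is just an XML file in the standard sitemaps.org format. A minimal sketch (example.com and the date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/new-page</loc>
  </url>
</urlset>
```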
-
Damian,
Thanks for your answer, that helps. If I add either one of the above items to my web page and then remove it at a later date, will the search engines crawl and rank my site (at some time after they are removed)? In other words, and I know this sounds stupid, but does a search engine see a robots.txt file and never visit the site again?
-
Hey Eric,
If you want to create and work on pages but you don't want them indexed, you can add the following meta tag to the page in the <head> section (the pages will still be crawled):
<meta name="robots" content="noindex">
If you want NONE of your pages to be crawled (i.e. the whole website), you can add the following to your robots.txt file:
User-agent: *
Disallow: /
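If you want to sanity-check robots.txt rules like these before (or after) deploying them, Python's standard urllib.robotparser can evaluate them locally. A minimal sketch; the URLs below are placeholders:

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_lines, url, agent="Googlebot"):
    """Return True if the given robots.txt lines let `agent` fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch(agent, url)

# The block-all file above: every path is disallowed for every user agent.
blocked = ["User-agent: *", "Disallow: /"]
# After the block is removed (an empty Disallow means "allow everything"),
# crawlers are welcome again.
opened = ["User-agent: *", "Disallow:"]

print(allowed(blocked, "https://www.example.com/any-page"))  # False
print(allowed(opened, "https://www.example.com/any-page"))   # True
```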
Related Questions
-
Using WebP Image Alongside Existing Images
Is it worthwhile to add WebP images alongside existing images? WebP images can be three times smaller than PNGs and 25% smaller than JPGs, according to a plugin option I am looking at. The alternative WebP images are served via CDN. Does anyone have any experience with this, and is it worth doing?
Web Design | GrouchyKids
-
Disallow: /sr/ and Disallow: /si/ - robots.txt
Hello Mozzers - I have come across the two directives above in the robots.txt file of a website. The web dev isn't sure what they mean, although he implemented the robots.txt file - I think it's just legacy stuff that nobody has analysed for years. I vaguely recall 'sr' meaning search request, but can't remember. If any of you know what these directives do, then please let me know.
Web Design | McTaggart
-
We added hundreds of pages to our website & restructured the layout to include 3 additional locations within the sub-pages, same brand/domain name. How long could Google take to crawl/index the new pages and rank the keywords used within those pages?
We added hundreds of pages to our website and restructured the layout to include 3 additional locations within the sub-pages, same brand/domain name. The 3 locations' old domains were redirected to their sites within our main brand domain. How long could Google take to crawl/index the new pages and rank the keywords used within those pages? And hopefully increase our domain authority? We didn't want our brand spread out over multiple websites/domains on the internet. This also allowed for more content to be written on pages for each of our locations' services as well.
Web Design | BurgSimpson
-
Duplicate Title Issues using # anchor tags
Our homepage navigation uses anchor tags (?TabNumb=1#, ?TabNumb=1#, etc.) rather than directly linking to different pages, to decrease load time (and simplify the build process, I would imagine). These anchor links are showing up as duplicate titles in Moz. I am pretty sure that if I were to use noindex or rel tags, that could have a negative effect on my search results. Any way to tackle this outside of a complete redesign of the structure? http://www.dedoose.com/about-us/?TabNum=2# as an example.
Web Design | sbnjl
-
The impact of using directories without target keyword on our Rankings
Hello all, I have a question regarding a website I am working on. I've read a lot of Q&As but couldn't really find the best answer. For one of our new websites we are thinking about the structure of the website and the corresponding URL structure. Basically we have a main product (and a few main keywords) which should drive the most traffic to our website, and for which we want to optimize our homepage. Besides those main keywords, we have an enormous base of long-tail keywords from which we would like to generate traffic. This means we want to create a lot of specific, optimized pages. My main question is the following. We are thinking of two options: Option 1: www.example.com/example-keyword-one. Option 2: www.example.com/directory/example-keyword-one. With option 1 we will link directly from our homepage to the most important pages (which represent our most important keywords). All the pages with long-tail content will be linked from another section of our website, one click away from our homepage (specifically a /solutions page which is linked from the footer). All the pages with long-tail content will have the structure www.example.com/example-keyword-one, so the URLs will not contain the directory /solutions. With option 2 we will use more subdirectories in our URLs. Specifically, for all the long-tail content we would use URLs like this: www.example.com/solutions/example-keyword-one. The directories we want to use wouldn't really have added value in terms of SEO, since they don't represent important keywords. So what is the best way to go? Option 1: straightforward, short URLs which don't really represent the linking structure of our website, but only contain important keywords. Or option 2: more directories in our URLs, which represent the linking structure of our website but contain directories which don't represent important keywords. Would the keyword 'solutions' in the directory (which doesn't really relate to the content on the page) have a negative impact on our rankings for that URL?
Web Design | NielsB
-
Does anyone know how much data a WordPress site can store? I want to put all my movies on it and use it as a personal global external hard drive! Thanks!!
So basically, I have about 500 GB of movies on my computer and I don't want to buy an external hard drive - I don't want to spend the money. I'd like a website I could access anytime and anywhere, without having to carry my external drive with me everywhere I go. Thanks in advance for any help/references.
Web Design | TylerAbernethy
-
Using H1 Headings - More than 1?
I've known about avoiding the use of more than one H1 heading tag; however, with HTML5 is this going to change? At least that's how I understand it. According to the HTML5 spec, each 'section' can have an H1 heading, which at least theoretically means certain web pages that have multiple "sectioning elements" can have more than one H1 heading... true? False? What I'm looking for here is some insight into the ramifications HTML5 will have on the use of H1 tags. I would also like to know how search engines currently handle this, and whether they are anticipated to change as the HTML5 outline algorithm becomes widely supported. Thanks in advance, Kelly
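To illustrate the pattern the question describes (a sketch of the HTML5 sectioning model with hypothetical headings, not a recommendation), each sectioning element carries its own h1:

```html
<body>
  <article>
    <h1>Article title</h1>
    <section>
      <h1>First section heading</h1>
      <p>Section content.</p>
    </section>
    <section>
      <h1>Second section heading</h1>
      <p>Section content.</p>
    </section>
  </article>
</body>
```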
Web Design | KellysTutorials
-
The use of foreign characters and capital letters in URLs?
Hello all, We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó. We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure. Here is an example of our URLs. EX: http://www.twago.es/expert/Diseño-Web/Diseño-Web However, when I simply copy and paste a URL that contains a special character, it is automatically translated and encoded. EX: http://www.twago.es/expert/Aplicaci%C3%B3n-iPhone/Aplicaci%C3%B3n-iPhone (when written out longhand it appears: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone). My first question is: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even Spanish/German words are simply written using the standard English Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters? When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most other competitors). Does the anchor text have to match exactly? I know most web browsers can understand the special characters, especially when returning search results to users that either type the special characters within their search query (or not). But if we are doing the right thing, why does everyone else do it differently? My second question is the same, but focusing on the use of capital letters in our URL structure. NOTE: When we do a broken-link check with some link tools (such as Xenu), the URLs that contain the special Spanish characters are marked as "broken". Is this a related issue? Any help anyone could give us would be greatly appreciated! Thanks, David from twago
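For what it's worth, the automatic encoding the question describes is standard percent-encoding of the URL's UTF-8 bytes, which Python's standard library reproduces. A small sketch using the path from the question:

```python
from urllib.parse import quote, unquote

# Browsers percent-encode non-ASCII path characters as their UTF-8 bytes.
path = "/expert/Diseño-Web/Diseño-Web"
encoded = quote(path)  # "/" is kept unescaped by default

print(encoded)           # /expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
print(unquote(encoded))  # round-trips back to the original path
```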
Web Design | wdziedzic