Is having no robots.txt file the same as having one and allowing all agents?
-
The site I am working on currently has no robots.txt file. However, I have just uploaded a sitemap and would like to point the robots.txt file to it.
Once I upload the robots.txt file, if I allow access to all agents, is this the same as when the site had no robots.txt file at all? Do I need to specify crawler access, or can the robots.txt file just contain the link to the sitemap?
-
In my opinion, a sitemap is more important than robots.txt, as it helps a search engine bot crawl a website effectively. Robots.txt is generally used to request (with Allow: or Disallow: directives) that a crawler not crawl and index certain sections of your website containing sensitive data. It is totally up to the crawler to respect that request by not crawling and indexing the sensitive part. However, it is a general practice among webmasters worldwide to have a robots.txt file for each of their sites. A common robots.txt granting access to the entire website looks like this:
User-agent: *
Disallow:
Sitemap: http://www.yoursite.com/sitemap.xml
So if you want some sections (folders, directories) of your site not to be crawled by a bot, then you can use robots.txt.
Yes, logically it is the same: having a robots.txt file granting all access is equivalent to not having one at all; it is just the difference between stating something explicitly and having it "by default". Having a robots.txt file doesn't guarantee a rank boost in the SERPs. Hope it helps.
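As an illustrative sketch (not part of the original answer), Python's standard urllib.robotparser shows the equivalence in practice: an empty Disallow permits every URL, exactly as a missing robots.txt would, while a Disallow with a path blocks that section for compliant crawlers. The yoursite.com URLs are the placeholder examples from above.

```python
from urllib.robotparser import RobotFileParser

# A permissive robots.txt: an empty Disallow means "nothing is off limits".
permissive = [
    "User-agent: *",
    "Disallow:",
    "Sitemap: http://www.yoursite.com/sitemap.xml",
]

parser = RobotFileParser()
parser.parse(permissive)

# Every path is crawlable, just as if no robots.txt existed at all.
print(parser.can_fetch("*", "http://www.yoursite.com/any/page.html"))  # True

# By contrast, a Disallow with a path blocks that section
# for crawlers that respect robots.txt.
restrictive = ["User-agent: *", "Disallow: /private/"]
parser = RobotFileParser()
parser.parse(restrictive)
print(parser.can_fetch("*", "http://www.yoursite.com/private/data.html"))  # False
```

Remember this only expresses a request: the parser models what a polite crawler does, not what any crawler is forced to do.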
Cheers
Related Questions
-
Set Up htaccess File
Looking for expert help (willing to pay) to set up a proper htaccess file. I'm having an issue, as the site has a subdomain at secure.domain.com with php extensions there. I tried a couple of recommended code sets, but it seems to be a mess. The site is working properly, but this may be causing rankings issues. It's coded in pure HTML and PHP, no Wordpress stuff.
Technical SEO | execubob
The "delete www" rule causes the secure side to fail; the "delete html extensions" rule causes the php extensions to fail.
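A common pattern for this kind of setup is to scope each rule with RewriteCond so it cannot touch the subdomain. This is a hedged sketch only, not a verified fix for this site: domain.com stands in for the real domain, and it assumes Apache with mod_rewrite enabled.

```apache
RewriteEngine On

# Force www on the bare domain only. The HTTP_HOST condition never
# matches secure.domain.com, so the secure side is left alone.
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

# Hide .php extensions only when a matching .php file actually exists,
# so real PHP URLs keep resolving instead of breaking.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^([^.]+)$ $1.php [L]
```

The key design choice is that neither rule is unconditional: each one tests the host or the filesystem first, which is what keeps the two rules from interfering with the subdomain or with each other.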
Disavow file and backlinks listed in webmaster tools
Hi guys, I've sent a disavow file via webmaster tools. After that, should the backlinks from domains listed in that file disappear from the list of links to my website in webmaster tools? Or does webmaster tools show all the links, whether I've sent disavow file or not?
Technical SEO | superseopl
Hashtag in url seems to remove the google plus one
My site has a catalogue page (catalog in US) with #anchors so that customers can get straight to them. I even link from other pages to the #anchor on the catalogue page, so I have, for example: www.example.co.uk/catalogue.htm, www.example.co.uk/catalogue.htm#blueitems, and www.example.co.uk/catalogue.htm#reditems. I understand Google doesn't index after the #; here is the post I found: http://moz.com/community/q/hashtag-anchor-text-within-content#reply_91192. So I shouldn't have an SEO problem. BUT, if I navigate to www.example.co.uk/catalogue.htm and plus-one the page, it will show the plus-one; then if I navigate to www.example.co.uk/catalogue.htm#blueitems, the plus-one is gone. The same happens in reverse: if I plus-one www.example.co.uk/catalogue.htm#reditems, that plus-one doesn't show on www.example.co.uk/catalogue.htm. I added rel=canonical and that fixed the plus-one problem: now if you plus-one /catalogue.htm#reditems, it still shows on catalogue.htm. This seems a bit extreme; did I do the right thing?
Technical SEO | Peter2468
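For reference, the fix the poster describes usually amounts to a single canonical link element; this is a minimal sketch using the example URLs from the question, not the poster's actual markup.

```html
<!-- In the <head> of catalogue.htm. The fragment (#blueitems etc.) is
     resolved client-side and never sent to the server, so the canonical
     consolidates every anchored view onto one URL, which is why the
     +1 count is shared across them. -->
<link rel="canonical" href="http://www.example.co.uk/catalogue.htm">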
301 redirect of one site version to new domain
Hello all. Today I have domain.com, which has 10 language versions structured as folders: domain.com/ru/, domain.com/pl/, etc. Soon I plan a redesign, a new CMS, and registering 9 new ccTLDs. My question is: can I 301 redirect domain.com/ru/ to domain.ru without some bad effect on domain.com? I mean, the main domain.com is not going to be affected by a permanent redirect of one URL to a completely new domain, right?
Technical SEO | Gregos
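A folder-to-ccTLD move like the one described is typically a single scoped rewrite, which by construction touches only URLs under that folder. A hedged sketch for an Apache .htaccess on domain.com (assuming mod_rewrite; domain.ru is the hypothetical new ccTLD):

```apache
RewriteEngine On

# Permanently redirect only the /ru/ folder, preserving the rest of
# the path. No other URL on domain.com matches this pattern, so the
# main domain's pages are not redirected anywhere.
RewriteRule ^ru/(.*)$ http://domain.ru/$1 [R=301,L]
```

Because the pattern is anchored to the ru/ prefix, domain.com/pl/ and every other language folder continue to resolve exactly as before.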
Are you allowed to point different urls to same page
Hi, I have some URLs that I am either going to put on hold or am thinking about pointing to one of my sites. The thing is, I am looking at redesigning the pages, but not until next year, so I thought I would point some of the URLs to different pages of a site that I am happy with, but I'm not sure whether I am allowed to do this. So, for example, if I have a site on cars, and one of the URLs is www.rovercars.co.uk, I was thinking about pointing it to the page that is about Rover cars. Can anyone let me know whether this is allowed, please?
Technical SEO | ClaireH-184886
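If spare domains do end up pointing at an existing site, the conventional approach is a 301 redirect to the relevant page rather than serving the same content on two hostnames. A hypothetical sketch for an Apache .htaccess, assuming rovercars.co.uk resolves to the same server and that /rover-cars.htm is an invented target page:

```apache
RewriteEngine On

# Send every request arriving on the spare domain (www or not)
# to the matching page on the main site with a permanent redirect.
RewriteCond %{HTTP_HOST} ^(www\.)?rovercars\.co\.uk$ [NC]
RewriteRule ^ http://www.example.co.uk/rover-cars.htm [R=301,L]
```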
Wordpress Robots.txt Sitemap submission?
Alright, my question comes directly from this article by SEOmoz: http://www.seomoz.org/learn-seo/r... Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and I want to add the location of our site's sitemaps. Does that mean I erase everything in the robots.txt right now and replace it with this?

User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml

I ask because Wordpress comes with some default disallows like wp-admin, trackback, and plugins. I have also read this, but was wondering if it is the correct way to add a sitemap to a Wordpress robots.txt: http://www.seomoz.org/q/removing-robots-txt-on-wordpress-site-problem. I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit. Do I erase everything in robots.txt and replace it with what SEOmoz recommended? Something like this?

User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml

Technical SEO | joony2008
Does no preferred domain allow interlinking spammers to double their output?
Doing research on a new client's links, I found 151 linking root domains, all from the same interlinking scam. Here are duplicated domains for one site (not my client) in this scam: DrainageHouston.com, patiosandponds.net, patioshouston.net, houstonlandscape.org, drainagehouston.com, houstonoutdoorlighting.com. I have attached an image showing these in OSE, with each having a www and a non-www version linking to the site. Note: when I found this, it was by checking other domains they owned that they did not know had sites on them. They were literally all cloned with other domain names. We took the three additional sites and did 301 redirects from those to the main site. Since there were only three additional sites, with only about 30 pages each, I do not see the redirect as a problem. So the question is: by not setting a preferred domain and not 301-redirecting non-www to www in .htaccess, is he able to double his dubious enterprise?
Technical SEO | RobertFisher
Problem with indexed files before domain was purchased
Hello everybody, We bought this domain a few months back and we're trying to figure out how to get rid of indexed pages that (I assume) existed before we bought this domain; the domain was registered in 2001 and had a few owners. I attached 3 files from my webmaster tools. Can anyone tell me how to get rid of those "pages" and, more importantly, aren't these kinds of "pages" the result of some kind of "sabotage"? Looking forward to hearing your thoughts on this. Thank you, Alex Picture-5.png Picture-6.png Picture-7.png
Technical SEO | pwpaneuro