Best way to create robots.txt for my website
-
How can I create a robots.txt file for my website guitarcontrol.com?
It has a login area and guitar lessons.
-
Hi,
First you need to understand your website's needs: decide which parts of your site should not be crawled or indexed by search engine bots. For example, your website provides user login and member areas; if you offer a private dashboard for your users, it should be blocked via robots.txt (or you can use a robots meta tag on a particular page to prevent bots from indexing it). You can learn more about robots.txt here: https://moz.com/learn/seo/robotstxt
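For example (a minimal sketch; `/dashboard/` and `/login/` are hypothetical paths standing in for your private member area):

```text
# robots.txt — placed at the site root, e.g. guitarcontrol.com/robots.txt
# The paths below are illustrative; use your site's actual private URLs.
User-agent: *
Disallow: /dashboard/
Disallow: /login/
```

Alternatively, for any single page you want kept out of the index, a robots meta tag in the page's head does the job: `<meta name="robots" content="noindex">`.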
Hope it helps
-
I see that you're on WordPress.
This CMS creates a "virtual" robots.txt. You can read about this here:
https://codex.wordpress.org/Search_Engine_Optimization_for_WordPress#Robots.txt_Optimization

But on your website there is an error in robots.txt, and you should check your web server log files (access and error) to see why this is happening. You may also need to look at .htaccess, because something may be preventing this text file from being accessed.
There is an alternative way of handling robots.txt in WordPress. All you need to do is create a new, blank robots.txt in the site's root folder and put this in it:

User-agent: *
Disallow:

Then save the file and that's all. Now the bad news: WP can't control indexing and crawling anymore.
Related Questions
-
Setting Up A Website For Redirects
I've got an old defunct domain with a lot of backlinks to individual pages. I'd like to use these backlinks for link juice by redirecting them to individual pages on the new domain (both sites belong to the same company). What is the best way to set this up? I presume I need some kind of hosting & site, even if it's just a default Wordpress install, which I can then use to set up the redirects? Would it be best done using .htaccess file for 301 redirects or some other way?
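One common way to set this up is an .htaccess file on the old domain's hosting (a sketch, assuming Apache with mod_rewrite; newdomain.com and the page paths are placeholders):

```apache
# .htaccess on the old (defunct) domain
RewriteEngine On

# Map individual old pages to their counterparts on the new domain
RewriteRule ^old-page\.html$ https://www.newdomain.com/new-page/ [R=301,L]

# Catch-all: send anything unmatched to the new homepage
RewriteRule ^(.*)$ https://www.newdomain.com/ [R=301,L]
```

Page-to-page 301 redirects preserve more of the backlink value than a blanket redirect of everything to the homepage.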
Technical SEO | | abisti20 -
Best Practice - Linking out to client websites in niche industry
I have a client in a niche building industry who provides 4 different services. She has provided me with a list of 131 past clients that she wants hyperlinked on her site to theirs. The logic is that a lot of these clients are heavy hitters and quite impressive to their peers, so the links will reinforce my client's value. Is there a best practice for determining whether the links should be follow/nofollow? Should I be checking each client site's spam score, PageRank, anything else? Some of these 131 links will be duplicated because my client performed more than one service for them.
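For reference, the two options for any one outbound link look like this (URL and anchor text are placeholders):

```html
<!-- Followed link: passes link equity to the client's site -->
<a href="https://example-client.com/">Example Client</a>

<!-- Nofollowed link: asks search engines not to pass equity -->
<a href="https://example-client.com/" rel="nofollow">Example Client</a>
```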
Technical SEO | | JanetJ1 -
Website not ranking but the blog is!
I am hoping someone might be able to help me; I am doing some work on a website. A new version of the site was recently launched, and since then rankings have plummeted while the new blog pages are ranking better! When the new version of the site went live, the domain changed to the non-www version, plus an incorrect robots.txt file was uploaded, and we have never really been able to fully recover (both of these things were beyond my control!). The robots.txt file was corrected and some of the external links were changed to the non-www, but there is a 301 redirect in place, so changing to the non-www shouldn't have been the reason for the site to drop out completely. Before the launch of the new website, the site was ranking on the front page of Google for a lot of relevant keywords such as outdoor blinds, outdoor blinds Perth, cafe blinds, patio blinds, etc. The quality of the links is pretty bad and I am attempting to remove them before doing a disavow of all the really bad-quality links, but unless we were really unlucky I don't think it's the links right now that are causing the problem. I have run the site through numerous crawl tests, checked the robots.txt, there are no messages in GWMT, and the pages are indexed, but I have a feeling there is something wrong with the site that is stopping it from ranking well. If anyone could give me any insights I would be really grateful. I know the site could be better structured from a keyword/structure perspective, but the site was ranking fine before!
Technical SEO | | Karen_Dauncey0 -
Robots.txt anomaly
Hi, I'm monitoring a site that's had a new design relaunch and a new robots.txt added. Over the period of a week (since launch) Webmaster Tools has shown a steadily increasing number of blocked URLs (now at 14). In the robots.txt file, though, there are only 12 lines with the Disallow command. Could this be occurring because a line in the command can refer to more than one page/URL? They all look like single URLs, for example:

Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes

etc., etc. And is it normal for Webmaster Tools' reporting of robots.txt-blocked URLs to steadily increase in number over time, as opposed to being identified straight away? Thanks in advance for any help/advice/clarity on why this may be happening. Cheers, Dan
Technical SEO | | Dan-Lawrence0 -
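(For what it's worth, a single Disallow rule is a prefix match, so one line can indeed block many URLs, which would explain blocked URLs outnumbering Disallow lines. Example paths below are illustrative:)

```text
User-agent: *
Disallow: /wp-content/plugins
# The one line above blocks every URL beginning with that path, e.g.:
#   /wp-content/plugins/akismet/akismet.php
#   /wp-content/plugins/some-plugin/css/style.css
```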
Best way to create a shareable dynamic infographic - Embed / Iframe / other?
Hi all, After searching around, there doesn't seem to be any clear agreement in the SEO community on the best way to implement a shareable dynamic infographic for other people to put into their site, i.e. one that will pass credit for the links to the original site. Consider the following example for the web application that we are putting the finishing touches on: The underlying site has a number of content pages that we want to rank for. We have created a number of infographics showing data overlaid on top of a Google map. The data continuously changes and there are JavaScript files that have to load in order to achieve the interactivity. There is one infographic per page on our site, and there is a link at the bottom of the infographic that deep-links back to each specific page on our site. What is the ideal way to implement this infographic so that the maximum SEO value is passed back to our site through the links? In our development version we have copied the YouTube approach and implemented this as an iframe, e.g. <iframe height="360" width="640" src="http://www.tbd.com/embed/golf" frameborder="0"></iframe>. The link at the bottom of that then links to http://www.tbd.com/golf. This is the same approach that YouTube uses; however, I'm nervous that the value of the link won't pass from the sites that are using the infographic. Should we do this as an embed object instead, or some other method? Thanks in advance for your help. James
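A common variant of the iframe embed adds a plain HTML link outside the iframe, so the credit link lives in the host page's own markup, where crawlers treat it as an ordinary link rather than part of the framed document (a sketch; the tbd.com URLs are the placeholders from the question):

```html
<!-- Embed snippet that infographic users paste into their own pages -->
<iframe height="360" width="640" src="http://www.tbd.com/embed/golf"
        frameborder="0"></iframe>
<p>Source: <a href="http://www.tbd.com/golf">Golf infographic</a></p>
```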
Technical SEO | | jtriggs0 -
New website, to www or not
I was just wondering if there are any advantages to using the www instead of just the bare domain name for SEO. Can these be elaborated on?
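Whichever host you pick, the main thing is consistency: 301-redirect the other variant to it. A sketch for canonicalizing to www, assuming Apache with mod_rewrite (example.com is a placeholder domain):

```apache
RewriteEngine On
# Redirect bare-domain requests to the www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```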
Technical SEO | | simvegas1 -
Robots.txt
Hello Everyone, The problem I'm having is not knowing where to put the robots.txt file on our server. We have our main domain (company.com) with a robots.txt file in the root of the site, but we also have our blog (company.com/blog) where we're trying to disallow certain directories from being crawled for SEO purposes... Would the blog in the sub-directory still need its own robots.txt, or can I reference the directories I don't want crawled within the blog using the root robots.txt file? Thanks for your insight on this matter.
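For illustration (the blog paths here are hypothetical): crawlers only request robots.txt from the root of a host, so a file at company.com/blog/robots.txt would be ignored, but the root file can reference paths inside /blog:

```text
# company.com/robots.txt — the only robots.txt crawlers fetch for this host
User-agent: *
Disallow: /blog/wp-admin/
Disallow: /blog/wp-content/cache/
```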
Technical SEO | | BailHotline0 -
Does creating a mobile site in html5 create duplicate content?
We are creating a mobile site in HTML5 to serve smartphones only, on a separate domain, m.example.com. From what I have read, Google treats smartphones as desktops due to their advanced web browser capabilities, so there is no need to bother with Googlebot-Mobile, right? Googlebot should index the site once I create a normal sitemap.xml. My concern is that the mobile site pulls the same content as the main site, which is already indexed. Would this not create duplicate content?
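One documented way to avoid duplicate-content issues with a separate mobile host is bidirectional annotation between the two versions (a sketch using m.example.com from the question; /page is a placeholder path):

```html
<!-- On the desktop page, www.example.com/page: -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">

<!-- On the mobile page, m.example.com/page: -->
<link rel="canonical" href="http://www.example.com/page">
```

The rel="alternate"/rel="canonical" pair tells crawlers the two URLs are the same content in two formats rather than duplicates.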
Technical SEO | | sfseo0