"Search engines have been blocked by robots.txt": how do I find and fix it?
-
My client's site royaloakshomesfl.com is coming up in my dashboard as having "Search engines have been blocked by robots.txt," only I have no idea where to find the file or how to fix the problem. Please help! I do have access to Webmaster Tools, and this is a WP site, if that helps.
-
Here's the screen in WordPress to check whether you've blocked search engines through WP: Settings → Reading → "Discourage search engines from indexing this site."
-
Ah, I see. Thanks!!
-
Nothing comes back when I run this:
site:www.royaloakshomesfl.com
So it looks like it's not indexed.
-
When I search their brand name, they pop up in Google, so they must be indexed somehow, no?
-
I would do as Lavellester suggests, and then I would re-submit, as I can see you are not in either index.
-
You can see it here:
http://royaloakshomesfl.com/robots.txt
So it's in the webroot of your hosting. You can change it via FTP: replace its contents with the following to grant full access until you have figured out robot access rights. The key detail is that a `Disallow:` line with nothing after it (no `/`) allows everything, whereas `Disallow: /` blocks the entire site.
User-agent: *
Disallow:
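As a sanity check, you can see how crawlers interpret the two forms with Python's standard-library robots.txt parser. This is a minimal sketch (example.com stands in for the real domain): a bare `Disallow:` permits everything, while `Disallow: /` blocks everything.

```python
from urllib.robotparser import RobotFileParser

def can_crawl(robots_txt: str, url: str) -> bool:
    """Parse a robots.txt body and ask whether a generic bot may fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

blocking = "User-agent: *\nDisallow: /"   # Disallow with "/" blocks the whole site
permissive = "User-agent: *\nDisallow:"   # empty Disallow value allows everything

print(can_crawl(blocking, "http://example.com/"))    # False
print(can_crawl(permissive, "http://example.com/"))  # True
```

After uploading the fixed file, re-fetch it in Webmaster Tools so Google picks up the change sooner.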
Related Questions
-
Using one robots.txt for two websites
I have two websites hosted in the same CMS. Rather than having two separate robots.txt files (one for each domain), my web agency has created one which lists the sitemaps for both websites, like this:
User-agent: *
Disallow:
Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
Is this OK? I thought you needed one robots.txt per website, each providing the URL for its sitemap. Will having both sitemap URLs listed in one robots.txt confuse the search engines?
Technical SEO | ciehmoz
-
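For what it's worth, the robots.txt protocol allows any number of Sitemap: directives, and they sit outside the User-agent groups; crawlers simply collect them all. A minimal sketch (reusing the siteA/siteB URLs from the question) of how a parser picks them up:

```python
robots_txt = """\
User-agent: *
Disallow:
Sitemap: https://www.siteA.org/sitemap
Sitemap: https://www.siteB.com/sitemap
"""

def extract_sitemaps(body: str) -> list[str]:
    """Collect every Sitemap: directive, case-insensitively, as crawlers do."""
    return [
        line.split(":", 1)[1].strip()
        for line in body.splitlines()
        if line.lower().startswith("sitemap:")
    ]

print(extract_sitemaps(robots_txt))
# ['https://www.siteA.org/sitemap', 'https://www.siteB.com/sitemap']
```

Both URLs are collected independently, so listing two sitemaps in one file does not confuse parsers.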
Page disappears from Google search results
Hi, I recently encountered a very strange problem.
One of the pages I published on my website ranked very well (top 5) for a couple of days; then the page completely vanished. No matter how directly I search for it, it does not appear in the results. I checked GSC and everything seems normal, but in Google Analytics there has been no data for the page since it disappeared, and it does not show up in the 'active pages' section no matter how many different computers I keep it open on. I have checked up to page 9 of the results and used a couple of keyword tools, and it appears nowhere! It didn't have any backlinks, but it was unique and high quality. I have checked that the page still exists and is still readable. Has this happened to anyone before? Any thoughts would be gratefully received.
Technical SEO | JoelssonMedia
-
No related searches
Does anyone have any insight as to why a site wouldn't show any results when using this Google search operator:
related:site.com
There are no results appearing. We recently moved from .com to .org with 301 redirects in place and the change-of-address tool submitted. There are no penalties or warnings in Search Console, but we have seen a significant decrease in search traffic. Thanks in advance.
Technical SEO | SoulSurfer8
-
Can I block HTTPS URLs using the Host directive in robots.txt?
Hello Moz Community, I have recently found that Google's bots have started crawling the HTTPS URLs of my website, which is increasing the number of duplicate pages on our site. Instead of creating a separate robots.txt file for the HTTPS version of my website, can I use the Host directive in robots.txt to suggest to Google's bots which is the original version of the website?
Host: http://www.example.com
I was wondering if this method will work and suggest to Google's bots that the HTTPS URLs are a mirror of this website. Thanks for all of the great responses! Regards, Ramendra
Technical SEO | TJC.co.uk
-
Help finding a link
Hi, so I've done a crawl of the site using Screaming Frog. There are a few old category and sub-category pages which don't exist any more, but somehow the crawler is finding them. An example is below:
http://www.ebuyer.com/store/Home-Appliances/cat/Health-&-Beauty/subcat/Male-Grooming
Just wondering if anybody has any ideas about how I could find these URLs and remove them from the site. Any ideas would be really appreciated. Thanks, Andy
Technical SEO | Andy-Halliday
-
Can someone PLEASE help me find a solution to use a custom search engine for iPhones?? Thanks in advance!
Hey mozzers, I'm in quite the pickle today and would really appreciate some help! I need a way to have my members set the default custom search engine on their iPhones and Androids to our site's search engine. Google Chrome allows this on desktops but not on iPhones. Thanks for your time/help, Tyler Abernethy
Technical SEO | TylerAbernethy
-
WordPress robots.txt sitemap submission?
Alright, my question comes directly from this article by SEOmoz: http://www.seomoz.org/learn-seo/r... Yes, I have submitted the sitemap to Google's and Bing's webmaster tools, and I want to add the location of our site's sitemaps. Does that mean I erase everything in robots.txt right now and replace it with the following?
User-agent: *
Disallow:
Sitemap: http://www.example.com/none-standard-location/sitemap.xml
I ask because WordPress comes with some default disallows like wp-admin, trackback, and plugins. I have also read this, but was wondering if it is the correct way to add a sitemap on a WordPress robots.txt: http://www.seomoz.org/q/removing-robots-txt-on-wordpress-site-problem
I am using Multisite with the Yoast plugin, so I have more than one sitemap.xml to submit:
Sitemap: http://www.example.com/sitemap_index.xml
Sitemap: http://www.example.com/sub/sitemap_index.xml
Do I erase everything in robots.txt and replace it with what SEOmoz recommended? Hmm, that doesn't sound right.
Technical SEO | joony2008
-
How do I fix duplicate content on the home page?
This is probably SEO 101, but I'm unsure what to do here... Last week my weekly crawl diagnostics were off the charts because http:// was not resolving to http://www. I fixed that, but now it's saying I have duplicate content on:
http://www.......com
http://www.......com/index.php
How do I fix this? Thanks in advance!
Technical SEO | jgower
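A common remedy is to 301-redirect the /index.php URL to the root so only one URL serves the homepage. Here is a hedged sketch, assuming an Apache host where robots.txt-style .htaccess rules apply and mod_rewrite is enabled; test on a staging copy first:

```apache
# Sketch: 301-redirect direct requests for /index.php to the site root,
# so the homepage is reachable at exactly one URL.
RewriteEngine On
# Match only when the browser literally asked for /index.php,
# not when WordPress routes other URLs through index.php internally.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php[?\ ]
RewriteRule ^index\.php$ / [R=301,L]
```

A rel="canonical" tag on the homepage pointing at the root URL is a belt-and-braces complement to the redirect.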