How to fix an issue with robots.txt?
-
I am receiving the following error message through webmaster tools
http://www.sourcemarketingdirect.com/: Googlebot can't access your site
Oct 26, 2012
Over the last 24 hours, Googlebot encountered 35 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 100.0%.

The site has dropped out of Google search.
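The "we postponed our crawl" wording reflects Google's documented handling of robots.txt fetch failures: a success is parsed and obeyed, a 404 is treated as allow-all, and a server error (like the 500 this site is returning) makes Googlebot hold off on crawling entirely. A minimal sketch of that decision logic, with an illustrative function name and return strings (not a real API):

```python
def googlebot_crawl_decision(robots_status: int) -> str:
    """Simplified model of how Googlebot reacts to the HTTP status
    returned when fetching robots.txt."""
    if 200 <= robots_status < 300:
        return "parse and obey robots.txt"
    if robots_status == 404:
        return "crawl without restrictions"
    # 5xx errors and timeouts: Googlebot can't tell what's disallowed,
    # so it postpones the crawl rather than risk fetching blocked pages.
    return "postpone crawl"

print(googlebot_crawl_decision(500))  # postpone crawl
```

With a 100% error rate on robots.txt, every fetch lands in the "postpone crawl" branch, which is consistent with the site dropping out of the index.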
-
Hi Stacey
What plugins do you have running? Any caching plugins, such as the W3 Total Cache plugin?
Are you able to access your server's error logs to see if anything shows up there?
-
Thanks for your answer.
I have received this message from Google
**http://www.sourcemarketingdirect.com/** using the Meta tag method (less than a minute ago). Your site's home page returns a status of 500 (Internal Server Error) instead of 200 (OK)
It looks like the permalink structure has changed but I'm not sure how.
-
I've seen several people ask this same question in different forums over the last week. I'm wondering if the major outages from Hurricane Sandy have affected several hosts or DNS providers.
Your robots.txt looks fine to me.
I'm guessing that you will completely recover once Google has a chance to fully crawl the site again.
-
Just a quick check: have you got WordPress set to be visible to search engines in the admin area? If not, it will be set to disallow Googlebot from crawling the site.
It's under Admin > Options > Privacy; select the appropriate option. The default is noindex, nofollow.
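One way to confirm whether that setting is the culprit is to look at the home page's HTML for a robots meta tag, since the WordPress privacy option works by emitting one. A small sketch using only the Python standard library (the class name and sample HTML below are made up for illustration):

```python
from html.parser import HTMLParser


class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", ""))


# Hypothetical page source, as WordPress renders it with
# "visible to search engines" switched off:
html = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'

finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives)  # ['noindex,nofollow']
```

If the live home page shows `noindex` here, flipping the privacy setting back is the fix; if there is no robots meta tag at all, the problem lies elsewhere (e.g. the 500 errors).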
-
Thanks Matt.
There is no robots.txt as far as I can see. Is there a plugin I can use for WordPress?
The site was down for 2 days last month while the original host transferred the site over to me.
Right now a site: search says there are 13 pages indexed.
I'm just concerned that this site has always ranked number 1 for a company-name search and now it's not in the first 10 pages of Google.
-
A few quick checks:

- Have you made sure your robots.txt loads in your browser? Add /robots.txt after your domain, same as a normal page, and check that you can see its contents.
- Has your site been down in this period?
- Did you change the contents of the file just before this issue started?
- Are you sure Googlebot hasn't come back since that date? What does your analytics say?
- Do a site: search for your domain to see if it is in Google.
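Once robots.txt does load, you can also verify offline what it actually allows, using Python's standard-library parser. A sketch with hypothetical robots.txt contents (a typical WordPress-style default; the example.com domain is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for illustration:
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot falls under the "*" group here, so the home page is
# allowed and only /wp-admin/ is blocked.
print(rp.can_fetch("Googlebot", "http://www.example.com/"))           # True
print(rp.can_fetch("Googlebot", "http://www.example.com/wp-admin/"))  # False
```

Note this only tests the rules as written; it won't catch the case in this thread, where the file itself is returned with a 500 status.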