Impact of "restricted by robots" crawler error in WT
-
I have been wondering about this for a while now with regard to several of my sites. I am getting a list of pages that I have blocked in the robots.txt file. If I restrict Google from crawling them, how can they consider their existence an error? In one case, I have even removed the URLs from the index.
Do you have any idea of the negative impact associated with these errors? And how do you suggest I remedy the situation?
Thanks for the help
-
Google is just showing you a warning: hey, these pages are excluded, make sure that you want them excluded. They're not passing judgment on whether or not they should be excluded. So, as long as they're excluded on purpose, no worries.
-
Hi Patrick,
That section is simply there to advise you of any URLs that Google feels may be wrongly excluded by your robots.txt.
If the URLs are not wrongly excluded, don't worry about them showing in WMT - the report is purely advisory.
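If you want to double-check which URLs your robots.txt actually blocks before dismissing the warnings, Python's standard-library `urllib.robotparser` can parse the same rules Googlebot reads. This is just a quick sanity-check sketch; the `Disallow` rule and the example URLs are hypothetical, so substitute your own site's rules and the paths WMT reports:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules - paste in your site's actual file.
rules = """
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check the URLs that WMT flags as "restricted by robots".
# If these return False, the exclusion is working as you intended.
print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))   # True
```

If a URL you meant to keep crawlable comes back `False`, the rule is broader than intended and worth fixing; otherwise the WMT entries are just confirming your setup.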
Good luck!