Google (GWT) says my homepage and posts are blocked by Robots.txt
-
Hi guys, I have a very annoying issue.
My WordPress blog at www.Trovatten.com has some indexation problems.
Google Webmaster Tools data:
GWT says the following: "Sitemap contains urls which are blocked by robots.txt." and shows me my homepage and my blog posts. This is my robots.txt: http://www.trovatten.com/robots.txt

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Do you have any idea why it says the URLs are being blocked by robots.txt when the file looks how it should?
I've read in a couple of places that this can be caused by a WordPress plugin creating a virtual robots.txt, but I can't validate it.
1. I have set WP privacy settings to allow crawling of my site.
2. I have deactivated all WP plugins and I still get the same GWT warnings.
Looking forward to hearing if you have an idea that might work!
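For anyone hitting the same warning: one way to sanity-check what a crawler would do with these rules is Python's standard-library `urllib.robotparser`. This is a minimal sketch using the rules quoted above; the example URLs are just illustrations.

```python
from urllib.robotparser import RobotFileParser

# The rules from the robots.txt quoted above; RobotFileParser can also
# fetch a live file via set_url() + read() instead.
rules = [
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Disallow: /wp-includes/",
]

parser = RobotFileParser()
parser.parse(rules)

# Under these rules the homepage and posts should be crawlable,
# while the admin area is not.
print(parser.can_fetch("Googlebot", "http://www.trovatten.com/"))                      # True
print(parser.can_fetch("Googlebot", "http://www.trovatten.com/wp-admin/options.php"))  # False
```

If this prints True for the homepage but GWT still reports it as blocked, the file Google fetched differs from the one you think is live - which is exactly what a plugin-generated virtual robots.txt would cause.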
-
Do you know which plugin (or combination of plugins) was the trouble?
I use WordPress a lot, and this is very interesting.
-
You are absolutely right.
The problem was that a plugin I installed messed with my robots.txt.
-
I am going to disagree with the above.
The tag <meta name="robots" content="noodp, noydir" /> has nothing to do with denying robots any access.
It is used to prevent the engines from displaying meta descriptions from DMOZ and the Yahoo directory. Without this line, the search engines might choose to use those descriptions, rather than the descriptions you have as meta descriptions.
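To illustrate the distinction: only directives like noindex/nofollow restrict indexing; noodp and noydir merely control snippet sources. A small stdlib sketch that extracts the meta robots directives from a page's HTML (the HTML string here is a made-up example):

```python
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    """Collect the directives from any <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            # Directives are comma-separated, e.g. "noodp, noydir"
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",") if d.strip()]

html = '<head><meta name="robots" content="noodp, noydir" /></head>'
p = MetaRobotsParser()
p.feed(html)

print(p.directives)               # ['noodp', 'noydir']
print("noindex" in p.directives)  # False: nothing here blocks indexing
```

Since "noindex" is absent, this tag cannot be what makes GWT report the pages as blocked.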
-
Hey Frederick,
Here's your current meta robots tag on your home page (in the <head> section):
<meta name="robots" content="noodp, noydir" />
It should be something like this:
<meta name="robots" content="INDEX,FOLLOW" />
I don't think it's the robots.txt that's the issue, but rather the meta-robots in the head of the site.
Hope this helps!
Thanks,
Anthony
[moderator's note: this answer was actually not the correct answer for this question, please see responses below]
-
I have tweaked around with an XML sitemap generator and I think it works. I'll give an update in a couple of hours!
Thanks!
-
Thanks for your comment Stubby, and you are probably right.
But the problem is the disallowing, not the sitemaps. Based on my robots.txt, everything should be crawlable.
What I'm worried about is that the virtual robots.txt that WordPress generates is causing the trouble.
-
Is Yoast generating another sitemap for you?
You have a sitemap from a different plugin, but Yoast can also generate sitemaps, so perhaps you have two - and one of them lists the items that you are disallowing.
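If two plugins each emit a sitemap, you can cross-check every URL a sitemap lists against the robots.txt rules; any hit explains the GWT warning. A sketch using only the standard library - the sitemap XML body and URLs below are made-up examples, and in practice you would fetch the real files with urllib.request:

```python
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap body; a real one would be fetched from the site.
sitemap_xml = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.trovatten.com/</loc></url>
  <url><loc>http://www.trovatten.com/wp-admin/options.php</loc></url>
</urlset>"""

rules = ["User-agent: *", "Disallow: /wp-admin/", "Disallow: /wp-includes/"]
robots = RobotFileParser()
robots.parse(rules)

# Collect every sitemap URL that the robots.txt rules would block.
root = ET.fromstring(sitemap_xml)
blocked = [loc.text for loc in root.iter(f"{SITEMAP_NS}loc")
           if not robots.can_fetch("Googlebot", loc.text)]

print(blocked)  # ['http://www.trovatten.com/wp-admin/options.php']
```

A non-empty result would reproduce exactly the "Sitemap contains urls which are blocked by robots.txt" warning.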
Related Questions
-
Can I Block https URLs using Host directive in robots.txt?
Hello Moz Community, Recently I have found that Google bots have started crawling the HTTPS URLs of my website, which is increasing the number of duplicate pages on our site. Instead of creating a separate robots.txt file for the HTTPS version of my website, can I use the Host directive in robots.txt to suggest to Google bots which is the original version of the website? Host: http://www.example.com I was wondering if this method will work and suggest to Google bots that the HTTPS URLs are a mirror of this website. Thanks for all of the great responses! Regards, Ramendra
Technical SEO | TJC.co.uk
-
Good robots.txt for Magento
Dear Community, I am trying to improve the SEO rankings for my website www.rijwielcashencarry.nl (Magento). My next step will be implementing a robots.txt to exclude some pages from crawling. Does anybody have a good Magento robots.txt for me? And what exactly do I need to copy? Thanks everybody! Greetings, Bob
Technical SEO | rijwielcashencarry04
-
Is there a limit to how many URLs you can put in a robots.txt file?
We have a site that has way too many URLs caused by our crawlable faceted navigation. We are trying to purge 90% of our URLs from the indexes. We put noindex tags on the URL combinations that we do not want indexed anymore, but it is taking Google way too long to find the noindex tags. Meanwhile we are getting hit with excessive URL warnings and have been hit by Panda. Would it help speed the process of purging URLs if we added the URLs to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs, but not purge them from the index? The list could be in excess of 100MM URLs.
Technical SEO | kcb8178
-
Blocked URL parameters can still be crawled and indexed by Google?
Hi guys, I have two questions, and one might be a dumb question, but there it goes. I just want to be sure that I understand: If I tell Webmaster Tools to ignore a URL parameter, will Google still index and rank my URL? Is it OK if I don't append the brand filter in the URL structure - will I still rank for that brand? Thanks. PS: OK, 3 questions :)...
Technical SEO | catalinmoraru
-
Hit by Google
My site - www.northernlightsiceland.com - has been hit by Google and I'm not sure why. Traffic dropped 75% in the last 24 hours and all the most important keywords have dropped significantly in the SERPs. The only issue I can think of are the subpages for the northern lights forecasting I did every day, e.g. http://www.northernlightsiceland.com/northern-lights-forecast-iceland-3-oct-2012/ For 1 month I have simply been doing a copy/paste of the same subpage, only changing the top part (Summary) for each day. Could this be the reason why I'm penalized? I have now simply taken them all down, minus the last 3 days (that are relevant). What can I do to get up on my feet again? This is mission critical for me, as you can imagine. I'm wondering if it got hit by the EMD update on 28 Sept that was focusing on exact-match domains: http://www.webmasterworld.com/google/4501349-1-30.htm
Technical SEO | rrrobertsson
-
Removing robots.txt on WordPress site problem
Hi, I'm a little confused: I ticked the box in WordPress to allow search engines to crawl my site (previously I had asked them not to), but Google Webmaster Tools is telling me I still have robots.txt blocking them, so I'm unable to submit the sitemap. I checked the source code and the robots instruction has gone, so I'm a little lost. Any ideas please?
Technical SEO | Wallander
-
Help with Google AdSense
Hi, I wonder if anyone can help me with Google AdSense. I am having trouble making money with it. I have been altering my pages to try and get better results, but nothing works. My traffic at the moment is about 3,000 visitors a day, but this should double to around 6,000 a day within the next two months. Here is the layout of a typical page, and I would be grateful for any advice on how to alter it to make money with AdSense: http://www.in2town.co.uk/showbiz-gossip/rihanna-news/rihanna-shocks-fans-over-her-sexy-body-claims
Technical SEO | ClaireH-184886
-
Google May Update
As per Google, the May update takes care of all content-scraping sites. Then why is this site - http://www.viduba.com - still ranking well? All of its videos are hotlinked from YouTube.
Technical SEO | krishru