Why are these results being shown as blocked by robots.txt?
-
If you perform this search, you'll see all m. results are blocked by robots.txt: http://goo.gl/PRrlI. But when I reviewed the robots.txt file (http://goo.gl/Hly28), I didn't see anything that would block crawlers from these pages.
Any ideas why these are showing as blocked?
-
Hi,
Your robots.txt file is very... large. It has its own universe.
Are you 100% sure all of the entries are legit and clean?
The first thing I would do is check Webmaster Tools for the mobile subdomain. If you don't have it set up yet, that's a good place to start: verify the m. subdomain.
Once in Webmaster Tools, you can debug this in no time.
Cheers.
-
But even when I search from my mobile device, I get the same results (that m. is blocked).
-
I can't submit because I haven't claimed m. in GWT.
-
If you haven't already done so, I recommend testing your robots.txt file against one of your mobile pages (such as m.healthline.com/treatments) in Google Webmaster Tools. You can do this by logging into GWT, then clicking Health, then Blocked URLs.
If you have already tested it in GWT, can you let us know what the results said?
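If you want to sanity-check the rules outside of GWT, Python's standard library can parse a robots.txt file and tell you whether a given user-agent is allowed to fetch a URL. This is a minimal sketch: the rules and URLs below are hypothetical placeholders, not Healthline's actual file.

```python
from urllib import robotparser

# Parse a robots.txt body directly (no network fetch needed) and ask
# whether a crawler may fetch specific URLs. These rules are made up
# for illustration only.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Allowed: /treatments is not covered by any Disallow rule.
print(rp.can_fetch("Googlebot", "http://m.example.com/treatments"))   # True
# Blocked: /private/ is disallowed for all user-agents.
print(rp.can_fetch("Googlebot", "http://m.example.com/private/page")) # False
```

For a live check, you could instead call `rp.set_url("http://m.example.com/robots.txt")` followed by `rp.read()` to fetch and parse the real file.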
-
Another good article from the community
-
So after a little bit of research (I'd never come across this before, as all the sites we do are responsive), I found this:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=72462
It seems Google won't index a website that they think is a mobile website within the main SERPs, and vice versa...
Hope that helps, because it had me puzzled.
Regards
John
-
Which directory are you storing your mobile website files in?
-
Oh, sorry, on further investigation I see it's just your mobile site that is being blocked...