How to get a list of URLs blocked by robots.txt
-
This is my site. It's built on WordPress. I just want to know: is there any way I can get the list of URLs blocked by robots.txt?
In Google Webmaster Tools it doesn't show up; it only gives the number of blocked URLs.
Is there any plugin or software to extract the list of blocked URLs?
-
If you use Bing Webmaster Tools you can see a complete list of all URLs blocked by robots.txt. You can export the file and then filter it.
Just go to Reports & Data > Crawl Information within your Bing Webmaster account. I am not aware of this feature being in Google Webmaster Tools. Hope this helps.
-
simon_realbuzz, if I use /classifieds/ it means I am blocking every URL that starts with it. I want to get a list of all the blocked URLs on the site.
Example:
http://muslim-academy.com/classifieds/
How many URLs under this classifieds path are blocked by my robots.txt?
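One way to answer this yourself, without waiting on a webmaster tool: feed your robots.txt rules plus a list of your site's URLs (for example, pulled from your XML sitemap) into Python's standard `urllib.robotparser` and keep whatever it refuses. A minimal sketch, assuming a shortened rule set; the URLs below are made-up examples, not real pages from the site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical excerpt of the site's robots.txt rules.
rules = """
User-agent: *
Disallow: /classifieds/
Disallow: /wp-admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Hypothetical URL list; in practice, load these from your sitemap.
urls = [
    "http://muslim-academy.com/classifieds/ads/123",
    "http://muslim-academy.com/blog/some-post",
]

# Keep every URL that the rules disallow for all user agents.
blocked = [u for u in urls if not parser.can_fetch("*", u)]
print(blocked)  # → ['http://muslim-academy.com/classifieds/ads/123']
```

Run against the full sitemap, this gives the complete blocked-URL list the thread is asking for, including how many fall under /classifieds/.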
-
I'm sorry, I don't follow. If you go to that URL you will see the list of blocked paths, as I've pasted below.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /forum/viewtopic.php?p=
Disallow: /forum/viewtopic.php?=&p=
Disallow: /forum/viewtopic.php?t=
Disallow: /forum/viewtopic.php?start=
Disallow: /forum/&view=previous
Disallow: /forum/&view=next
Disallow: /forum/&sid=
Disallow: /forum/&p=
Disallow: /forum/&sd=a
Disallow: /forum/&start=0
Disallow: /forum/memberlist.php
Disallow: /forum/posting.php
Disallow: /classifieds/
Disallow: /forum/index.php
Disallow: /forum/ucp
Disallow: /http://muslim-academy.com/الا�%A..
Disallow: /http://muslim-academy.com/особенн%D
Disallow: /http://muslim-academy.com/ислам-ка%
Disallow: /http://muslim-academy.com/classifieds/ads/
Disallow: /http://muslim-academy.com/значени%D..
Disallow: /.ifieds/
Disallow: /.ifieds/ads/
Disallow: /forum/alternatelogin/al_tw_connect.php?authentication=1
Disallow: /forum/search.php
-
simon_realbuzz, I need a list of the blocked URLs, not the robots.txt file path.
-
You can view your robots.txt file simply by appending /robots.txt to your site URL. Just go to http://muslim-academy.com/robots.txt and you'll be able to view your robots file.
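For reference, the robots.txt location is always derived the same way: keep only the scheme and host of any page URL and set the path to /robots.txt. A quick sketch of that rule using the standard library (the input URL is a hypothetical page):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(site_url):
    """Build the robots.txt URL for a site, keeping only scheme and host."""
    parts = urlsplit(site_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://muslim-academy.com/some/page"))
# → http://muslim-academy.com/robots.txt
```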