How can I block the URLs below?
-
Google has indexed plugin pages for my website; please see the URLs below. How can I stop them from being indexed on Google?
http://www.ayurjeewan.com/wp-content/plugins/LayerSlider/static/skins/glass/
http://www.ayurjeewan.com/wp-content/plugins/LayerSlider/static/skins/borderlesslight3d/
http://www.ayurjeewan.com/wp-content/plugins/LayerSlider/static/skins/defaultskin/
My robots.txt file is:

User-agent: *
Disallow: /wp-admin/
-
Add this to your robots.txt (each path needs its own Disallow line, under a User-agent line):

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Then, deindex them as mentioned in the first response.
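As a sanity check, rules like these can be tested locally with Python's standard urllib.robotparser before uploading; a small sketch (the rules are parsed from a string here rather than fetched from the live site, and the domain is just the one from the question):

```python
from urllib.robotparser import RobotFileParser

# The corrected robots.txt rules, supplied as lines of text.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The plugin skin pages should now be disallowed for all crawlers,
# while the rest of the site stays crawlable.
blocked = "http://www.ayurjeewan.com/wp-content/plugins/LayerSlider/static/skins/glass/"
allowed = "http://www.ayurjeewan.com/"
print(parser.can_fetch("*", blocked))  # False
print(parser.can_fetch("*", allowed))  # True
```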
-
It's also worth remembering that blocking a URL in a robots.txt file does not automatically mean the URL will be deindexed. The robots.txt file only prevents robots, such as Googlebot, from crawling or accessing the file, and that's all. That said, if a robot repeatedly tries to access a file and is denied, it will eventually stop trying, which is what leads to the file being deindexed (the same principle applies to 404 and 410 errors).
Therefore, if you want a quicker and more definite deindexing solution, you should use the explicit noindex robot directive, as recommended above. This tells any visiting robot not to index the page straight away, which will reduce the number of revisits and drop the page from the index faster. Note that a robot can only see a noindex directive if the page is not blocked in robots.txt, so leave the URLs crawlable until they have dropped out of the index.
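For reference, the explicit directive is a meta tag placed in the page's head; a minimal example (on a WordPress page this is typically added by an SEO plugin rather than edited in by hand):

```html
<meta name="robots" content="noindex">
```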
-
Take a look over here - http://www.robotstxt.org/robotstxt.html and we can't forget the Moz version - http://moz.com/learn/seo/robotstxt
Alternatively, you can just add a meta noindex tag to the pages, with the added bonus of letting link juice flow better as well as being a bit more stern with robots (I recommend the noindex tag!)
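Since the indexed URLs here are static plugin skin directories rather than WordPress pages, there may be no head section to edit. In that case the same noindex signal can be sent as an HTTP response header instead; a sketch for Apache, assuming mod_headers is enabled, placed in an .htaccess file inside /wp-content/plugins/:

```apache
# Ask crawlers not to index anything served from this directory.
<IfModule mod_headers.c>
  Header set X-Robots-Tag "noindex"
</IfModule>
```

As with the meta tag, the URLs must remain crawlable for robots to see this header.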