Is it hurting my SEO ranking if robots.txt is forbidden?
-
My robots.txt is forbidden: I have read up on what the robots.txt file does and how to configure it, but what happens if it cannot be accessed at all?
-
Yes, excluding certain pages can benefit your rankings if the excluded pages could be considered duplicate content, either of your marketing pages or of each other.
This is usually the case for blogs (think WordPress categories) or webshops (pagination, as well as single product pages reachable by different paths and thus having different URLs). As Ryan pointed out, control that at the page level via noindex,follow to allow PageRank to flow. Use noindex,nofollow for "internal" pages you don't want crawled.
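For reference, the page-level directives mentioned above are robots meta tags placed in each page's head; a minimal sketch:

```html
<!-- Keep the page out of the index, but let link equity flow through its links -->
<meta name="robots" content="noindex,follow">

<!-- Keep the page out of the index and don't follow its links at all -->
<meta name="robots" content="noindex,nofollow">
```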
I am not certain, but having 9,950 pages indexed that are considered duplicate content might hurt rankings for other pages on that domain; Google might consider the domain spammy.
If you need a specific hint for your domain, send me a PM and I'll have a look if time permits.
-
In general, I do not use robots.txt. It is a better practice to use "noindex" for the pages you do not wish to have indexed.
If I had a 10,000-page site with 50 marketing pages, I would either want to index the entire site or question why the other 99% of the site exists if it does not help market the products. There are numerous challenges your scenario presents. If you block 99% of your site with robots.txt or the noindex meta tag, you severely disrupt the flow of PageRank throughout your site. You are also either blocking content which should be indexed, or wasting time and resources creating junk pages on your site.
If the content truly should not be indexed, it likely should be moved to another site. I would need a lot more details about the site, its purpose, and the pages involved. Whatever the proper solution, it is not likely to involve using robots.txt to block 99% of the site.
-
So with regard to increasing rankings, is there a benefit to using the robots.txt file to index only certain "marketing" pages and exclude other content that may dilute your site? For example, let's say I have 10,000 pages but only about 50 or so are my marketing pages. Would using robots.txt so that only my main marketing pages are crawled help place emphasis on that content?
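To make the scenario concrete, here is a sketch of a robots.txt that allows only a hypothetical marketing section (the paths are illustrative, not from your site), checked with Python's standard urllib.robotparser. Note this models simple first-match parsing; real search engines apply their own precedence rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: allow only the /marketing/ section, block everything else.
# The Allow line comes first because this parser applies the first matching rule.
rules = """\
User-agent: *
Allow: /marketing/
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/marketing/landing"))  # True
print(rp.can_fetch("*", "https://example.com/blog/some-post"))     # False
```

Keep in mind that even with rules like these, blocked pages can still accumulate (and waste) link equity, which is why the answers above prefer page-level noindex.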
-
Sebes is correct. To add a bit more: it is not necessary to provide a robots.txt file, and in most cases it is preferable not to use one, but it becomes necessary if you do not have direct control over the code used in every page of your site. For example, if you have a CMS- or e-commerce-based site, you likely do not have control over the many pages the software generates automatically. In these cases, the only way you can control how crawlers treat your site's pages is either to pay for custom modifications to your site's code or to use a robots.txt file.
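As an illustration (the paths here are hypothetical, typical of software-generated sections you cannot edit directly), such a robots.txt might look like:

```
User-agent: *
Disallow: /cart/
Disallow: /search/
Disallow: /tag/
```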
-
If your robots.txt cannot be read by Google or Bing (a 403 Forbidden is treated much like a missing file), they assume they can crawl as much as they want. Check Google Webmaster Tools to see whether Google can "see" and access your robots.txt.
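The way the fetch fails matters. As I understand Google's documented behavior (worth verifying against current documentation): a 4xx response such as 403 Forbidden is treated as if no robots.txt exists, while a 5xx server error makes Google temporarily hold off crawling. A small sketch of that logic:

```python
def robots_fetch_policy(status_code):
    """Rough model of how Googlebot reacts to the HTTP status of /robots.txt.

    This encodes my reading of Google's published robots.txt handling;
    it is an assumption for illustration, not an official API.
    """
    if 200 <= status_code < 300:
        return "obey the rules in the file"
    if 400 <= status_code < 500:   # includes 403 Forbidden and 404 Not Found
        return "treat as if no robots.txt exists: crawl without restrictions"
    if 500 <= status_code < 600:   # server error: crawler plays it safe
        return "temporarily treat the whole site as disallowed"
    return "undefined; behavior varies"

print(robots_fetch_policy(403))
```

So a "forbidden" robots.txt should not hurt rankings by itself; it simply means nothing is blocked from crawling.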