Why are so many pages indexed?
-
We recently launched a new website and it doesn't consist of that many pages. When you do a "site:" search on Google, it shows 1,950 results. Obviously we don't want this to be happening, and I have a feeling it's affecting our rankings. Is this just a straightforward robots.txt problem? We addressed that a while ago and the number of results isn't going down, so it's very possible that we still have it implemented incorrectly. What are we doing wrong, and how do we start getting pages "un-indexed"?
-
What's to stop Google from finding them? They're out there and available on the internet!
Block or remove pages using a robots.txt file
You can do this by putting:
User-agent: *
Disallow: /
in the robots.txt file.
You might also want to stop humans from accessing the content too - can you put this content behind a password using .htaccess, or block access based on network address?
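If you do go the password route, a minimal Apache setup looks something like the sketch below. The file paths and the "Internal use only" realm name are assumptions for illustration, not details from the thread:

```apacheconf
# .htaccess in the directory you want to protect
AuthType Basic
AuthName "Internal use only"
# Path to the password file (keep it outside the web root)
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The password file itself is created with `htpasswd -c /home/example/.htpasswd someuser` (hypothetical path and user). Blocking by network address instead would use `Require ip` with your office's address range.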
-
Sounds like you need to put a robots.txt on those subdomains (and maybe consider some type of login too).
Quick fix: put a robots.txt on the subdomains to block them from being indexed. Go into Google Webmaster Tools and verify each subdomain as its own site, then request removal of each of those subdomains (which should be approved, since you've already blocked them in robots.txt).
I took a quick look at lab.capacity.com/robots.txt and it isn't blocking the entire subdomain, though the robots.txt at fb.capacitr.com is.
-
I most certainly do not want those pages indexed - they're used for internal purposes only. That's exactly what I'm trying to figure out here: why are those subdomains being indexed? They should obviously be private. Any insights would be great.
Thanks!
-
What are you searching for? I notice that if you do a site:capacitr.com search you get the 1,950 results you mention above.
If you do a search for site:www.capacitr.com then you only get 29 results.
It looks like there's a whole load of pages being indexed on other subdomains - fb.capacitr.com and lab.capacity.com (which has 1,860 pages!).
What are these used for - do you really want these in the index?
Related Questions
-
Why Would My Page Have a Higher PA and DA, Links & On-Page Grade & Still Not Rank?
The search term is "Alcohol Ink" and our client has a better page authority, domain authority, links to the page, and on-page grade than those in the SERP at positions 5-10, and we're not even ranked in the top 51+ according to Moz's tracker. The only difference I can see is that our URL doesn't use the exact text like some of the 5-10 do. However, regardless of this, our on-page grade is significantly higher than the rest of them. The one thing I found was that there were two links to the page (that we never asked for) with a spam score in the low 20s and another in the low 30s. Does anyone have any recommendations on how to maybe get around this? Certainly, a content campaign and linking campaign around this could also help, but I'm kind of scratching my head. The client is reputable, with a solid domain age, and well recognized in the space, so it's not like it's a noob trying to get in out of nowhere.
Intermediate & Advanced SEO | Omnisye0 -
Possible to Improve Domain Authority By Improving Content on Low Page Rank Pages?
My site's domain authority is only 23. The home page has a page authority of 32. My site consists of about 400 pages. The topic of the site is commercial real estate (I am a real estate broker). A number of the sites we compete against have a domain authority of 30-40. Would our overall domain authority improve if we rewrote the content for several hundred pages that had the lowest page authority (say 12-15)? Is the overall domain authority derived from an average of the page authority of each page on a domain? Alternatively, could we increase domain authority by setting the pages with the lowest page authority to "noindex"? By the way, our domain is www.nyc-officespace-leader.com Thanks, Alan
Intermediate & Advanced SEO | Kingalan10 -
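For the "noindex" idea in the question above, the usual mechanism is a robots meta tag in the head of each low-value page (shown here as a generic snippet, not a recommendation for this specific site):

```html
<meta name="robots" content="noindex, follow">
```

The "follow" part lets crawlers still follow links on the page even though the page itself is dropped from the index.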
Ranking slipped to page 6 from page 1 over the weekend?
My site has been on page one for 2 phrases consistently from May onwards this year. The site has fewer than 100 backlinks and the link profile looks fairly even. On Friday we were on page 1, we even had a position 1, however now we are on page 6. Do you think this is Penguin or some strange Google blip? We have no webmaster tools messages at all. Thanks for any help!
Intermediate & Advanced SEO | onlinechester0 -
Page Indexed but not Cached
A section of pages on my site are indexed (I know because they appear in SERPs if I copy and paste a sentence from the content), but according to the text-only cached version of the page they are not being read by Google. Why are they indexed even though it seems like Google is not reading them... or is Google in fact reading this text even though it seems like they should not be? Thanks for your assistance.
Intermediate & Advanced SEO | theLotter0 -
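As a general possibility (not a diagnosis of that particular site), a page can be indexed yet have no cached copy if it carries a noarchive directive in its head:

```html
<meta name="robots" content="noarchive">
```

This tells Google not to store or show a cached version while still allowing the page to be indexed and ranked.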
Getting Google to index but display "parent" pages...
Greetings esteemed SEO experts - I'm hunting for advice: We operate an accommodation listings website. We monetize by listing position in search results, i.e. you pay more to get higher placing in the page. Because of this, while we want individual detailed listing pages to be indexed to get the value of the content, we don't really want them appearing in Google search results. We ideally want the "content value" to be attributed to the parent page - and Google to display this as the link in the search results instead of the individual listing. Any ideas on how to achieve this?
Intermediate & Advanced SEO | AABAB0 -
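One way to hint at this (an assumption on my part - the thread doesn't confirm it fits this site) is a rel=canonical from each listing page to its parent page, so the listing content is crawled but the parent URL is the one Google prefers to display. The URLs below are hypothetical:

```html
<!-- On an individual listing page, e.g. /hotels/seaside-inn (hypothetical) -->
<link rel="canonical" href="https://www.example.com/hotels/">
```

Note that canonical is a hint, not a directive, and Google may ignore it if the pages aren't near-duplicates.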
Indexation of content from internal pages (registration) by Google
Hello, we have quite a large amount of content on internal pages which can only be accessed as a registered member. What are the different options to get this content indexed by Google? In certain cases we might be able to show a preview to visitors; in other cases this is not possible for legal reasons. Somebody told me that there is an option to send the content of pages directly to Google for indexation. Unfortunately he couldn't give me more details. I only know that this is possible for URLs (sitemap). Is there really a possibility to do this for the entire content of a page without giving Google access to crawl this page? Thanks Ben
Intermediate & Advanced SEO | guitarslinger0 -
Sitemap not indexing pages
My website has about 5000 pages submitted in the sitemap but only 900 being indexed. When I checked Google Webmaster Tools about a week ago 4500 pages were being indexed. Any suggestions about what happened or how to fix it? Thanks!
Intermediate & Advanced SEO | theLotter0 -
Does having multiple links to the same page influence the link juice this page is able to pass?
Say you have a page and it has 4 outgoing links to the same internal page. In the original PageRank algorithm, if these links were links to a page outside your own domain, this would mean that the link juice this page is able to pass would be divided by 4. The thing is, I'm not sure if this is also the case when the outgoing link is linking to a page on your own domain. I would say that outgoing links (whatever the destination) will use some of your link juice, so it would be better to have 1 outgoing link instead of 4 to the same destination, so the destination will profit more from that link. What are your thoughts?
Intermediate & Advanced SEO | TjeerdvZ0
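The division described in that question can be sketched in a few lines of Python. This follows the classic textbook PageRank reading, in which a page's passable rank is split evenly over all outgoing links, duplicates included - an assumption for illustration, since search engines' actual handling of duplicate links isn't public:

```python
def juice_passed(outlinks, damping=0.85, page_rank=1.0):
    """Rank each distinct target receives from this page.

    outlinks: list of link targets, duplicates allowed. In the
    classic PageRank model the page's passable rank is divided
    evenly over ALL outgoing links, so duplicate links to the
    same target each carry a share that accumulates.
    """
    share = damping * page_rank / len(outlinks)
    passed = {}
    for target in outlinks:
        passed[target] = passed.get(target, 0.0) + share
    return passed

# Four links to the same internal page plus one other link:
print(juice_passed(["internal"] * 4 + ["other"]))
```

Under this model the duplicated target accumulates 4/5 of the passable juice rather than being diluted - but treat that strictly as a property of the textbook formula, not of Google's live algorithm.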