Why does Google say they have more URLs indexed for my site than they really do?
-
When I do a site search with Google (e.g. site:www.mysite.com), Google reports "About 7,500 results" -- but when I click through to the end of the results and choose to include omitted results, Google really has only 210 results for my site.
I had an issue months back with a large number of URLs being indexed because of query strings and some other non-optimized technicalities - at that time I could see that Google really had indexed all of those URLs - but I've since implemented canonical URLs and fixed most (if not all) of my technical issues in order to get our index count down.
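For reference, the canonical tag mentioned above looks like this in the page head (the domain, path, and query string here are placeholders, not from the site in question):

```html
<!-- served in the <head> of every variant, e.g. https://www.mysite.com/widgets?sort=price -->
<link rel="canonical" href="https://www.mysite.com/widgets" />
```

Every query-string variant points back at the one clean URL, which is what consolidates the index count over time.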
At first I thought it would just be a matter of time for them to reconcile this - perhaps they were looking at cached data or something - but it's been months and the "About 7,500 results" just won't change, even though the number of pages actually indexed keeps dropping!
Does anyone know why Google would be still reporting a high index count, which doesn't actually reflect what is currently indexed?
Thanks!
-
It seems like you are taking the correct steps. I'm guessing those pages were tossed into the supplemental index (as they most likely were dupes), and I believe that by tweaking your robots.txt file, these should be removed over time.
Another thing to do is tell Google how to handle those parameters inside Webmaster Tools:
Configuration => URL Parameters
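If you're tweaking robots.txt as suggested, a quick script can confirm the rules behave as intended before you deploy them. This is a minimal sketch with made-up rules and URLs; note that Python's standard robotparser only does simple prefix matching and does not understand Google's wildcard extensions like "Disallow: /*?sort=".

```python
# Sanity-check that robots.txt rules actually cover the URLs you want
# kept out. The rules and URLs below are made-up examples.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /login.cgis
Disallow: /customer/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://www.mysite.com/customer/orders"))  # False: blocked
print(rp.can_fetch("*", "https://www.mysite.com/products"))         # True: allowed
```

For wildcard rules, test against Googlebot's own matching behavior in Webmaster Tools rather than this module.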
Related Questions
-
Meta Descriptions - Google ignores what we have
Hi, I still write meta descriptions to help with CTR. I am currently looking at a page where the CTR needs improving. When I check the SERPs, Google isn't pulling through the meta description we have - it's showing other info from the page instead. This isn't ideal - why does this happen? Does Google just make the decision itself, and are descriptions not worth writing?
Algorithm Updates | BeckyKey
-
Google Algorithm change this month - theories?
Hi Moz fans! If you've not had the chance to check out MozCast, go take a look - it seems Google has been busy, so much so that it broke MozCast. There has been some discussion on Search Engine Roundtable about the changes, and something seems to be going on. I wanted to find out how you are all finding it - have you had a change in rankings? Any theories on what you think Google is up to? Personally, I've seen quite a few of my sites go up in the rankings in the last week or so. As always, I look forward to hearing your thoughts and feelings on it, Moz.
Algorithm Updates | GPainter
-
How can a site with two questionable inbound links outperform sites with 500-1000 good links and good PR?
Our site was performing at #1 for years, but in the last 6 months it has been pushed down to about the #5 spot. Some of the domains above us have only a handful of links, and those aren't from good sources. We don't have a Google penalty. We try to only get links from quality domains, yet we've been pushed down the SERPs. Any suggestions?
Algorithm Updates | northerncs
-
Could Retail Price Be A Google Ranking Factor???
I have not done any detailed studies on this but it seems that Google might be using low retail prices for specific items as a ranking factor in their organic SERPs. Does anyone else suspect this? Just askin' to hear your thoughts. Thanks!
Algorithm Updates | EGOL
-
Why is our site dropping in rank after we update it?
One of our sites - supereyes.com - appears to drop in rank after we update it. The client notified us of this today, and I've verified that it did indeed drop in Google - four spots since last week. He says this happens every time we make changes to the site, but then a week later it will go back up, usually higher than where it was before. I have not verified this, but I'm very worried it may not rise again. In the past week, we've posted a new blog entry to their site and changed some of the content - specifically, we added their locations to the header, added a contact page, and put two testimonials in their sidebar. We've also had someone submitting their site to directories and local business sites like Angie's List and so forth. There are about 16 new backlinks established in the past 2-3 weeks. Also, I should note, traffic is higher than it's ever been, but the client doesn't look at traffic - they only look at their Google results. Can anyone offer any insight into what's going on here and whether I need to be worried the site won't rise again in the rankings?
Algorithm Updates | aloley
-
What Is The Deal Between Indeed and Google?
Anyone notice the love affair between Indeed and Google lately? Indeed is cannibalizing the top 30 SERP positions for job-related keywords - I'm seeing keywords where Indeed has 10-15 of the organic listings in the top 30. Compete.com is showing an 8% increase in search volume between April and May, but it seems as if they really started to cannibalize the SERPs since the Penguin update at the end of May. Anyone else noticing this?
Algorithm Updates | joncrowe
-
How do I get the expanded results in a Google search?
I notice for certain sites (e.g. mint.com) that when I search, the top result has a very detailed view with options to click through to different subsections of the site. However, for my site, even though we're consistently the top result for our branded terms, the result is still only a single line item. How do I adjust this?
Algorithm Updates | syount
-
Removing secure subdomain from Google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as:

Disallow: /login.cgis
Disallow: /logout.cgis
Disallow: /password.cgis
Disallow: /customer/*

We have noticed that Google is crawling these secure pages and then duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain to disallow everything:

User-agent: *
Disallow: /

However, these duplicated secure pages remain in the index. My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain example. Please private message me for specific details if you'd like to see an example. Thank you
Algorithm Updates | marketing_zoovy.com
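One thing worth noting about the situation above: robots.txt blocks crawling but does not by itself remove URLs already in the index. An approach often suggested in cases like this (not from this thread; the snippet below is a hypothetical Apache sketch and assumes mod_headers is enabled on the secure subdomain's virtual host) is to serve a noindex header from the secure subdomain instead:

```apache
# Hypothetical vhost config for secure.domain.com only -
# do NOT apply this to the main www host.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Note that Googlebot has to be able to recrawl the pages to see the header, so the blanket Disallow: / rule would need to be lifted temporarily for the deindexing to take effect.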