Site: Query Question
-
Hi All,
Question around the site: query you can execute on Google, for example. I know it has lots of inaccuracies, but I like to keep a high-level view of it over time.
I was also using it to try to get a rough sense of how many product pages were indexed vs. the total number of pages.
What is interesting is when I do a site: query for say www.newark.com I get ~748,000 results returned.
When I do a query for www.newark.com "/dp/" I get ~845,000 results returned.
Either I am doing something stupid, or these numbers are completely backwards.
Any thoughts?
Thanks,
Ben
-
Barry Schwartz posted some great information about this in November of 2010, quoting a couple of different Google sources. In short, more specific queries can cause Google to dig deeper and give more accurate estimates.
-
Yup. Get rid of parameter-laden URLs and it's easy enough. If they hang around the index for a few months before disappearing, that's no big deal; as long as you have done the right thing, it will work out fine.
Also, you're not interested in the chaff, just the bits you want to make sure are indexed. So make sure those are in sensibly titled sitemaps and it's fine (I've used this on sites with 50 million and 100 million product pages — it gets a bit more complex at that scale, but the underlying principle is the same).
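To make the "sensibly titled sitemaps" approach concrete, a sitemap index can point to one child sitemap per page type. A minimal sketch — the domain and file names here are hypothetical placeholders, not from the thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One child sitemap per section you care about tracking -->
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
</sitemapindex>
```

Search Console reports coverage per submitted sitemap, which is what gives you the per-section indexation view.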
-
But then on a big site (talking 4m+ products) it's usually the case that you have URLs indexed that wouldn't be generated in a sitemap, because they include additional parameters.
Ideally, of course, you rid the index of parameter-filled URLs, but it's pretty tough to do that.
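One common way to consolidate those parameterised variants — a sketch of a standard technique, not the poster's stated method, and the URL and parameter name are hypothetical — is a rel=canonical on each variant pointing back at the clean product URL:

```html
<!-- Served on a parameterised variant, e.g. /dp/12345?sort=price -->
<link rel="canonical" href="https://www.example.com/dp/12345" />
```

Google tends to fold the variants into the canonical URL over time, which fits the "hang around the index for a few months before disappearing" behaviour mentioned earlier in the thread.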
-
Best bet is to make sure all your URLs are in your sitemap; then you get an exact count of how many of your submitted URLs are indexed.
I've found it handy to use a separate sitemap for each subfolder (i.e. /news/ or /profiles/) to be able to quickly see exactly what % of URLs are indexed from each section of my site. This is super helpful for finding errors in a specific section, or when you are working on indexing of a certain type of page.
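A rough sketch of that per-subfolder split, assuming you already have a flat list of URLs — the function name and the first-path-segment grouping rule are illustrative, not from the thread:

```python
from collections import defaultdict
from xml.sax.saxutils import escape

def build_section_sitemaps(urls):
    """Group URLs by their first path segment (e.g. /news/, /profiles/)
    and return one sitemap XML string per section."""
    sections = defaultdict(list)
    for url in urls:
        # Strip the scheme and host, then take the first path segment.
        parts = url.split("://", 1)[-1].split("/", 1)
        segment = parts[1].split("/", 1)[0] if len(parts) > 1 and parts[1] else "root"
        sections[segment].append(url)

    sitemaps = {}
    for section, section_urls in sections.items():
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in section_urls
        )
        sitemaps[section] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>"
        )
    return sitemaps
```

Submitting each resulting file separately in Search Console then gives you the indexed-vs-submitted ratio per section.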
S
-
What I've found is that the reason for this comes down to how the Google system works. Case in point: a client site I have with 25,000 actual pages. They have mass duplicate-content issues. When I do a generic site: with the domain, Google shows 50-60,000 pages. If I do an inurl: with a specific URL param, I get either 500,000 or over a million.
Though that's not your exact situation, it can help explain what's happening.
Essentially, if you do a normal site: Google will try its best to provide the content within the site that it shows the world based on "most relevant" content. When you do a refined check, it's naturally going to look for the content that really is most relevant - closest match to that actual parameter.
So if you're seeing more results with the refined process, it means that on any given day, at any given time, when someone does a general search, the Google system will filter out a lot of content that isn't seen as highly valuable for that particular search. All those extra pages that come up in your refined check are most likely then being evaluated as less valuable, lower quality, or less relevant to most searches.
Even if many are great pages, their system has multiple algorithms that have to be run to assign value. What you are seeing is those processes struggling to sort it all out.
-
about 839,000 results.
-
Different data center perhaps - what about if you add in the "dp" query to the string?
-
I actually see 'about 897,000 results' for the search 'site:www.newark.com'.
-
Thanks Adrian,
I understand those areas of inaccuracy, but I didn't expect to see a refined search produce more results than the original search. That just seems a little bizarre to me, which is why I was wondering if there was a clear explanation or if I was executing my query incorrectly.
Ben
-
This is an expected 'oddity' of the site: operator. Here is a video of Matt Cutts explaining the imprecise nature of the site: operator.