Difference between SEOmoz and Webmaster Tools information
-
Hello,
There is an issue that confuses me, and I thought perhaps you could help me shed some light on it.
I have a website that shows 2,549 crawled pages in SEOmoz and 24,542 pages in Webmaster Tools!
Obviously there is some technical issue with the site, but my question is: why is there such a vast difference between what the SEOmoz crawl report and the Webmaster Tools report show?
Thanks!
Guy Cizner
-
Thanks for stepping in, everyone, though it looks like we were trying to answer the wrong question. This one is about Roger's crawl of the OP's own site, rather than the links indexed in OSE.
Guy, do you have a feel for how many pages SHOULD be in the index? If you only have a couple of thousand pages, then it could be that Google is crawling and indexing some URL parameters. If you've got 20k+ pages that genuinely belong in the index, then Roger isn't finding some things.
Also, are you perhaps looking at just the www.domain subdomain in SEOmoz while GWT is looking at the entire site? If you had a compact www.domain site but also had forum.domain and wiki.domain, and GWT was reporting pages for all of the subdomains on domain.com, that would explain things too.
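One quick way to sanity-check both theories (parameter bloat and subdomain scope) is to export the URL list from either tool and group it by hostname, counting how many URLs carry query parameters. A minimal sketch using only the Python standard library; the sample URLs are hypothetical stand-ins for a real crawl export:

```python
from urllib.parse import urlparse
from collections import Counter

def summarize_urls(urls):
    """Count URLs per subdomain and count how many carry query parameters."""
    by_host = Counter(urlparse(u).netloc for u in urls)
    with_params = sum(1 for u in urls if urlparse(u).query)
    return by_host, with_params

# Hypothetical sample of lines from a crawl export
urls = [
    "http://www.example.com/page1",
    "http://www.example.com/page1?sort=alpha",
    "http://forum.example.com/thread/42",
]
hosts, params = summarize_urls(urls)
# hosts shows per-subdomain counts; params shows how many
# URLs are parameterized variants of other pages.
```

If the GWT export shows thousands of parameterized URLs, or large counts on subdomains that SEOmoz isn't tracking, that would account for much of the gap.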
-
Hello,
Thanks for all the replies.
The crawled pages are part of an SEO campaign I am running.
How is the crawl done when a campaign is defined?
I assume the entire site is being crawled.
Thanks
-
This may also shed some light:
Oct 9, 2012, Keri Morgret, On-site Community Manager at SEOmoz:
Another reason is that we just don't have the same size server farm that Google and Bing have. We could crawl all of Twitter and get nothing else crawled, or we could crawl some of Twitter, and some of the rest of the web. We aren't able to crawl all of the web, and we release a new index about once a month, so that's why you don't see all of your links or see them right away.
However, what we do offer that is different from Google and Bing is that we show you links for sites that are not your own, we add metrics about the trust and authority of the page, etc.
-
The Mozscape index, as brilliant as it is, can in no way compete with the size of the index that Google can handle.
As a result, your WMT report will almost always show a larger number of crawled pages, links, etc. Google's index is simply bigger.
-
Either of those 'issues' might be the cause. For example, incorrect canonicalization that Google and SEOmoz's bot, Roger, interpret differently. Another possibility is that Google tries very hard to index each and every page on the web, while Roger crawls more selectively, only fetching pages above a certain level of authority, or within a certain number of clicks from the homepage, etc.
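If incorrect canonicalization is the suspect, a quick first check is whether each page actually declares the canonical URL you expect. A minimal stdlib-only sketch for extracting the rel=canonical link from a page's HTML; the sample markup and URL are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extract the href of the first <link rel="canonical"> tag, if any."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attr_map = dict(attrs)
            if (attr_map.get("rel") or "").lower() == "canonical":
                self.canonical = attr_map.get("href")

# Hypothetical page source fetched from the site
html = '<html><head><link rel="canonical" href="http://www.example.com/page"></head></html>'
finder = CanonicalFinder()
finder.feed(html)
# finder.canonical now holds the canonical URL the page declares,
# or None if no canonical tag was found.
```

Running a check like this across a sample of parameterized URLs would show whether they point back to a single canonical page, or whether each variant is presenting itself as a separate page to crawlers.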