Google Webmaster Tools vs. SEOmoz Crawl Diagnostics
-
Hi guys, I was just looking over my weekly report and crawl diagnostics. What I've noticed is that the data gathered by SEOmoz is different from Google Webmaster Tools diagnostics. The number of errors, in particular duplicate page titles, duplicate content, and pages not found, is much higher than what Google Webmaster Tools reports. I'm a bit confused and don't know which data is more accurate. Please help!
-
I had this on the last crawl. There was no mention of it in the one before, and I haven't changed anything in between, so it could be a glitch?
-
I've been having similar issues, but more specifically, SEOmoz is reporting duplicate content which I cannot even locate manually. I'm hoping to see something different with the next crawl, but it has been pretty confusing over the last couple of crawls.
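When a duplicate-content report doesn't match what you can find by hand, it can help to rebuild the simplest signal yourself: grouping crawled pages by normalized title. A minimal sketch, assuming you have URL/title pairs from a crawl export (the example data here is hypothetical):

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by normalized title; return titles used by more than one URL."""
    groups = defaultdict(list)
    for url, title in pages:
        # Normalize whitespace and case so near-identical titles collide.
        key = " ".join(title.lower().split())
        groups[key].append(url)
    return {title: urls for title, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl data for illustration.
pages = [
    ("/widgets", "Buy Widgets | Example"),
    ("/widgets?sort=price", "Buy  Widgets | Example"),
    ("/about", "About Us | Example"),
]
print(find_duplicate_titles(pages))
```

Crawlers often flag querystring variants of the same page this way, which can explain duplicates that are hard to spot by browsing.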
-
I always make sure to manually check stuff on a regular basis as well as using tools. Tools are great, but like Steve said, this is always going to be the case with software, especially in an industry as unregulated as ours.
Try to use the tools as a means of directing your attention to items that need it, rather than trusting them implicitly and using them as a checklist.
And again, Steve is correct in that Google Webmaster Tools (as great as it is) can be a little dated; it will sometimes show errors that have been fixed for a while. And keep in mind that SEOmoz's tools are created within the context of SEO interests: Google will just show technical errors, while SEO tools will show those plus other warnings, based on experience, about things that will impact your rankings.
-
This will always be the case... there are different sources and different interpretations of what constitutes an error. I wouldn't assume that just because GWT is Google's it's the most accurate, though. Google has a lot of other business to be thinking about and is unlikely to be too concerned if everything isn't perfect in its free GWT product, so I doubt it takes priority over its other services. Whereas Moz, of course, is more focused on that particular area. I would just combine the two and iron out as much as you can from each... you'll never get them to match up, though.
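Combining the two sources can be done mechanically by intersecting the URL sets each tool flags. A minimal sketch; the CSV snippets and column headers here are made up, so adjust them to the actual exports each tool produces:

```python
import csv
import io

def flagged_urls(csv_text, url_column):
    """Parse one tool's error export (CSV text) and return the set of flagged URLs."""
    return {row[url_column] for row in csv.DictReader(io.StringIO(csv_text))}

# Hypothetical miniature exports; real ones come from each tool's CSV download.
gwt_csv = "URL,Error\n/a,404\n/b,404\n"
moz_csv = "url,issue\n/b,dup title\n/c,dup title\n"

gwt = flagged_urls(gwt_csv, "URL")
moz = flagged_urls(moz_csv, "url")

print("flagged by both:", sorted(gwt & moz))
print("only in Moz:", sorted(moz - gwt))
print("only in GWT:", sorted(gwt - moz))
```

URLs flagged by both tools are the ones worth fixing first; the symmetric differences show where the tools simply disagree.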
Related Questions
-
About how Google works
Hey guys, I want to ask a basic question. If I search for "Larry Page", say, I think Google looks in its index for the words "larry" and "page" separately and combines the results. But here is the question: can Google show a result where only "Larry" appears on the page and no synonym or stem of "Page" appears at all? If that can happen, how can such a page be shown for the "larry page" query? Thank you.
Technical SEO | | atakala
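The intuition in the question, that the engine looks up each word in its index and intersects the matches, can be illustrated with a toy inverted index. This is a drastic simplification of real ranking (no stemming, synonyms, or scoring), and the corpus is invented:

```python
from collections import defaultdict

# Toy corpus; real search involves stemming, synonyms, and many ranking
# signals far beyond this sketch.
docs = {
    1: "larry page co-founded google",
    2: "larry the cable guy",
    3: "one page summary of search",
}

# Build the inverted index: word -> set of document ids containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[word].add(doc_id)

def and_query(*terms):
    """Return documents containing every query term (strict intersection)."""
    results = [index[t] for t in terms]
    return set.intersection(*results) if results else set()

print(and_query("larry", "page"))
```

Under this strict intersection, a page containing only "larry" can never match the two-word query; real engines relax that with synonyms, stemming, and anchor text, which is exactly why the question is interesting.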
Webmaster Tools keeps showing an old 404 error but doesn't show a "Linked From" URL. Why is that?
Hello Moz community. I have a question about 404 crawl errors in Webmaster Tools. A while ago we had an internal linking problem: some links were being formed the wrong way (a loop was creating links on the fly). The error was identified and fixed back then, but before the fix Google got to index lots of those malformed pages. Recently we've seen in our Webmaster Tools account that some of those links still appear as 404s, even though we no longer have that issue or any internal link pointing to any of those URLs. What confuses us even more is that Webmaster Tools doesn't show anything in the "Linked From" tab, where it usually does for this type of error. So we are wondering what this means; could it be that they are still in Google's cache or memory? We are not really sure. If anyone has an idea of what these errors showing up now mean, we would really appreciate the help. Thanks.
Technical SEO | | revimedia
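One rough way to check whether such 404s are just Google re-crawling URLs it remembers (rather than URLs anything still links to) is to subtract the URLs a fresh crawl of the site can reach from the errors it reports. A sketch with hypothetical URL lists:

```python
def stale_errors(reported_404s, currently_linked):
    """Return 404s Google still reports that nothing on the site links to anymore.

    These are likely just re-crawls of URLs Google remembers from its index,
    which also explains an empty "Linked From" tab.
    """
    return set(reported_404s) - set(currently_linked)

# Hypothetical data: 404s from the Webmaster Tools report vs. URLs a
# fresh crawl of the site can still reach via internal links.
reported = ["/loop/a/a", "/loop/a/b", "/old-page"]
linked = ["/old-page"]

print(sorted(stale_errors(reported, linked)))
```

Stale entries like these generally age out on their own once they consistently return 404 (or 410) and nothing links to them.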
WordPress & use of 'www' vs. non-'www' for Webmaster Tools - explanation needed
I am having a hard time understanding the issue of canonicalization of site pages, specifically with regard to the 'www' and non-'www' versions of a site, and specifically with regard to WordPress. I can see that it doesn't matter whether you type 'www' or not in the URL of a WordPress site; what is going on in the back end that allows this? When I link the site up to Google Webmaster Tools, should I use www or not? Thanks for any help. d
Technical SEO | | dnaynay
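What WordPress typically does in the back end is store one preferred site address and 301-redirect requests for the other hostname to it, which is why both forms appear to work; in Webmaster Tools you would register the version you redirect to. The hostname normalization itself can be sketched like this (the prefer_www choice is purely illustrative):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_host(url, prefer_www=False):
    """Rewrite a URL so its hostname matches the preferred www/non-www form."""
    parts = urlsplit(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

print(canonical_host("http://www.example.com/blog/?p=1"))
```

Whichever form you pick, the important part is consistency: one version serves content, the other 301s to it, and that is the version to register.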
Google Webmaster Sitemap *pending*
Hey guys, I've noticed that my sitemap has been "pending" for quite some time in Google Webmaster Tools. This leads me to believe that Google is not indexing my site. Could someone help me and point out what I'm doing wrong? My site is The Tech Block.
Technical SEO | | ttb
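A "pending" status usually just means Google hasn't processed the submission yet, but it's worth confirming the submitted URL really serves valid sitemap XML. A minimal generator sketch using only the standard library (the URL is a placeholder, not the asker's real site):

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal sitemap.xml document for the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        # Each page gets a <url><loc>...</loc></url> entry.
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://thetechblock.example/post-1"])
print(xml)
```

If the file at the submitted URL parses as XML like this and is reachable without redirects, the "pending" state is normally just a matter of waiting.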
Why is Google not picking up my META description? Google itself populates the description. How can I control this search snippet?
Technical SEO | | greyniumseo
Site Crawl
I was wondering if there is a way to use SEOmoz's tools to quickly and easily find all the URLs on your site, not just the ones with errors. The site I am working on does not have a sitemap. What I am trying to do is find all the URLs along with their titles and description tags. Thank you very much for your help.
Technical SEO | | pakevin
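The title/description extraction itself can also be done with a small script once you have the list of pages; here is a sketch of the per-page part using only the standard library, fed one hypothetical page inline:

```python
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Pull the <title> text and meta description out of one HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page source; in practice you would fetch each crawled URL.
html = '<html><head><title>Home</title><meta name="description" content="Welcome"></head></html>'
p = TitleMetaParser()
p.feed(html)
print(p.title, "|", p.description)
```

Run it over every URL a crawler discovers and you have the URL/title/description inventory the question asks for.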
Subdirectories vs subdomains
Hi SEO gurus 🙂 Does anyone have input on which is better? blog.domain.com vs. domain.com/blog, store.domain.com vs. domain.com/store, etc. I think the subdirectory (/xyz) will concentrate authority on the same subdomain, so it should be better? However, sometimes it is tidier on the server to maintain online stores or blogs in a separate structure, so subdomains work better in that sense. I just want to make sure that doesn't affect SEO? Cheers!
Technical SEO | | hectorpn
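To make the two structures being compared concrete, a small classifier shows exactly where the difference lives (in the hostname vs. in the path); the domain and section names are placeholders:

```python
from urllib.parse import urlsplit

def section_layout(url, root="domain.com", section="blog"):
    """Classify whether a site section lives on a subdomain or in a subdirectory.

    root and section defaults are illustrative, not real sites.
    """
    parts = urlsplit(url)
    if parts.netloc == f"{section}.{root}":
        return "subdomain"
    if parts.netloc == root and parts.path.strip("/").split("/")[:1] == [section]:
        return "subdirectory"
    return "other"

print(section_layout("http://blog.domain.com/post"))
print(section_layout("http://domain.com/blog/post"))
```

The subdirectory form keeps the section on the same hostname, which is the basis of the "concentrates authority" argument in the question.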
Follow-up from http://www.seomoz.org/qa/discuss/52837/google-analytics
Ben, I have a follow-up question from our previous discussion at http://www.seomoz.org/qa/discuss/52837/google-analytics. To summarize, to implement what we need, we need to do three things:
1) Add GA code to the Darden page:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.darden.virginia.edu']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
2) Change the "Apply Now" links on the Darden page (http://www.darden.virginia.edu/web/MBA-for-Executives/) from plain anchors pointing at https://darden-admissions.symplicity.com/applicant into links that also carry onclick="_gaq.push(['_link', 'https://darden-admissions.symplicity.com/applicant']); return false;"
3) Have Symplicity add this code:
_gaq.push(['_setAccount', 'UA-12345-1']);
_gaq.push(['_setAllowLinker', true]);
_gaq.push(['_setDomainName', '.symplicity.com']);
_gaq.push(['_setAllowHash', false]);
_gaq.push(['_trackPageview']);
Our CMS does not allow the user to add onclick to the link, so we CANNOT do part 2). What will the result be if we implement only 1) and 3)? Will the data still be fed to GA account 'UA-12345-1'? If not, how can we get cross-domain tracking if we cannot change the link code? Nick
Technical SEO | | Darden