Google Webmaster Tools vs SEOmoz Crawl Diagnostics
-
Hi guys, I was just looking over my weekly report and crawl diagnostics. What I've noticed is that the data gathered by SEOmoz is different from Google Webmaster Tools. The number of errors, in particular duplicate page titles, duplicate content, and pages not found, is much higher than what Google Webmaster Tools reports. I'm a bit confused and don't know which data is more accurate. Please help!
-
I had this on the last crawl. There was no mention of it in the crawl before, and I haven't changed anything in between, so could it be a glitch?
-
I've been having similar issues, but more specifically SEOmoz is reporting duplicate content that I cannot even locate manually. I'm hoping to see something different with the next crawl, but it has been pretty confusing over the last couple of crawls.
-
I always make sure to manually check stuff on a regular basis as well as use tools. Tools are great, but like Steve said this is always going to be the case with software, especially in an industry as unregulated as ours.
Try to use the tools as a means of directing your attention to items that need it, rather than trusting them implicitly and using them as a checklist.
And again, Steve is correct in that Google Webmaster Tools (as great as it is) can be a little dated. It will sometimes show errors that were cleaned up a while ago. And keep in mind that the SEOmoz tools are created within the context of SEO interests: Google will just show technical errors, while SEO tools will show those plus other warnings, based on experience, about things that will impact your rankings.
-
This will always be the case... there are different sources, and different interpretations of what constitutes an error. I wouldn't go thinking that just because GWT is Google's product it must be the most accurate, though. Google has a lot of other business to be thinking about and is unlikely to be too concerned if everything isn't perfect in its free GWT product, so I doubt it takes priority over their other services. Whereas Moz, of course, is more focused on that particular area. I would just combine the two and iron out as much as you can from each... you'll never get them to match up exactly, though.
Related Questions
-
Google's ability to crawl AJAX rendered content
I would like to make a change to the way our main navigation is currently rendered on our e-commerce site. Currently, all of the content that appears when you click a navigation category is rendered on page load. This accounts for a large portion of every page visit's bandwidth, and the images are downloaded even if a user never opens the navigation. I'd like to change it so the content appears and is downloaded only if the user clicks on it; I'm planning on using AJAX. In that case the content wouldn't automatically be in the page source on load (which may or may not mean Google would crawl it). As we already provide a sitemap.xml for Google, I want to make sure this change would not adversely affect our SEO. As of October this year, the Webmaster AJAX crawling documentation has been deprecated. While the new version does say that Google's crawlers are smart enough to render AJAX content, something I've tested, I'm not sure if that only applies to content injected on page load as opposed to on click, which is what I'm planning to do.
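For illustration, here is a minimal sketch of the on-click loading pattern described above. The /nav/ endpoint and the class names are hypothetical, and whether Googlebot would render content fetched this way is exactly the open question.

```javascript
// Minimal sketch: fetch a navigation category's content only when it is clicked.
// The /nav/:category endpoint and the .nav-category / .nav-flyout markup are
// hypothetical; they only illustrate the on-click loading pattern.
document.querySelectorAll('.nav-category').forEach((trigger) => {
  trigger.addEventListener('click', async () => {
    const flyout = trigger.querySelector('.nav-flyout');
    if (flyout.dataset.loaded === 'true') return; // only fetch each category once

    const response = await fetch(`/nav/${trigger.dataset.category}`);
    flyout.innerHTML = await response.text(); // images inside now download on demand
    flyout.dataset.loaded = 'true';
  });
});
```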
Technical SEO | znotes
Should I resubmit a 301 redirected domain in Webmaster Tools
We recently switched a .com site over to a new server. The .com site previously had a .co.uk domain redirecting to it, but when the switchover happened, the .co.uk was forgotten about. We have now realised what happened, but not before taking a hit to our rankings. The .co.uk is still indexed in Google, and now that we have sorted the redirects they are pointing to the right places. My question now: is there anything further I need to do? I know that the .co.uk will soon be removed from the SERPs, but I just want to make sure I haven't forgotten anything.
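For reference, a minimal sketch of the kind of host-level 301 redirect described above, written as Node/Express middleware purely for illustration; the hostnames are placeholders, and the actual redirects could just as well live in the web server or hosting configuration.

```javascript
const express = require('express');
const app = express();

// Minimal sketch (placeholder hostnames): send every request that still arrives
// on the old .co.uk host to the same path on the .com domain with a 301 redirect.
app.use((req, res, next) => {
  if (req.hostname === 'www.example.co.uk') {
    return res.redirect(301, `https://www.example.com${req.originalUrl}`);
  }
  next(); // requests already on the .com host continue as normal
});
```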
Technical SEO | Ben_Malkin_Develo
Dealing with 410 Errors in Google Webmaster Tools
Hey there! (Background) We are doing a content audit on a site with thousands of articles, some going back to the early 2000s. Some of the content was duplicated from other sites, has no external links pointing to it, and gets little or no traffic. As we weed these pages out we set them to 410, to let the Goog know that this is not an error: we are getting rid of them on purpose, and Google should drop them too. As expected, we now see the 410 errors in the Crawl report in Google Webmaster Tools. (Question) I have been going through and "Marking as Fixed" in GWT to clear these pages out of my console, but I am wondering if it would be better to just ignore them and let them clear out of GWT on their own. They are "fixed" in the 410 sense I intended, but I am betting Google means "fixed" as in the pages now return a 200 (if that makes sense). Any opinions on the best way to handle this? Thx!
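For illustration, a minimal sketch of serving 410 Gone for retired articles, written as Express middleware with a made-up list of slugs; the route pattern and slug list are assumptions, not the poster's actual setup.

```javascript
const express = require('express');
const app = express();

// Slugs the content audit flagged as intentionally removed (placeholder values).
const retiredSlugs = new Set(['copied-2003-article', 'thin-affiliate-roundup']);

// Answer retired article URLs with 410 Gone rather than 404, signalling that the
// removal is deliberate and the URL should be dropped from the index.
app.get('/articles/:slug', (req, res, next) => {
  if (retiredSlugs.has(req.params.slug)) {
    return res.status(410).send('This article has been permanently removed.');
  }
  next(); // everything else falls through to the normal article handler
});
```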
Technical SEO | CleverPhD
Google Webmaster Tools: MESSAGE
Dear site owner or webmaster of http://www.enakliyat.com.tr/,
Some of your site's pages may be using techniques that do not comply with Google's Webmaster Guidelines.
In particular, your site appears to include low-quality pages that do not provide much unique value (for example thin affiliate pages, doorway pages, or automatically generated or copied content). For more information about unique and compelling content, visit http://www.google.com/support/webmasters/bin/answer.py?answer=66361.
We recommend that you make the necessary changes so that your site meets Google's quality guidelines. After making these changes, please submit your site for reconsideration in Google's search results.
If you have questions about how to resolve this problem, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team
After this message we found our low-quality pages and added those URLs to robots.txt. Other than that, what can we do? Our site is a home-to-home moving listing portal: consumers who want to move fill out a form so that moving companies can quote prices. We were generating the listing page URLs from the titles submitted by customers.
Technical SEO | iskq
Google Places Reviews
Has anyone had any delays in Google+ reviews showing up? We have multiple clients who have not received a new review in over two months. These are good accounts with good Zagat scores and 15+ good reviews from real customers. Our clients have asked their customers and confirmed that reviews have been left recently; however, no new reviews have shown up in the past 60+ days.
Technical SEO | CaseyKluver
Webmaster Tools finding phantom 404s?
We recently (three months now!) switched a site over from .co.uk to .com, and all old URLs are redirecting to the new site. However, Google Webmaster Tools is flagging up hundreds of 404s from the old site, yet it doesn't report where the links were found, i.e. there is no data in the 'Linked From' tab, and the old links are not in the sitemap. SEOmoz crawls do not report any 404s. Any ideas?
Technical SEO | Switch_Digital
Recent Webmaster Tools Glitch Impacting Site Quality?
The ramifications of this would not be specific to me, but to anyone with this type of content on their pages... Maybe someone can chime in here, but I'm not sure how much, if at all, site errors (for example 404 errors) reported by Google Webmaster Tools are seen as a factor in site quality, which would impact SEO rankings. Any insight on that alone would be appreciated. I've noticed some fairly new, weird stuff going on in the WMT 404 error reports. It seems as though their engine is finding objects within the source code of the page that are NOT links but look like URLs, then trying to crawl them and reporting them as broken. I've seen a couple of different cases in my environment that seem to trigger this issue. The easiest one to explain is Google Analytics virtual pageview JavaScript calls, where for example you might send a virtual pageview back to GA for clicks on outbound links. So in the source code of your page you would have something like: onclick="_gaq.push(['_trackPageview', '/outboundclick/www.othersite.com']);" Although this is obviously not a crawlable link, sure enough Webmaster Tools now reports the following broken page with a 404: www.mysite.com/outboundclick/www.othersite.com I've seen other such cases of things that look like URLs, but are not actual links, being pulled out of the page source and reported as broken links. Has anyone else noticed this? Do 404 instances (in this case false ones) reported by Webmaster Tools impact site quality rankings and SEO? Interesting issue here, I'm looking forward to hearing some people's thoughts on this. Chris
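For context, a minimal self-contained sketch of the kind of classic Google Analytics (_gaq) outbound-click tracking described above; the '/outboundclick/...' path is a virtual pageview rather than a real page, which is why anything that lifts it out of the markup and crawls it reports a 404. The `a.outbound` selector is an assumption for illustration.

```javascript
// Minimal sketch of classic GA (_gaq, loaded by ga.js) outbound-click tracking.
// The virtual path '/outboundclick/<host>' never exists as a real URL, so a tool
// that scrapes URL-like strings out of the source and crawls them sees a 404.
document.querySelectorAll('a.outbound').forEach((link) => {
  link.addEventListener('click', () => {
    _gaq.push(['_trackPageview', '/outboundclick/' + link.hostname]);
  });
});
```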
Technical SEO | cbubinas
Site Crawl
I was wondering if there is a way to use SEOmoz's tool to quickly and easily find all the URLs on your site, and not just the ones with errors. The site that I am working on does not have a sitemap. What I am trying to do is find all the URLs along with their titles and description tags. Thank you very much for your help.
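In case it helps, a minimal sketch of a home-grown same-site crawler that lists each URL with its title and meta description; the start URL is a placeholder, it assumes Node 18+ for the global fetch, and the regex parsing is a shortcut for illustration rather than a robust HTML parser.

```javascript
// Minimal same-site crawler sketch: print URL, <title>, and meta description for
// every internal page reachable from the start URL (placeholder domain).
const START = 'https://www.example.com/';
const seen = new Set([START]);
const queue = [START];

async function crawl() {
  while (queue.length > 0) {
    const url = queue.shift();
    const html = await (await fetch(url)).text();

    const title = (html.match(/<title>([\s\S]*?)<\/title>/i) || [])[1] || '';
    const description =
      (html.match(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i) || [])[1] || '';
    console.log(`${url}\t${title.trim()}\t${description}`);

    // Queue same-site links that haven't been visited yet.
    for (const [, href] of html.matchAll(/href=["']([^"'#]+)["']/g)) {
      const next = new URL(href, url).href;
      if (next.startsWith(START) && !seen.has(next)) {
        seen.add(next);
        queue.push(next);
      }
    }
  }
}

crawl().catch(console.error);
```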
Technical SEO | pakevin