Discrepancy between SeoMoz and Google Webmaster Tools
-
SeoMoz reports over 70 4xx client errors on my site, but Google Webmaster Tools does not report any broken links. There aren't any broken links on any of the pages it is reporting. Could there be another reason for the 4xx errors besides broken links?
-
I would manually visit all the pages that apparently have 4xx errors, and use a tool to extract the response headers.
You may have a similar problem to one I had. Due to the way a legacy site I worked with was built, there were places where the response header was set manually. I actually ended up with 404 pages returning 200 and vice versa.
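To check this at scale rather than page by page, a minimal sketch (Python standard library only; the helper names are my own, not from any Moz tool) that fetches the actual status code and flags pages whose visible content disagrees with their header:

```python
# Sketch: detect pages whose HTTP status and visible content disagree,
# e.g. a "Page not found" body served with a 200 header, or vice versa.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def fetch_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code the server actually sends for `url`."""
    req = Request(url, method="HEAD")  # HEAD avoids downloading the body
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except HTTPError as err:
        # urllib raises on 4xx/5xx; the status code is still available
        return err.code

def looks_mislabelled(status: int, body_says_not_found: bool) -> bool:
    """Flag a page when its status code and its content disagree:
    a 200 with a not-found body, or a 4xx with a normal-looking body."""
    return (status == 200) == body_says_not_found
```

Run `fetch_status` over the list of URLs SeoMoz flags; any page that renders fine in a browser but returns a 4xx here is exactly the kind of manually-set header problem described above.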
Related Questions
-
Keyword Cannibalization vs. Optimizing Site
I am in the process of optimizing our website and I am having a hard time reconciling two best practices I have found on Moz: 1. You should avoid having multiple pages focus on the same keyword, because you will lose some control over which result will show. 2. You should identify your core keywords and weave them multiple times (naturally) throughout your site. I have spent months identifying our top 7 keywords and am working through the site now. The first piece of advice keeps giving me pause. Can anyone weigh in with other considerations or advice on how I can reconcile these two strategies? Thank you
On-Page Optimization | | NikCall2 -
How long does Google take to reduce the index size?
A few months ago we incorporated our custom search in our website www.ergodotisi.com. We hadn't been paying much attention to our Webmaster Tools analytics, only to find out a few months later that the Google index had grown from 2K-3K pages to one million, because it was crawling all combinations of search filters. We have now followed the right instructions to add noindex meta tags and blocked most search result pages via robots.txt. We allow indexing of some main categories by setting new SEO-friendly URL structures. A few weeks have passed and the index size has only been reduced to 700K. How long does it take before Google removes most of the duplicated search result pages from the index? Is it still crawling those pages but has not fully decided to remove most of them? How bad is this for SEO?
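For reference, the two mechanisms described above look roughly like this (a sketch; the `/search` path is a hypothetical example, not the site's actual filter URL):

```
# robots.txt — stop crawlers from fetching filtered search-result URLs
User-agent: *
Disallow: /search

<!-- on pages that may be crawled but should stay out of the index -->
<meta name="robots" content="noindex, follow">
```

One caveat worth noting: Google can only see a noindex tag on pages it is allowed to crawl, so pages that are both blocked in robots.txt and tagged noindex can linger in the index longer, since the crawler never revisits them to discover the tag.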
On-Page Optimization | | cplastiras840 -
Long-tail with few searches vs. Generic with many
Our business is a contract packager/manufacturer of products sold to very prominent brands who sell through retail. For example, we make the sunscreen under a brand’s name, which you might then find on the shelf in Target or CVS. As I’ve optimized our pages, I’ve attempted to go long-tail, which has been simply to add “…contract packaging” or a variation after the particular product. So, instead of trying to compete in “sunscreen”, which would pit me against big-box distributors and prominent brands and sellers of sunscreen, I’ve optimized for “sunscreen manufacturers.” “Sunscreen” has 31K – 72K searches, with an 81 Difficulty and 67 Potential. “Sunscreen manufacturers” has a low 13 Difficulty and a decent 54 Potential, but only 51 – 100 searches. Some of my terms have only 0 – 10 searches, but I’ve been thinking that it’s better to compete for fewer but more qualified / buyer-intent searches and have generally lower Difficulty. Can you please tell me if this is a smart strategy, or if I should instead try to compete in higher-volume terms but much greater Difficulty? Thanks a lot for everyone's help.
On-Page Optimization | | Beau_W0 -
Google search result dramatically dropped with drop in DA.
It looks like on 11/13 my site traffic dropped by about 75%, and it happens to coincide with the Moz DA dropping too. Anyone else see this?
On-Page Optimization | | Motom70 -
Google index new data from my website page
Hi All, We create pages for movie reviews a few weeks beforehand; on those pages we add value by including the movie's cast and crew info and whatever other info is available before the movie releases. Once the movie releases, we watch it and write a review of 500+ words. The issue is that the pages are indexed a week before. How can I have these review pages re-crawled immediately once the complete review is up? The review content is not indexed for 3 to 5 days, and the first day or two is when it is most important for the reviews to be seen in Google. Regards
On-Page Optimization | | AlexisWithers0 -
What's the extent of the penalization applied by Google?
Hi! I still don't get this website penalization applied by Google due to duplicate content. My site has many pages that were among the first positions for top keywords (a Photoshop website). Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well-known blogs. After mid-February 2012 everything went down the drain. I lost half of my traffic and my well-ranked pages are now almost nowhere to be found. I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages. So, the good-quality pages my site has are no longer considered good quality just because I have some duplicate content or ad-filled pages? I'm not complaining; I'm trying to understand this. Google needs to serve good information to its visitors. But since they found some trash on my site, they decide to remove both the trash and the good information from the search engine? That doesn't sound logical to me. Why don't they just remove the trash and leave the good content? Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the ranking sounds more like a penalty to me. Again, I'm not complaining (although it sounds like I am!), just want to understand the reasons behind this. Thanks, Enrique
On-Page Optimization | | enriquef0 -
Keyword vs Brand Domain Name
Hi guys, I'm about to launch a new site for a friend who is an accountant in a specialist field. He's already bought 2 domains: **www.[keyword]-accountants.net** and **www.[brand]accountants.com**. We have made the decision to use the brand domain to host the site, but what can we do with the keyword domain, as exact-match domains still seem to be ranking well in the SERPs? e.g. build keyword links to the keyword domain (heavily SEO'd content) and brand links to the brand domain (conversion-optimised content), then after a while 301 the keyword domain? Any suggestions will be gratefully received!
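If you do eventually 301 the keyword domain, a domain-wide redirect on Apache looks roughly like this (a sketch with the placeholder domain names from the question; assumes mod_rewrite is enabled):

```
# .htaccess on the keyword domain (hypothetical names)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?keyword-accountants\.net$ [NC]
RewriteRule ^(.*)$ http://www.brandaccountants.com/$1 [R=301,L]
```

The `$1` back-reference preserves the path, so each page on the keyword domain redirects to its counterpart on the brand domain rather than everything landing on the homepage.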
On-Page Optimization | | Tman30 -
Paid CTR Vs Organic CTR for high ranked terms - say average position 2 in Adwords and average position of 4 in Google SERP
Consider a situation where we are getting 5,000 impressions for a term in AdWords and Bing/Yahoo, and the same term with the same landing page is ranked within the top 5 positions of Google and Bing search. If we get a 2.00% CTR in paid (AdWords and MSN average), what would be an acceptable organic CTR, as reported in Webmaster Tools? Apart from the title and description, what other areas need improvement to increase organic CTR?
On-Page Optimization | | gmk15670