Why aren't certain links showing in SEOMOZ?
-
Hi, I have been trying to understand our PageRank and the domains that are linking to us. When I look at the list of linking domains, I see that some bigger ones are missing and I don't know why. For example, we are in the Yahoo Directory with a link to trophycentral.com, but SEOMOZ is not showing the link. If SEOMOZ is not seeing it, my guess is that Google is not either, which concerns me. There are several other high-PageRank domains also not showing. Does anyone have any idea why? Thanks!
BTW, our domain is trophycentral.com
-
Thanks - I just looked up our site (www.trophycentral.com) on Google and see many of the missing links. My guess is that it is mostly a timing issue. SEOMOZ is still helpful because I can see the ratings and overall estimate, so I guess over time I will look at a few sites.
-
This is exactly what I do, and I have found the same result. In fact, at the moment well over half of my links don't show in SEOMOZ.
-
Thanks! I checked Google and most of them are there! ... Neil.
-
SEOMoz's link data is the best available, in my opinion, but you may want to consider looking at other sources as well, like Majestic SEO and Google Webmaster Tools, to supplement the SEOMoz data. Sometimes these other sources find links that SEOMoz doesn't have in its index.
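For illustration, here's a minimal sketch of how you might cross-check two exports, assuming each tool can export its backlinks as a CSV with the linking URL in a column named URL (the file names and column name here are hypothetical):

```python
import csv
from urllib.parse import urlparse

def linking_domains(csv_path, url_column="URL"):
    """Return the set of linking root domains found in a backlink CSV export."""
    domains = set()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            host = urlparse(row[url_column]).hostname or ""
            # Crude root-domain guess: keep the last two labels.
            # (This won't handle ccTLDs like .co.uk correctly.)
            if host:
                domains.add(".".join(host.split(".")[-2:]))
    return domains

moz = linking_domains("seomoz_links.csv")
majestic = linking_domains("majestic_links.csv")

print("In Majestic but not SEOMoz:", sorted(majestic - moz))
print("In SEOMoz but not Majestic:", sorted(moz - majestic))
```

The domains that show up in only one set are the ones worth investigating by hand.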
-
Thank you!
-
Thank you!
-
SEOMoz's database of links isn't exhaustive, and it is only updated once a month or so, so I wouldn't sweat it too much if the links aren't showing up in there.
You can check backlinks in Google Webmaster Tools. Personally, I would use WMT over SEOMoz for this kind of info; after all, it's Google's data that determines your rankings.
-
SEOMOZ crawls and updates their link index on a schedule you can see here:
http://apiwiki.seomoz.org/w/page/25141119/Linkscape Schedule
If this is a fairly new link, it may not have gotten indexed yet.
Also, note that PageRank may have little or nothing to do with MozRank. Toolbar PageRank is notoriously inaccurate and can be up to 6 months old. Plus, toolbar PageRank doesn't reflect all the factors Google uses in its PR calculation. And (as if that weren't enough) toolbar PR is on a logarithmic scale, a sort of Richter scale - so a jump from, say, 4 to 5 could reflect a HUGE change, or a tiny one.
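To illustrate the logarithmic point with a toy example (the base is purely hypothetical; Google has never published it):

```python
# Hypothetical figure: treat each toolbar PR point as roughly 8x more
# underlying "link value". The real base is not public; 8 is only a
# commonly guessed number used here for illustration.
BASE = 8

for pr in range(3, 7):
    print(f"PR {pr} ~ {BASE ** pr:>7,} units of link value")

# On this scale, a site just past the PR 5 threshold has ~8x the value of
# one just past PR 4 - yet a site could triple its underlying value and
# still display the same toolbar number.
```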
We normally focus on MozRank, instead.
Related Questions
-
My Homepage Won't Load if JavaScript is Disabled. Is this an SEO/Indexation issue?
Hi everyone, I'm working with a client who recently had their site redesigned. I'm just going through to do an initial audit to make sure everything looks good. Part of my initial indexation audit goes through questions about how the site functions when you disable JavaScript, cookies, and/or CSS. I use the Web Developer extension for Chrome to do this. I know that, more recently, people have said that content loaded by JavaScript will be indexed. I just want to make sure it's not hurting my client's SEO. http://americasinstantsigns.com/ Is it as simple as looking at Google's cached URL? The URL is definitely being indexed, and when looking at the text-only version everything appears to be in order. This may be an outdated question, but I just want to be sure! Thank you so much!
Technical SEO | ccox10
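A minimal sketch of one way to spot-check the question above, assuming the Python `requests` library: fetch the raw HTML, which is roughly what a crawler that doesn't execute JavaScript receives, and look for a phrase that should be in the page copy (the phrase here is hypothetical):

```python
import requests

URL = "http://americasinstantsigns.com/"
# Hypothetical phrase you expect to see in the rendered page copy.
KEY_PHRASE = "instant signs"

# A plain GET returns the raw HTML, before any JavaScript runs.
html = requests.get(URL, timeout=10).text

if KEY_PHRASE.lower() in html.lower():
    print("Phrase found in raw HTML - visible without JavaScript.")
else:
    print("Phrase missing - it may only be injected by JavaScript.")
```
-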
Can I redirect a link even if the link is still on the site?
Hi Folks, I've got a client who has a duplicate content problem because they actually create duplicate content and store the same piece of content in 2 different places. When they generate this duplicate content, it creates a 2nd link on the site going to the duplicate content. Now they want the 2nd link to always redirect to the first link, but for architecture reasons, they can't remove the 2nd link from the site navigation. We can't use rel=canonical because they don't want visitors going to that 2nd page. Here is my question: are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page? I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change. So, what are your thoughts? Thanks!
Technical SEO | Rock330
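For illustration, a minimal sketch of the kind of permanent redirect described in the question, using Flask purely as a stand-in for whatever the client's platform actually is (the routes are hypothetical):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical URLs: /articles/copy-of-widget is the generated duplicate,
# /articles/widget is the primary page visitors should land on.
@app.route("/articles/copy-of-widget")
def duplicate_article():
    # 301 = permanent: both browsers and crawlers are sent to the primary URL.
    return redirect("/articles/widget", code=301)

@app.route("/articles/widget")
def primary_article():
    return "Primary article page"
```
-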
What should I do with a large number of 'pages not found'?
One of my client sites lists millions of products, and hundreds or thousands are de-listed from their inventory each month and removed from the site (no longer for sale). What is the best way to handle these pages/URLs from an SEO perspective? There is no place to use a 301.
1. Should we implement 404s for each one and put up with the growing number of 'pages not found' shown in Webmaster Tools?
2. Should we add them to the robots.txt file?
3. Should we add 'nofollow' to all these pages?
Or is there a better solution? Would love some help with this!
Technical SEO | CuriousCatDigital0
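A minimal sketch of option 1 from the question, using Flask as an illustrative stand-in; serving 410 Gone (rather than plain 404) for de-listed products is one common variant. The product data and route here are hypothetical:

```python
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical stand-in for an inventory lookup (a real site would
# query its product database here).
ACTIVE_SKUS = {"sku-1001", "sku-1002"}

@app.route("/product/<sku>")
def product_page(sku):
    if sku not in ACTIVE_SKUS:
        # 410 Gone signals permanent removal to crawlers;
        # a plain 404 only says "not found right now".
        abort(410)
    return f"Product page for {sku}"
```
-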
Changes to website haven't been crawled in over a month
We redesigned our website at http://www.aptinting.com a few months ago. We were fully expecting the crawl frequency to be very low because we had redesigned the website from a format that had been very static, and that probably has something to do with the problem we're currently having. We made some important changes to our homepage about a month ago, and the cached version of that page is still from April 2nd. Yet, whenever we create new pages, they get indexed within days. We've made a point to create lots of new blog articles and case studies to send a message to Google that the website should be crawled at a greater rate. We've also created new links to the homepage through press releases, guest blog articles, and by posting to social media, hoping that all of these things would send a message to Google saying that the homepage should be "reevaluated". However, we seem to be stuck with the April 2nd version of the homepage, which is severely lacking. Any suggestions would be greatly appreciated. Thanks!
Technical SEO | Lemmons0
-
Google Webmaster Tools doesn't allow me to send 'URL and all linked pages'
Hello! I made a lot of optimization changes to my site (SEO URLs, and a lot more). I always use Google Webmaster Tools' Fetch as Googlebot to refresh my site, but now it doesn't allow me to 'Send URL and all linked pages' - check the attachment. Thank you
Technical SEO | matiw0
-
Ratio of linking C-blocks to linking domains
Hi, Our link building efforts have resulted in acquiring a high number of backlinks from domains within a C-block. We all know Google issues penalties whenever someone's link profile looks unnatural. A high number of backlinks but a low number of linking C-blocks would seem to be one of the reasons to get penalized. Example: we have 6,000 links from 200 linking root domains coming in from 100 C-blocks. At what point should we start to worry about being penalized/giving off an unnatural look to Mr. G?
Technical SEO | waidohuy0
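As a worked version of the arithmetic in the question: a C-block is the first three octets of an IPv4 address, so the domains-per-C-block ratio can be computed from resolved IPs. The domain list below is hypothetical:

```python
import socket

# Hypothetical linking root domains; in practice this list would come
# from a backlink export.
domains = ["example-one.com", "example-two.com", "example-three.com"]

def c_block(ip):
    """First three octets of an IPv4 address, e.g. '192.0.2'."""
    return ".".join(ip.split(".")[:3])

blocks = set()
for domain in domains:
    try:
        blocks.add(c_block(socket.gethostbyname(domain)))
    except socket.gaierror:
        pass  # skip domains that don't resolve

# With the numbers in the question: 200 root domains across 100 C-blocks
# gives a ratio of 2.0 domains per C-block.
print(f"{len(domains)} domains across {len(blocks)} C-blocks")
```
-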
SEOMoz is indicating I have 40 pages with duplicate content, yet it doesn't list the URLs of the pages???
When I look at the Errors and Warnings on my Campaign Overview, I have a lot of "duplicate content" errors. When I view the errors/warnings, SEOMoz indicates the number of pages with duplicate content, yet when I go to view them, the subsequent page says no pages were found... Any ideas are greatly welcomed! Thanks, Marty K.
Technical SEO | MartinKlausmeier0
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed when we drilled down that this is from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain. But it seems that for some time in the past Google had access to crawl some of our sub-domains, but in December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/ The weird part is that the number of external links kept on growing and is now sitting on a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to the robots.txt and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning the robots.txt and re-including all the excluded directories in GWMT, to see if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
moz1.PNG moz2.PNG moz3.PNG
Technical SEO | JacoRoux0
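A minimal sketch of how one might spot-check that the sub-domain redirects behave as described, assuming the Python `requests` library (the URL is the example from the post):

```python
import requests

# Example URL taken from the post above.
url = "http://m.jump.co.za/search/ipod/"

# allow_redirects=False so we can inspect the first response itself
# rather than whatever it eventually redirects to.
resp = requests.get(url, allow_redirects=False, timeout=10)

print(resp.status_code)              # 301 if the old redirect is active
print(resp.headers.get("Location"))  # expected: the www.jump.co.za URL
```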