Total Linking Root Domains
-
I'm still very new to link building and SEO (only one month in!). I have a reasonable starting point, as our website is fairly well designed, but I have a long way to go to catch up in what is a very competitive market (music retail).
My question is regarding Open Site Explorer and its ability to accurately reflect backlinks. Looking at the total number of unique backlink domains, Open Site Explorer currently has me at 36 domains, while Google Webmaster Tools has me at 187! The one site that all my 'watched' competitors have is DMOZ, which gives them a great boost in the SEOmoz and Open Site Explorer rankings. I have 3 links on DMOZ, and they all show up in my Webmaster Tools, yet they don't show in either SEOmoz or Open Site Explorer. I've left it for a couple of months to see if it caught up, but there's been no real change.
My worry is that there seems to be little value in the SEOmoz tools if they're not at least as complete as Google Webmaster Tools, so that I can make a fair comparison! Is there anything wrong with my setup on SEOmoz?
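To make the gap between the two reports concrete, one approach (a minimal sketch; the sample domains below are made up, and in practice the lists would come from each tool's CSV export) is to normalize the linking root domains from each tool and diff the sets:

```python
def load_domains(lines):
    """Normalize an iterable of domain strings: strip whitespace,
    lowercase, and drop a leading 'www.' so the two exports compare
    apples to apples."""
    return {
        d.strip().lower().removeprefix("www.")
        for d in lines
        if d.strip()
    }

# Hypothetical sample exports standing in for the real CSV downloads.
gwt = load_domains(["Example.com", "www.dmoz.org", "musicblog.net"])
ose = load_domains(["example.com", "musicblog.net"])

# Domains Google Webmaster Tools reports that OSE does not.
missing_from_ose = sorted(gwt - ose)
print(f"GWT: {len(gwt)}, OSE: {len(ose)}, missing from OSE: {missing_from_ose}")
```

The resulting list shows exactly which linking domains one index has seen and the other hasn't, which is far more actionable than comparing the two raw totals.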
-
Your link is 5 clicks away from the home page, which, for DMOZ, should be sufficient for the link to be discovered. I checked AHREFS and Raven, but neither shows the link either, which leads me to suspect this link is new. Can you confirm the date the link was first created?
I checked some other links from the same DMOZ page, and they are not being seen by OSE either, though they do appear in AHREFS and Raven. I have therefore opened a ticket with the SEOmoz help desk to investigate this issue.
Hopefully, the link will appear after the next update on 5/29.
-
This is the link:
If anyone has an insight let me know!
-
Across the board, SEO tools can benefit from significant improvements, including the SEOmoz tool set. I was excited to learn about SEOmoz's recent $18 million funding round. I am hopeful a significant portion of that budget will be allocated to developing and improving the SEO tools.
With that said, I do not anticipate any SEO tool being able to keep up with the massive number and variety of changes made by search engines. Just look at the past year. The following changes were HUGE: Panda, G+, Local, Bing taking over Yahoo search, HTML5, a massive increase in mobile usage, Penguin, and much more.
I too am seeking better tools and if you find them please share them with the community.
With respect to your DMOZ link, was it earned in the past 60 days? If it is older than 90 days and has not shown up in OSE, it is unlikely to appear after the next update. That's about all I can share without examining the link.
-
Regarding the DMOZ depth: my link is no deeper than my competitors', and in some cases less deep, yet theirs show in Open Site Explorer and SEOmoz. The only reason I highlight that particular link is that it has reasonable value and my competitors have it in their link profiles in Open Site Explorer.
I'm guessing my only choice is to wait another month to see if it appears.
I just want to state that I wasn't trying to rubbish SEOmoz as such; it's a fantastic website with excellent content, invaluable for a new starter. I was just finding it difficult to make a valid comparison, particularly when there was so much disparity between the Google Webmaster Tools results and Open Site Explorer.
-
All of the SEOmoz tools are based on the Linkscape crawl of the web. The crawler takes 2-3 weeks to crawl the web and then another 1-2 weeks to process the data. The SEOmoz tools and Open Site Explorer (OSE) are updated about once per month; the next update is scheduled for May 29th. You can view the update schedule here: https://seomoz.zendesk.com/entries/345964-linkscape-update-schedule. In short, the backlinks you see will always be 1-2 months behind.
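The timeline above can be turned into a back-of-the-envelope estimate of when a new link could first surface. This is only a rough sketch using the approximate cycle lengths mentioned (2-3 weeks to crawl, 1-2 weeks to process); the actual appearance then snaps to the next scheduled monthly index update after that window.

```python
from datetime import date, timedelta

CRAWL_WEEKS = (2, 3)    # approximate time to crawl the web (min, max)
PROCESS_WEEKS = (1, 2)  # approximate time to process the crawl (min, max)

def visibility_window(link_created):
    """Earliest and latest dates a link created on `link_created` could
    have made it through a full crawl-and-process cycle."""
    earliest = link_created + timedelta(weeks=CRAWL_WEEKS[0] + PROCESS_WEEKS[0])
    latest = link_created + timedelta(weeks=CRAWL_WEEKS[1] + PROCESS_WEEKS[1])
    return earliest, latest

# Example: a link earned in mid-April 2012.
created = date(2012, 4, 15)
lo, hi = visibility_window(created)
print(f"Link from {created} completes a cycle between {lo} and {hi}")
```

Under these assumptions, a link earned even a few days before an index update can easily miss that update entirely and only appear a month later, which is why a 1-2 month lag is the normal case rather than a fault.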
You should also know the link profile shown is not 100% complete, nor even 90%. The last I heard, Linkscape crawls roughly the top 40% of web pages. That is generally fine: for the most part, if a link is not visible in OSE, it likely has little to no value.
There are numerous other tools that crawl the web in a similar manner to Linkscape. I have researched many of them and, in my opinion, the next two best crawlers are Raven and AHREFS. All of them share the same issue: they are not complete. The internet comprises tens of millions of websites, and some single sites offer millions of pages, so the total volume of pages to crawl is incredibly high.
"My worry is that there seems little value in the SEOmoz tools if they're not at least as complete as Google Webmaster is to allow me a fair comparison!"
Google had $38 billion in revenue in 2011. They have the ability to index a page in seconds, where it would take SEOmoz at least weeks to capture that same link. Your expectation is that a group of companies whose combined sales are less than 1% of Google's should produce results which are "at least as complete"? It's simply not reasonable.
If you wish, try the Raven and AHREFS tools. Perhaps they will even show a specific link you are looking for, such as the one from DMOZ.
If you can share the URL of the DMOZ link and your site, we can offer a more complete analysis. The problem with DMOZ is the depth of its site. Web crawlers, including Google's, only go so deep into a site. The question is: using your mouse, how many clicks would it take to get from the DMOZ home page to the page containing your link? The design of a site like Wikipedia, for example, is very flat; the design of DMOZ is very deep. The page which contains your link matters a lot.
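"Click depth" here is just the shortest number of link clicks from the home page to the target page, which is a breadth-first search over the site's link graph. A minimal sketch (the page names and toy site structure below are made up for illustration, standing in for a deep directory like DMOZ):

```python
from collections import deque

def click_depth(graph, start, target):
    """Shortest number of clicks from `start` to `target` over a link
    graph (adjacency dict: page -> list of linked pages). Returns None
    if the target is unreachable."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        page, depth = queue.popleft()
        if page == target:
            return depth
        for nxt in graph.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

# Toy deep-directory structure; each category links one level down.
site = {
    "home": ["arts", "business"],
    "arts": ["music"],
    "music": ["retail"],
    "retail": ["your-listing"],
}
print(click_depth(site, "home", "your-listing"))  # 4 clicks deep
```

Crawlers that budget their depth per site may simply never reach pages that sit many clicks down, which is why a link on a deep DMOZ category page can be slow to show up in any third-party index.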