No results with Link Analysis
-
I have been working on a domain since November of last year, and it still shows no improvement in its link analysis metrics. I am baffled, because we have gotten it onto the first page of Google for a few of the keywords we are optimizing. Any help with this is greatly appreciated; I am a noob, so I am definitely open to learning. Thanks in advance to all of you.
Domain in question - www.modernportablerefrigeration.com
The domain is currently on a shared server, if that makes any difference.
Cordially,
Todd Richard
-
Hi Todd,
The metrics you're seeing are from February. OSE is usually updated monthly, but we had technical problems and it's been seven weeks since the last update. There's a new update due on April 27th, and you should see some changes then.
-
Here is a screenshot of what I am talking about. Also, thanks, everyone, for trying to help the noob.
-
Under Link Analysis it shows "1" for Domain Authority and zeros all the way down the rest. Thanks for the feedback, everyone. I guess if everything is working it might not matter, but it is unnerving and annoying to keep building and building with no results in one little area. The funny thing is that every tool except Webmaster Tools shows the 1 for Domain Authority and zeros for the rest, or says the data is unavailable.
-
Open Site Explorer does not have the horsepower to crawl the entire internet the way Google does. With finite resources, OSE focuses on crawling many important sites and avoids many directories and other low-PageRank sites. If the links you have built are lower quality (like many of mine), they may not show up in OSE. That doesn't mean Google doesn't see them, or that they don't help with keyword rankings. Google Webmaster Tools shows four times more linking domains to my site than OSE does, but OSE shows nearly all the important ones.
Hope that helps. Remember that the goal is ranking on Google, not in OSE. OSE is just a tool.
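If you want to put a number on that gap for your own site, here is a minimal Python sketch that compares the linking domains in a Webmaster Tools export against an OSE export. The file names and column headers are assumptions; adjust them to match whatever your actual CSV downloads look like:

    import csv

    def load_domains(path, column):
        # Read one column of linking domains from a CSV export, normalised to lowercase.
        with open(path, newline="") as f:
            return {row[column].strip().lower()
                    for row in csv.DictReader(f) if row.get(column)}

    # Hypothetical export files -- rename to match your own downloads.
    gwt = load_domains("gwt_links.csv", "Domain")
    ose = load_domains("ose_links.csv", "Root Domain")

    print("Webmaster Tools linking domains:", len(gwt))
    print("OSE linking domains:", len(ose))
    print("Seen by Google but missing from OSE:", len(gwt - ose))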
-
Hi Todd,
A site can still rank for the keywords you are optimising for on the strength of on-page SEO alone, if the niche is small enough (or the competing pages have poor on-page SEO).
If you mean that you are appearing on page one but people are not linking back to you organically... that's just life.
If you mean that you have built links and they are not showing up in Open Site Explorer, then either Roger has not yet crawled the pages that carry your links, or the index has not updated (IIRC, OSE's index is refreshed about once a month).
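One quick sanity check while you wait for an index refresh: make sure robots.txt isn't blocking Moz's crawler (rogerbot), since a blocked crawler will never pick up your links. Here is a minimal sketch using only Python's standard library (the URL is Todd's domain from above; swap in your own):

    from urllib.robotparser import RobotFileParser

    # Ask the site's robots.txt whether Moz's crawler (rogerbot) may fetch the homepage.
    parser = RobotFileParser()
    parser.set_url("http://www.modernportablerefrigeration.com/robots.txt")
    parser.read()

    for agent in ("rogerbot", "*"):
        allowed = parser.can_fetch(agent, "http://www.modernportablerefrigeration.com/")
        print(agent, "->", "allowed" if allowed else "blocked")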
-
Which specific metrics are not improving?