Link Diversity
-
With the recent updates in the SEO world, how critical is link diversity? We are revamping our site and planning to add many new pages, and we plan to build links to relevant pages with relevant anchor-text keywords. We are also planning to add relevant H1, H2, and H3 tags, along with meta description tags and keyword-rich content specific to each page. Any advice?
-
Chris Menke, Francisco Meza, Jeepster: thanks for your responses.
-
For god's sakes, if the only page a website can build links to is the home page, then none of the other pages deserve to rank.
I couldn't agree more. Yet I've seen internal pages on high-authority sites rank #1 for very competitive keywords -- despite having no external inbound links and just a handful (4-5) of internal IBLs.
-
Hey Francisco, my understanding is that what you described above is called "deep linking", rather than "link diversity".
Also, Chris, the web could probably do without another keyword-rich page of anything; certainly we users could. Rather, think creative, think engaging, think audience, think business objective when you put your pages together.
-
I partially agree with Chris. However, links need to be distributed to different landing pages. For god's sakes, if the only page a website can build links to is the home page, then none of the other pages deserve to rank.
-
Link "diversity" describes the degree to which a site's backlink profile is spread among sites on different IP addresses, as opposed to all of the links coming from a single site or IP address. Link diversity is important because a site with any number of links would most naturally get them from a variety of sites on different IP addresses. If all the links come from one domain or one IP address, there is the appearance that manipulation may be taking place.
It is believed that Google's algorithm can throttle the value of a site's backlinks if their diversity is inconsistent with the pattern of links displayed by similar sites.
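To make that definition concrete, here is a minimal Python sketch of how you might measure the spread yourself. The URLs, IP addresses, and the `diversity_report` helper are all hypothetical; real link indexes score diversity far more elaborately, but the shape of the calculation is the same.

```python
from collections import Counter
from urllib.parse import urlparse

def diversity_report(backlinks):
    """Summarise how spread out a backlink profile is.

    `backlinks` is a list of (source_url, source_ip) pairs, e.g. exported
    from a link-index tool.
    """
    domains = Counter(urlparse(url).netloc for url, _ip in backlinks)
    ips = Counter(ip for _url, ip in backlinks)
    total = len(backlinks)
    return {
        "total_links": total,
        "unique_domains": len(domains),
        "unique_ips": len(ips),
        # share of links that come from distinct referring domains
        "diversity_ratio": len(domains) / total if total else 0.0,
    }

# Hypothetical profile: two links from the same blog/IP, two from elsewhere.
profile = [
    ("http://blog-a.example/post", "203.0.113.10"),
    ("http://blog-a.example/other", "203.0.113.10"),
    ("http://news-b.example/story", "198.51.100.7"),
    ("http://forum-c.example/thread", "192.0.2.44"),
]
report = diversity_report(profile)
```

A profile where the ratio sits near 1.0 looks naturally earned; a ratio near zero (hundreds of links, one referring domain) is the pattern the answer above describes as looking manipulated.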
Related Questions
-
Bing search results - Site links
My site links in Bing search results are pulling through the footer text instead of the meta description (see image). Is there any way of controlling this?
Technical SEO | RWesley0
-
Unnatural Links on Forum Posts
Google responded to my reconsideration request. They gave me only about 2 of the many unnatural links, and these are actually people mentioning our website in their conversations. How can that be unnatural, when legitimate people are discussing our website's services? And even if it is unnatural, how can I possibly remove a backlink from a forum post?
Technical SEO | Droidman860
-
Toxic Link Removal
Greetings Moz Community: Recently I received a site audit from a Moz-certified SEO firm. The audit concluded that technically the site did not have major problems (unique content, good architecture), but it identified a high number of toxic links. Out of approximately 1,300 links, about 40% were classified as suspicious, 55% as toxic, and 5% as healthy.

After identifying the specific toxic links, the SEO firm wants to submit a disavow request to Google, then manually request that the links be removed, and then submit a final disavow request for the remaining bad links. They believe they can get about 60% of the bad links removed. Only after the removal process is complete do they think it would be appropriate to start building new links.

Is there a risk that this strategy will result in a drop in traffic with so many links removed (even if they are bad)? To me (and I am a novice) it would seem more prudent to build links at the same time the toxic links are being removed. According to the SEO firm, the value of the new links in Google's eyes would be reduced if there were many toxic links pointing to the site, so that approach would be a waste of resources. While I want to move forward efficiently, I absolutely want to avoid any risk of a drop in traffic. I might add that I have not received any messages from Google regarding bad links, but my firm did engage in link building in several instances and our traffic did drop after the Penguin update of April 2012.

Also, is there value in having a professional SEO firm remove the links and build new ones, or is this something I can do on my own? I like the idea of having a pro take care of it, but the costs (audit, coding, design, content strategy, local SEO, link removal, link building, copywriting) are really adding up. Any thoughts??? THANKS,
Alan
Technical SEO | Kingalan1
-
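For what it's worth, the disavow file the firm would submit is just plain text: "#" comment lines, "domain:" entries to disavow an entire domain, and bare URLs to disavow single pages. Here's a small Python sketch that assembles one; the `build_disavow_file` helper and the example domains/URLs are hypothetical, not from any real audit.

```python
def build_disavow_file(toxic_domains, toxic_urls, note=""):
    """Build the text of a Google disavow file.

    Format: '#' starts a comment line, 'domain:example.com' disavows a
    whole domain, and a bare URL disavows a single page.
    """
    lines = []
    if note:
        lines.append("# " + note)
    # Deduplicate and sort so diffs between submissions stay readable.
    lines.extend("domain:" + d for d in sorted(set(toxic_domains)))
    lines.extend(sorted(set(toxic_urls)))
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    ["spammy-directory.example", "link-farm.example"],
    ["http://old-partner.example/footer-links.html"],
    note="Toxic links flagged in link audit",
)
```

The resulting text is what gets uploaded through Google's disavow tool; whether to disavow, manually remove, or both is the judgment call discussed above.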
Cross links between sites
Hi, we have several ecommerce sites and we cross-linked 3 of them by mistake. We realized the sites were linked through WMT. We shut down 2 of the sites about 2 months ago, but WMT still shows the links coming from those 2 sites. How do we make sure that Google sees the sites are shut down? Is there a better way of resolving this issue? We are no longer using those sites, so we do not need them to be active. What's the best solution to show Google that the links are no longer there? The crawler shows that it was able to crawl the site 45 days after it was shut down. Thanks, Nick
Technical SEO | orion680
-
Press release not giving me my link juice
The other day we released a press release; see it here: http://www.businesswire.com/news/home/20120717006087/en/Rapid7-Metasploit-Pro-Increases-Vulnerability-Management-Efficiency. I asked them to include two links (seen in the first paragraph) with targeted anchor text (vulnerability management and penetration testing). The press release was published, but when I check Open Site Explorer to see if I got any link juice from it, I am not seeing the link... ugh. I noticed that they are using some sort of tracking code that seems to be redirecting the link; is this the problem? I talked to our sales rep at Businesswire and he told me that they could take the code off if that is what needs to be done. Do you have any insight into this, or have you ever run into this problem?
Technical SEO | PatBausemer0
-
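One quick way to check this yourself is to inspect the anchors in the published markup: a link whose href is wrapped in a click-tracking redirect, or that carries rel="nofollow", generally won't pass equity to your page. Below is a rough stdlib-only Python sketch; the `AnchorAudit` class, the sample HTML, and the tracker hostname are all hypothetical stand-ins, not Businesswire's actual markup.

```python
from html.parser import HTMLParser

class AnchorAudit(HTMLParser):
    """Collect each anchor's href and flag links that likely pass no
    equity: rel=nofollow, or an href pointing at a tracking redirect
    instead of the target page itself."""

    def __init__(self, tracking_hosts):
        super().__init__()
        self.tracking_hosts = tracking_hosts
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        href = a.get("href", "")
        self.findings.append({
            "href": href,
            "nofollow": "nofollow" in a.get("rel", ""),
            "tracked": any(h in href for h in self.tracking_hosts),
        })

# Hypothetical press-release markup: one tracked/nofollow link, one direct.
html = (
    '<p><a rel="nofollow" href="http://tracker.example/click?u=http://site.example/vm">'
    'vulnerability management</a> and '
    '<a href="http://site.example/pen-testing">penetration testing</a></p>'
)
audit = AnchorAudit(tracking_hosts=["tracker.example"])
audit.feed(html)
```

If the real release's anchors look like the first link here, asking the wire service to strip the tracking code (as the sales rep offered) is the right move.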
Canonical Link Question
I wrote an article that will live as a static page, but it would also make a very good blog post. So my question is twofold: 1. If I post it as a static page and syndicate it as a blog post with a canonical link pointing to the page, Google will see the blog post but credit the page URL as the original, correct? In turn, not dinging me for duplicate content. 2. Assuming the above is correct, should I write the blog post and have the static page reference the blog, or keep it the way I have it: a static page, with the blog post using a canonical reference back to the page? Any input would be greatly appreciated.
Technical SEO | tgr0ss0
-
How does your crawler treat ajax links?
Hello! It looks like the SEOmoz crawler (and Google) follows AJAX links. Is this normal behavior? We have implemented the canonical element and that seems to resolve most of the duplicate content issues. Anything else we can do? Example: Krom
Technical SEO | AJPro0
-
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). When we drilled down, we noticed that these come from disabled sub-domains like m.jump.co.za.

In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains. In December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/.

The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to the robots.txt, and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:

11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378

I am now thinking of cleaning the robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages?
Technical SEO | JacoRoux0
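One detail worth double-checking before that cleanup: while a directory is blocked in robots.txt, Googlebot cannot re-crawl those URLs, so it never sees the 301s and the stale link counts can linger. A small sketch with Python's stdlib `urllib.robotparser` illustrates the before/after; the rule sets shown are assumed examples, not the site's actual robots.txt.

```python
from urllib.robotparser import RobotFileParser

# Assumed current robots.txt on the disabled subdomain: directory blocked.
blocked_rules = [
    "User-agent: *",
    "Disallow: /search/",
]
rp_blocked = RobotFileParser()
rp_blocked.parse(blocked_rules)

# While the path is disallowed, Googlebot cannot re-crawl it, so it never
# sees the 301 redirect and the stale link counts linger in GWMT.
crawlable_while_blocked = rp_blocked.can_fetch(
    "Googlebot", "http://m.jump.co.za/search/ipod/")

# After cleaning robots.txt (nothing disallowed), the redirects can be
# crawled again and the links can eventually drop out of the index.
open_rules = ["User-agent: *", "Disallow:"]
rp_open = RobotFileParser()
rp_open.parse(open_rules)
crawlable_after_cleanup = rp_open.can_fetch(
    "Googlebot", "http://m.jump.co.za/search/ipod/")
```

This is consistent with the instinct in the question: unblocking the directories so Google can see the 301s (or 404s) is usually what lets the counts decay.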