Anchor Text Diversification – Branded vs. Non-Branded – What is the best approach, if any?
-
Our organization competes in the Drug & Alcohol Treatment category – a very competitive one, I must say.

While we create content for long-tail keywords, we focus on link building (blogging, press releases, acquisition, etc.) as the main strategy to increase relevancy for 4 major keywords: Alcohol Rehab, Drug Rehab, Alcohol Treatment, and Drug Treatment. Each of these terms has its own landing page, and we try to maintain a good flow of new links coming to these pages on a weekly basis. Lately we have been acquiring more links than we anticipated – not a bad thing, since they are from reputable websites – however, I am a bit concerned about the anchor text distribution of these links.
Example
Let’s say I get 100 links to my ‘Alcohol Rehab’ page – what is an appropriate percentage for the anchor text distribution?
For example:
Branded links: 20 – anchor: "St Jude Retreats"
Exact-match links: 70 – anchor: "Alcohol Rehab"
Broad links: 10 – anchor: "Rehab"

Is this an OK distribution, or should I change things around?
Hope you guys can help!
Thanks!!!!
-
Thanks Donnie...
-
I've read a few case studies, and I found that it is best to link to your brand name 70% of the time, use keyword variations for another 20%, and use the [exact] keyword for the last 10%; that way you will be safe.
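Whatever target ratio you settle on, it helps to be able to audit your actual distribution. Below is a minimal sketch of that audit, assuming you can export your backlink anchors to a plain list; the brand and keyword strings are just placeholders taken from the example above, not a recommendation:

```python
from collections import Counter

def classify_anchor(anchor, brand="st jude", exact="alcohol rehab"):
    """Bucket one anchor text as branded, exact match, or other (broad)."""
    text = anchor.lower()
    if brand in text:
        return "branded"
    if text == exact:
        return "exact"
    return "other"

def anchor_distribution(anchors, **kwargs):
    """Return each bucket's share of all anchors as a percentage."""
    counts = Counter(classify_anchor(a, **kwargs) for a in anchors)
    return {bucket: round(100 * n / len(anchors), 1) for bucket, n in counts.items()}

# The asker's example: 100 links split 20/70/10.
anchors = ["St Jude Retreats"] * 20 + ["Alcohol Rehab"] * 70 + ["Rehab"] * 10
print(anchor_distribution(anchors))
# {'branded': 20.0, 'exact': 70.0, 'other': 10.0}
```

Run against a real anchor export, this makes it easy to compare your current split to whichever branded/exact/broad ratio you are aiming for.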
Related Questions
-
We are redirecting the http and non-www versions of our website. Should all versions – http (www and non-www) and https (non-www) – each have just one redirect to the https www version, so that all forms of the website point to one version?
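The single-hop idea in this question can be sketched as a small function: every non-canonical variant maps directly to the https www version in one step, never chaining http → https → www. This is just an illustration of the mapping logic, with a placeholder domain, not a server config:

```python
from urllib.parse import urlsplit, urlunsplit

CANONICAL_SCHEME = "https"
CANONICAL_HOST = "www.example.com"  # placeholder domain

def canonical_redirect(url):
    """Return the single-hop 301 target for any URL variant,
    or None if the URL is already canonical (no redirect needed)."""
    parts = urlsplit(url)
    if parts.scheme == CANONICAL_SCHEME and parts.netloc.lower() == CANONICAL_HOST:
        return None
    # Redirect straight to the canonical version in one hop.
    return urlunsplit((CANONICAL_SCHEME, CANONICAL_HOST,
                       parts.path, parts.query, parts.fragment))

for variant in ("http://example.com/page",
                "http://www.example.com/page",
                "https://example.com/page"):
    print(variant, "->", canonical_redirect(variant))
```

Each of the three variants prints the same target, `https://www.example.com/page`, which is the behavior the question describes.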
Intermediate & Advanced SEO | Caffeine_Marketing
-
Content Rendering by Googlebot vs. Visitor
Hi Moz! After a different question on here, I tried Fetch as Google to see the difference between bot & user – to see if Google finds the written content on my page. The 2 versions are quite different, with Googlebot not even rendering product listings or content; it just seems to get the info in the top navigation. Guessing this is a massive issue? Help! Becky
Intermediate & Advanced SEO | BeckyKey
-
How best to deindex tens of thousands of pages?
Hi there, We run a quotes-based site and so have hundreds of thousands of pages. We released a batch of pages (around 2,500) and they ranked really well. Encouraged by this, we released the remaining ~300,000 pages in just a couple of days. These have been indexed but are not ranking anywhere. We presume this is because we released too much too quickly, so we want to roll back what we've done and release them in smaller batches. So I wondered:

1. Can we de-index thousands of pages, and if so, what's the best way of doing this?
2. Can we then re-index these pages over a much greater time period without changing them at all, or would we need to change the pages/URLs etc.?

Thanks! Steve
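The staged re-release part of this question is easy to script: split the full URL list into fixed-size batches and release one per week. A minimal sketch, where the batch size and the `/quote/` URL pattern are just illustrative assumptions:

```python
def weekly_batches(urls, batch_size=2500):
    """Split a URL list into fixed-size batches for staged release
    (e.g. one batch added to the sitemap per week)."""
    return [urls[i:i + batch_size] for i in range(0, len(urls), batch_size)]

# ~300,000 hypothetical quote pages, released 2,500 at a time.
pages = [f"/quote/{n}" for n in range(300_000)]
batches = weekly_batches(pages)
print(len(batches), "batches of up to 2500 pages")
```

At 2,500 pages per week this works out to 120 batches, which shows how long a gradual rollout of the full set would actually take.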
Intermediate & Advanced SEO | SteveW1987
-
URL Change Best Practice
I'm changing the url of some old pages to see if I can't get a little more organic out of them. After changing the url, and maybe title/desc tags as well, I plan to have Google fetch them. How does Google know that the old url is 301'd to the new url and the new url is not just a page of duplicate content? Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
Anchor text is a URL, but not the one I want?
I am currently getting links to my site where the anchor text is, say, domainx.com. However, my website's URL is domainy.com – the anchor text is the URL of another website. Is this hurting my website in the SERPs, or making my link building look unnatural? The link is on the blogroll of another website, so it has produced hundreds, maybe thousands, of copies of this anchor text link. Does that have any impact on my ranking as well?
Intermediate & Advanced SEO | redfishking
-
Removing Content 301 vs 410 question
Hello, I was hoping to get the SEOmoz community's advice on how to remove content most effectively from a large website.

I just read a very thought-provoking thread in which Dr. Pete and Kerry22 answered a question about how to cut content in order to recover from Panda (http://www.seomoz.org/q/panda-recovery-what-is-the-best-way-to-shrink-your-index-and-make-google-aware). Kerry22 mentioned a process in which 410s would be totally visible to Googlebot so that it would easily recognize the removal of content. The conversation implied that it is not just important to remove the content, but also to give Google the ability to recrawl that content to indeed confirm the content was removed (as opposed to just recrawling the site and not finding the content anywhere).

This really made lots of sense to me and also struck a personal chord. Our website was hit by a later Panda refresh back in March 2012, and ever since then we have been aggressive about cutting content and doing what we can to improve user experience. When we cut pages, though, we used a different approach, doing all of the below steps:
Intermediate & Advanced SEO | Eric_R
1. We cut the pages
2. We set up permanent 301 redirects for all of them immediately.
3. And at the same time, we would always remove from our site all links pointing to these pages (to make sure users didn't stumble upon the removed pages).

When we cut the content pages, we would either delete them or unpublish them, causing them to 404 or 401, but this is probably a moot point since we gave them 301 redirects every time anyway. We thought we could signal to Google that we removed the content while avoiding generating lots of errors that way.

I see that this is basically the exact opposite of Dr. Pete's advice, and the opposite of what Kerry22 used in order to get a recovery, and meanwhile here we are still trying to help our site recover. We've been feeling that our site should no longer be under the shadow of Panda. So here is what I'm wondering, and I'd be very appreciative of advice or answers for the following questions:

1. Is it possible that Google still thinks we have this content on our site, and we continue to suffer from Panda because of this? Could there be a residual taint caused by the way we removed it, or is it all water under the bridge at this point because Google would have figured out we removed it (albeit not in a preferred way)?

2. If there's a possibility our former cutting process has caused lasting issues and affected how Google sees us, what can we do now (if anything) to correct the damage we did?

Thank you in advance for your help,
Eric
-
301 vs Changing Link href
We have changed our company and want to 301 the old domain to the new domain in order to transfer the benefits of its backlinks (DA: 50, 115 linking root domains). I have the ability to modify around 50% of the backlinks. So my question is: instead of redirecting all the links, should I update that 50% to link to the new domain directly rather than relying on redirects? Would this possibly trip an algorithmic filter and devalue these links? Or should I just do a 301 and not worry about modifying the links?
Intermediate & Advanced SEO | Choice
-
Recommendation to fix Google backlink anchor text over optimisation filter penalty (auto)
Hi guys, Some of you may have seen a previous question I posted regarding a new client I started working with. Essentially, the client's website steadily lost all non-domain-name keyword rankings over a period of 4-12 weeks, despite content changes and various other improvements. See the following: http://www.seomoz.org/q/shouldn-t-google-always-rank-a-website-for-its-own-unique-exact-10-word-content-such-as-a-whole-sentence

After further hair pulling and digging around, I realised that the backlink anchor text distribution was unnatural for its homepage/root. From OSE, only about 55 of the 700 links' anchor texts contain the client's domain or company name – 8%. The distribution of the non-domain keywords isn't too bad (the most repeated keyword has 142 links out of the 700). This is a result of the client submitting to directories over the last 3 years and just throwing in targeted keywords.

Is my assumption that it is this penalty/filter correct? If it is, I guess the lesson is that domain name anchor texts should make up more of your links?

MY QUESTION: What are some effective ways I can potentially remove this filter and get the client ranking on its homepage again? Ensure all new links contain the company name?
Intermediate & Advanced SEO | Qasim_IMG
Google said there was no manual penalty, so I'm not sure if there's any point submitting another reconsideration request. Any advice or effective experiences where a fix has worked would be greatly appreciated! Also, if we assume the company is "www.Bluewidget.com", what would be the best way to link most naturally?

Bluewidget
Blue widget
Blue widget .com
www.bluewidget.com
http://www.bluewidget.com ...etc.

I'm guessing a mix of the above, but if anyone could suggest a hierarchy, that would be great.
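One way to turn a hierarchy like the one asked about into concrete numbers is to give each variant a weight and derive whole-number link counts from it. A sketch of that arithmetic, where the weights are purely illustrative and not a known-safe ratio:

```python
def target_counts(total_links, weights):
    """Turn per-variant weights into whole-number link counts summing to total_links."""
    total_weight = sum(weights.values())
    counts = {v: int(total_links * w / total_weight) for v, w in weights.items()}
    # Give any rounding remainder to the most heavily weighted variant.
    remainder = total_links - sum(counts.values())
    counts[max(weights, key=weights.get)] += remainder
    return counts

# Hypothetical hierarchy for the "Bluewidget" example above.
weights = {"Bluewidget": 4, "www.bluewidget.com": 3,
           "Blue widget": 2, "http://www.bluewidget.com": 1}
print(target_counts(100, weights))
```

For 100 links, the weights above yield 40/30/20/10 across the four variants; changing the weights lets you experiment with any other hierarchy while always accounting for every link.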