Linking C blocks strategy - Which hat is this tactic?
-
This relates to a previous question I had about satellite sites. I questioned the white-hat-ness of their strategy. Basically, to increase the number of linking C blocks, they created 100+ websites on different C blocks that link back to our main domain. The issues I see are:
- the sites are 98% identical in appearance and content; only a small paragraph on the homepage differs
- the sites only have outbound links to our main domain, and no inbound links of their own
Is this legit? I am not an SEO expert, but I have received awesome advice here. So thank you in advance!
-
Thank you Robert! Let me try your suggestions and then I will report back.
-
C3,
One of the things I would suggest is to start by defining success using KPIs, analytics, etc. Did your engagement specify what they were to accomplish? Establish a baseline of where the site was before the new company came on board. When did the changes take place (were they annotated in GA on the dates they occurred)? What has the result been since then? What else was done during that period? Now you have a starting point.
Next, I would suggest you get the lower-cost Ahrefs membership (even if only for a month) and run your site through it. You will have a near-complete list of links to the site. Where do the 100 sites sit within it? How do they compare to the other links coming to the site? Also, look at the microsites and see if your site is the only one being linked to. Remember, if a page carries your link and another, half of the link's value is given away.
If this was the key strategy, when was it implemented and what has changed since then? Remember that data is your friend. With our clients we are careful to get a baseline, talk about the issues they are facing, delineate potential risks, etc. With these sites, run them through Copyscape and see if even the "unique" content is unique. Did you pay for unique content?
Next, I would run the site through a Moz campaign and see what I see. I would look at GWMT to check whether the linking sites show up there, and how many new pages have been indexed since. If someone is saying that this linking strategy is key while you have duplicate meta descriptions, duplicate title tags, no H1s, etc. (run the site through Xenu and you will have all of that and more), you can find a dozen places where SEOs say that if you do not do the on-page work, there is no reason to do the rest.
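As an aside, the duplicate title/meta check that Xenu automates can be sketched in a few lines. A minimal illustration on hypothetical crawl data (the URLs and tag values below are made up, not from any real site):

```python
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs by (title, meta description) and return any groups
    where more than one URL shares the same on-page elements."""
    groups = defaultdict(list)
    for url, (title, meta) in pages.items():
        groups[(title.strip().lower(), meta.strip().lower())].append(url)
    return {key: urls for key, urls in groups.items() if len(urls) > 1}

# Hypothetical crawl output: url -> (title tag, meta description)
pages = {
    "/electronics": ("Shop Online", "Great deals every day."),
    "/appliances":  ("Shop Online", "Great deals every day."),
    "/contact":     ("Contact Us",  "Get in touch with our team."),
}
print(find_duplicates(pages))  # flags /electronics and /appliances
```

If a report like this comes back non-empty, the on-page basics are not done, and per Robert's point the exotic linking work is moot.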
So, the data will be your friend if you want to show whether or not this is working. Hey, if it is, let us know how, and maybe we will all say they are right and I was wrong.
Best,
Robert
-
Don't worry about any "major damage to our domain authority". Those sites/links, as you described them, aren't helping at all, and in light of a potential penalty you're better off removing them.
-
Hi Robert,
I appreciate you getting involved! According to our SEO provider, this tactic is a major part of their strategy and the reason for the success of the site. I asked them to disable the sites, and they said we would certainly see "major damage to our domain authority".
The other issue is that they don't actually spend any time on these sites. They haven't been updated or touched in 7 months. The blog posts and the single "unique" paragraph per site have remained the same. In fact, the blog posts are exactly the same on all sites, basically scraped. Yet they bill us for these sites because they are supposedly required for our SEO success.
My challenge has been trying to question their strategy when I am not an expert and they are supposed to be. Yes, they speak as if this tactic is unicorn dust.
-
If you haven't done any link building to those sites, they are pretty much worthless. G knows about this strategy and, best-case scenario, ignores them. DA is irrelevant to rankings; I can show you many sites with amazing DA but terrible rankings because of penalized/crappy links.
Opportunity cost: 100 domains @ $10/yr + 100 ips @ $20/yr = $3k in yearly savings. You can easily put that money to better use.
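For what it's worth, the back-of-the-envelope math checks out, using the per-unit prices Oleg assumes above:

```python
domains = 100 * 10      # 100 domain registrations at ~$10/yr each
unique_ips = 100 * 20   # 100 unique-IP hosting slots at ~$20/yr each
total = domains + unique_ips
print(f"${total:,}/yr")  # prints "$3,000/yr"
```

That recurring spend could fund a decent amount of content or outreach instead.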
-
Heh, heh. Does ring a bell doesn't it Robert?
I'd de-link stat before Google banishes my site and ignores my reconsideration requests.
-
C3
You have some good responses, but this is another of those questions where it is hard to sit on the sidelines. I have to ask a few different questions in a situation like this. First, forget what they did re the C blocks: what was the desired result they were seeking? What was the plan (with rationale) to achieve that result? And, no matter the answers, what percentage of optimization/ranking do they or their client believe is related to linking?
So, do they really spend this much effort on a 20 to 30% factor? And remember, this is not effort spent bringing in quality links; it is effort spent on linking as if it were the Holy Grail of SEO. Given the time spent, the opportunity cost, the actual cost to the client, etc., is this 80%-plus of the SEO effort? I would be surprised if it weren't. Usually when I come across this kind of thing, the "SEO" firm doing it is selling it as some sort of silver-bullet SEO: they have discovered a secret way to sprinkle unicorn dust on the algorithm.
To me, and in my opinion, it is not white hat, grey hat, or black hat with sequins. It is just a waste of time and energy; it is just highly inefficient. Are they saying they can do more with this strategy than the people on this forum can with an actual strategy? If you are wondering whether linking via multiple C blocks from EMDs you own can somehow benefit a site, I think you are looking at SEO from a very odd perspective (not you specifically; I mean the global "you", anyone who would). Interesting approach.
Best
-
C3,
Let's see... if those sites have no inbound links, what value are they to the main domain? If they have no inbound links, how is Google going to find them? And if you submit the URLs to Google, Google will see 100 new sites that were all registered at the same time (maybe even to the same owner), all with the same content, and all with links pointing only to your site.
This attempt at manipulation is very easy for google to recognize and you're putting your main site in jeopardy by following this tactic.
-
Sorry, I just re-read my response. I wasn't trying to be condescending with the first line. I was actually trying to clarify who initiated the tactic. Thanks!
-
SEJunkie,
To clarify, the SEO provider did this. But yes: 100+ exact-match URLs, all on different C block IPs, with mostly the same content. Navigational links from these sites point to sections of our main site, e.g. "Electronics" on a satellite site links to "Electronics" on our main site.
There is a paragraph on each homepage, below the fold, that is unique to each site, but that is the only differing piece of content. The rest, including the blog posts, is exactly the same.
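One differing paragraph leaves pages near-identical under any shingling measure, which is roughly how duplicate-content detection works. A minimal sketch with made-up sample text (not the actual site copy):

```python
def shingles(text, k=3):
    """Break text into the set of overlapping k-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Similarity of two texts: shared shingles over total shingles."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

page_a = "We sell the best electronics online with fast free shipping to your door every day"
page_b = "We sell the best electronics online with fast free shipping to your door every week"
print(round(jaccard(page_a, page_b), 2))  # 0.86 -- near-duplicate despite the edit
```

With 98% of each homepage shared, the satellite sites would score far closer to 1.0 than these one-word-apart samples.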
-
Hi Eric,
Just to clarify: you have purchased 100+ domain names and created 100+ near-duplicate websites, hosted on 100+ different C block IPs? I would lean towards thinking it's a little bit on the black-hat side of the fence. With no backlinks, these sites are offering no Domain Authority to your site. They may, however, still be passing some link juice. You need to test the effectiveness of the links in order to decide whether to keep or remove them. If you find the links are passing some value, I wouldn't remove them; I suggest developing them into something more over time. You don't need to regularly update these sites, just develop something decent as a content centerpiece and move on to the next, and before you know it you'll have your own network.
-
Oleg,
So what's the best course of action? Building strong content for each of these sites (100+) would be an enormous task, but disabling them would kill the number of linking domains, which I assume would lower our DA in a hurry.
We actually didn't ask for or want the sites, because we don't have the resources to develop content for so many of them. The SEO provider insisted and put the sites up for "free" as part of their strategy. Yet they haven't developed any new content for these sites in over 7 months.
Seems like it was a mistake from the beginning to do this.
Thanks,
Eric
-
This used to work; now it's a waste of time that will most likely get you penalized.
You are better off using that time and those resources to develop a strong piece of content and building links to it from authoritative sites.
Cheers,
Oleg