Linking C blocks strategy - Which hat is this tactic?
-
This relates to a previous question I had about satellite sites. I questioned the white-hativity of their strategy. Basically, to increase the number of linking C blocks, they created 100+ websites on different C blocks that link back to our main domain. The issues I see are:
- the sites are 98% exactly the same in appearance and content; only a small paragraph on the homepage is different
- the sites only have outbound links to our main domain, and no inbound links of their own
Is this legit? I am not an SEO expert, but I have received awesome advice here. So thank you in advance!
-
Thank you Robert! Let me try your suggestions and then I will report back.
-
C3,
One of the things I would suggest is to start by having success defined using KPIs, analytics, etc. Did your engagement spell out what they were to accomplish, and so forth? Get a baseline of where the site was prior to the new company coming on board. When did the changes take place (were they annotated in GA on the dates they occurred)? What has the result been since then? What else was done during that period? Now you have a starting point.
Next, I would suggest you get the lower-cost Ahrefs membership (even if only for a month) and run your site through it. You will have a near-complete list of links to the site. Where are the 100 microsites within this? How do they compare to the other links coming to the site? Also, look at the microsites and see if your site is the only one being linked to. Remember, if a microsite links to you and to someone else, it gave half the value of that link away.
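To make that microsite check concrete, here is a rough Python sketch (my own illustration, not the provider's tooling): it fetches each satellite homepage and lists the external domains it links to, so you can see whether your domain is the only target. The microsites.txt file and the MAIN_DOMAIN value are placeholders you would fill in.

# Sketch only: list each satellite site's external link targets.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urlparse

MAIN_DOMAIN = "example.com"  # placeholder: your main domain
microsites = [line.strip() for line in open("microsites.txt") if line.strip()]

for site in microsites:
    try:
        html = requests.get("http://" + site, timeout=10).text
    except requests.RequestException as exc:
        print(site, "fetch failed:", exc)
        continue
    targets = set()
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        host = urlparse(a["href"]).netloc.lower().removeprefix("www.")
        if host and host != site:
            targets.add(host)
    print(site, "->", len(targets), "external domain(s); only links to you:", targets == {MAIN_DOMAIN})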
If this was the key strategy, when was it implemented and what has changed since then? Remember that data is your friend. With our clients we are careful to get a baseline, talk about the issues they are facing, delineate potential risks, etc. With these sites, run them through Copyscape and see if even the "unique" content is actually unique. Did you pay for unique content?
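As a rough, free complement to a Copyscape run, something like this sketch will flag near-duplicate copy across the satellite sites. It assumes you have already saved each homepage's visible text into a homepages/ folder (one .txt file per site), and the 90% threshold is an arbitrary judgment call, not a Copyscape metric.

from difflib import SequenceMatcher
from itertools import combinations
from pathlib import Path

# One .txt file per satellite homepage, e.g. homepages/site1.txt (assumed layout).
pages = {p.stem: p.read_text(encoding="utf-8") for p in Path("homepages").glob("*.txt")}

for (a, text_a), (b, text_b) in combinations(sorted(pages.items()), 2):
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    if ratio > 0.9:  # flag pages that are more than 90% identical
        print(f"{a} vs {b}: {ratio:.0%} similar")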
Next, I would run the site through a Moz campaign and see what I see. I would look at GWMT to see whether the linking sites are showing up there, and how many new pages have been indexed since. If someone is saying this linking strategy is key while you have duplicate meta descriptions, duplicate title tags, no H1s, etc. (run the site through Xenu and you will have all of that and more), you can find a dozen places where SEOs say that if you do not do the on-page work, there is no reason to do the rest.
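Purely as an illustration of what that crawl surfaces (not Xenu's or Moz's actual output), a bare-bones version of the on-page check could look like this; urls.txt is a hypothetical file with one page URL per line.

from collections import Counter
import requests
from bs4 import BeautifulSoup

rows = []
for url in (u.strip() for u in open("urls.txt") if u.strip()):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    desc = (meta.get("content") or "").strip() if meta else ""
    rows.append((url, title, desc, soup.find("h1") is not None))

# Count how often each title and description is reused across the crawl.
title_counts = Counter(r[1] for r in rows)
desc_counts = Counter(r[2] for r in rows)
for url, title, desc, has_h1 in rows:
    issues = []
    if not title or title_counts[title] > 1:
        issues.append("missing/duplicate title")
    if not desc or desc_counts[desc] > 1:
        issues.append("missing/duplicate meta description")
    if not has_h1:
        issues.append("no H1")
    if issues:
        print(url, "->", ", ".join(issues))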
So, the data will be your friend if you want to show whether or not this is working. Hey, if it is working, let us know how, and maybe we will all say they are right and I was wrong.
Best,
Robert
-
Don't worry about any "major damage to our domain authority". Those sites/links, as you described them, aren't helping at all, and in light of a potential penalty you're better off removing them.
-
Hi Robert,
I appreciate you getting involved! According to our SEO provider, this tactic is a major part of their strategy and the reason for the site's success. I asked them to disable the sites, and they said we would for sure see "major damage to our domain authority".
The other issue is that they don't actually spend any time on these sites. They haven't been updated or touched in 7 months. The blog posts and the single "unique" paragraph per site have remained the same. In fact, the blog posts are exactly the same on all sites, basically scraped. Yet they bill us for these sites because they are supposedly required for our SEO success.
My challenge has been trying to question their strategy when I am not an expert and they are supposed to be. And yes, they speak as if this tactic is unicorn dust.
-
If you haven't done any link building to those sites, they are pretty much worthless. Google knows about this strategy and, best-case scenario, ignores them. DA is irrelevant to rankings; I can show you many sites with amazing DA but shit rankings because they are penalized or have crappy links.
Opportunity cost: 100 domains @ $10/yr + 100 IPs @ $20/yr = $3k in yearly savings. You can easily put that money to better use.
-
Heh, heh. Rings a bell, doesn't it, Robert?
I'd de-link, stat, before Google banishes my site and ignores my reconsideration requests.
-
C3
You have some good responses, but this is another of those where it is hard to sit on the sidelines. I have to ask a few different questions in a situation like this. First, forget what they did re the C blocks: what was the desired result they were seeking? What was the plan (with rationale) to achieve that result? And, no matter the answer to any of that, what percentage of optimization/ranking do they or their client believe is related to linking?
So, do they really spend this much effort on a 20 to 30% factor? And remember, this is not effort spent bringing in quality links; it is effort spent on linking as if it were the Holy Grail of SEO. Given the time spent, the opportunity cost, the actual cost to the client, etc., is this 80%-plus of the SEO effort? I would be surprised if it wasn't. Usually when I come across this kind of thing, the "SEO" firm doing it is selling it as some sort of silver-bullet SEO: they have discovered a secret way to sprinkle unicorn dust on the algorithm, etc.
To me, and in my opinion, it is not white hat, grey hat, or black hat with sequins. It is just a waste of time and energy; it is just highly inefficient. Are they saying they can do more with this strategy than, say, the people on this forum with an actual strategy? If you are worrying about whether linking via multiple C blocks from EMDs you own provides some sort of benefit to some site, I think you are looking at SEO from a very odd perspective (not you specifically; I am using the global "you" for anyone who does this). Interesting approach.
Best
-
C3,
Let's see... if those sites have no inbound links, what value are they to the main domain? If they have no inbound links, how is Google going to find them? If you submit the URLs to Google, Google will see 100 new sites that were all registered at the same time (and maybe to the same owner), all with the same content, and all with links only to your site.
This attempt at manipulation is very easy for Google to recognize, and you're putting your main site in jeopardy by following this tactic.
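Purely to illustrate that footprint (this is a sketch of the pattern, not a claim about how Google detects it): resolve each satellite domain to an IP and bucket by C block. Here microsites.txt is a hypothetical file listing one domain per line; registration dates and ownership you would check separately via whois.

import socket
from collections import defaultdict

domains = [line.strip() for line in open("microsites.txt") if line.strip()]
by_cblock = defaultdict(list)

for domain in domains:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue  # skip domains that don't resolve
    by_cblock[".".join(ip.split(".")[:3])].append(domain)

print(len(domains), "domains resolve into", len(by_cblock), "distinct C blocks")
for block, sites in sorted(by_cblock.items()):
    if len(sites) > 1:
        print(block, "->", ", ".join(sites))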
-
Sorry, I just re-read my response. I wasn't trying to be condescending with the first line. I was actually trying to clarify who initiated the tactic. Thanks!
-
SEJunkie,
To clarify, the SEO provider did this. But yes, 100+ direct-match URLs, all on different C block IPs, with mostly the same content. Navigational links from these sites link to sections of our main site, e.g. "Electronics" on a satellite site links to "Electronics" on our main site.
There is a paragraph on each homepage, below the fold, that is unique to each site, but that is the only differing piece of content. The rest of the content is exactly the same, including the blog posts.
-
Hi Eric,
Just to clarify, you have purchased 100+ domain names and created 100+ near-duplicate websites, hosted on 100+ different C block IPs? I would lean more toward thinking it's a little bit on the black-hat side of the fence. With no backlinks, these sites are passing no Domain Authority to your site. They may still, however, be passing some link juice. You need to test the effectiveness of the links in order to decide whether to keep them or remove them. If you find the links are passing some value, I wouldn't remove them. I would suggest developing them into something more over time. You don't need to regularly update these sites; just develop something decent as a content centerpiece and move on to the next, and before you know it you'll have your own network.
-
Oleg,
So what's the best course of action? Building strong content for each of these sites (100+) would be an enormous task, but disabling them would kill the number of linking domains, which I assume would lower our DA in a hurry.
We actually didn't ask for or want the sites, because we don't have the resources to develop content for so many of them. The SEO insisted and put the sites up for "free" as part of their strategy. Yet they haven't developed any new content for these sites in over 7 months.
Seems like it was a mistake from the beginning to do this.
Thanks,
Eric
-
This used to work; now it's a waste of time that will most likely get you penalized.
You are better off using that time and those resources to develop a strong piece of content and build links to it from authoritative sites.
Cheers,
Oleg
-
Related Questions
-
Changing Links to Spans with Robots.txt Blocked Redirects using Linkify/jQuery
Hi, I was recently penalized, most likely because Google started following JavaScript links to bad neighborhoods that were not nofollowed. The first thing I did was remove the Linkify plugin from my site so that all those links would disappear, but now I think I have a solution that works with Linkify without creating crawlable links. I did the following:
1. I blocked access to the Linkify scripts using robots.txt so that Google won't execute the scripts that create the links. This has worked for me in the past with banner ads linking to other sites of mine; at least it appears to work, because those sites did not get links from pages running those banners in Search Console.
2. I created a /redirect/ directory that redirects all offsite URLs, and put a robots.txt block on this directory.
3. I configured the Linkify plugin to parse URLs into span elements instead of a elements and to add nofollow attributes. They still have an href attribute, but the URLs in the href now point to the redirect directory, and the span's onclick event redirects the user.
I have implemented this solution on another site of mine, and I am hoping this will make it impossible for Google to categorize my pages as linking to any neighborhoods, good or bad. Most of the content is UGC, so this should discourage link spam while giving users clickable URLs and still letting people post complaints about people that have profiles on adult websites. Here is a page where the solution has been implemented: https://cyberbullyingreport.com/bully/predators-watch-owner-scott-breitenstein-of-dayton-ohio-5463.aspx, the Linkify plugin can be found at https://soapbox.github.io/linkifyjs/, and the custom jQuery is as follows:
jQuery(document).ready(function ($) {
    $('p').linkify({
        tagName: 'span',
        attributes: { rel: 'nofollow' },
        formatHref: function (href) {
            href = 'https://cyberbullyingreport.com/redirect/?url=' + href;
            return href;
        },
        events: {
            click: function (e) {
                var href = $(this).attr('href');
                window.location.href = href;
            }
        }
    });
});
White Hat / Black Hat SEO | STDCarriers
-
Should You Link Back from Client's Website?
We had a discussion in the office today about whether it can help or hurt you to link back to your site from one that you optimize, host, or manage. A few ideas that were mentioned:
HURT:
1. The website is not directly related to your niche, therefore Google will treat it as a link exchange or a spammy link.
2. Links back to you are often not surrounded by related text about your services, and look out of place to users and search engines.
HELP:
1. On good (higher PR, reputable) domains, a link back can add authority, even if the site is not directly related to your services.
2. It allows high-ranking sites to show users who the provider is, potentially creating a new client, and a followed incoming link on anchor text you can choose.
So, what do you think? Test results would be appreciated, as we are trying to get real data. Benefits and cons if you have an opinion.
White Hat / Black Hat SEO | David-Kley
-
Massive site-wide internal footer links to doorway pages: how bad is this?
My company has stuffed several hundred links into the footer of every page. Well, technically not the footer, as they're right at the end of the body tag, but basically the same thing. They are formatted as follows:
<a href="http://example.com/springfield_pa_real_estate.htm" target="_blank">springfield, pa real estate</a>
These direct to individual pages that contain the same few images and variations of the following text that just swap the town and state:
Springfield, PA Real Estate - Springfield County [images] This page features links to help you Find Listings and Homes for sale in the Springfield area MLS, Springfield Real Estate Agents, and Springfield home values. Our free real estate services feature all Springfield and Springfield suburban areas. We also have information on Springfield home selling, Springfield home buying, financing and mortgages, insurance and other realty services for anyone looking to sell a home or buy a home in Springfield. And if you are relocating to Springfield or want Springfield relocation information we can help with our Relocation Network.
The bolded text links to our internal site pages for buying, selling, relocation, etc. Like I said, this is repeated several hundred times, on every single page on our site. In our XML sitemap file, there are links to:
http://www.example.com/Real_Estate/City/Springfield/
http://www.example.com/Real_Estate/City/Springfield/Homes/
http://www.example.com/Real_Estate/City/Springfield/Townhomes/
These direct to separate pages with a Google map result for properties for sale in Springfield, accompanied by a boilerplate version of this:
Find Springfield Pennsylvania Real Estate for sale on www.example.com - your complete source for all Springfield Pennsylvania real estate. Using www.example.com, you can search the entire local Multiple Listing Service (MLS) for up to date Springfield Pennsylvania real estate for sale that may not be available elsewhere. This includes every Springfield Pennsylvania property that's currently for sale and listed on our local MLS. Example Company is a fully licensed Springfield Pennsylvania real estate provider.
Google Webmaster Tools is reporting that some of these pages have over 30,000 internal links on our site. However, GWT isn't reporting any manual actions that need to be addressed. How blatantly abusive and spammy is this? At best, Google doesn't care a spit about it, but the worst case is that it is actively harming our SERP rankings. What's the best way to go about dealing with this? The site did have Analytics running, but the company lost the account information years ago, otherwise I'd check the numbers to see if we were ever hit by Panda/Penguin. I just got a new Analytics account implemented 2 weeks ago. Of course it's still using deprecated object values, so I don't even know how accurate it is. Thanks everyone!
White Hat / Black Hat SEO | BD69
-
Cutting off the bad link juice
Hello, I have noticed that there are plenty of old low-quality links pointing to many of the landing pages. I would like to cut them off and start again. Would it be OK to do the following?
1. Create new URLs (the domain is quite strong and new pages are ranking well, better than the affected old landing pages) and add the old content there.
2. 302 redirect the old landing pages to the new ones.
3. Put a "noindex" tag on the old URLs (maybe even "noindex, nofollow"? or wouldn't that work?).
Thanks in advance
White Hat / Black Hat SEO | ThinkingJuice
-
Negative SEO impacting client rankings - How to combat negative linking?
I have a client which have been losing rankings for the key term "sell gold" in Google AU. However, while doing some investigating I realized that we have been receiving links from bad neighborhoods such as porn, bogus .edu sites as well as some pharmaceutical sites. We have identified this as negative SEO and have moved forward to disavow the links in Google. However, I would like to know what other measures can be taken to combat this type of negative SEO linking? Any suggestions would be appreciated!
White Hat / Black Hat SEO | dancape
-
Suggestion for Link Directory Script?
I own a subscription to PHP Link Directory but was wondering if anyone could suggest an alternative link directory script/software/service to PHPLD. Thanks!
White Hat / Black Hat SEO | fergusonconsulting
-
Big Brands Still Paying For Links!
We have been spending a lot of time creating unique and relevant content that is helpful to users in order to garner natural links. However, I still see large companies getting paid links to their sites. They still rank despite the paid links - many higher than before, thanks to Google's increased brand/domain authority bias. I have seen a number of blogs with posts that have dofollow links to sites like Amazon and Dirt Devil. Are small businesses just getting buried, or am I being too cynical?
White Hat / Black Hat SEO | inhouseseo
-
Too many links with the same Anchor-text?
My first question at SEOmoz: recently my gambling site has been experiencing a subtle yo-yo effect for our most sought-after keyword. A month ago we legitimately added a PR 6 inbound link with that keyword (tragamonedas) from an institutional site of our own development. We are worried that Google might have regarded that move as an illegitimate link acquisition, since the apparent troubles with our keyword appear to have started right after that link was processed. Is it too late to change the anchor text, in case that action might deliver positive results? Also, we might have focused too much on the very same keyword in our link building campaign. Can constant repetition of the same anchor harm our indexing reputation? Thank you in advance and good SEO luck, Andi.
White Hat / Black Hat SEO | castano