How to take down a subdomain that is receiving many spammy backlinks?
-
Hi all,
We have a subdomain that has had low engagement for the last few years. Over time, many spammy backlinks have accumulated pointing to this subdomain, though there are relevant backlinks too. We have deleted most of the pages that contained spammy content or attracted spammy backlinks. Still, I'm unsure whether to take this subdomain down or keep it. I'm torn between "the relevant backlinks might be helping our website" and "the spammy backlinks are causing a drop in rankings."
Thanks
-
Hi vtmoz,
OK. You can upload a single file containing all the domains you want to disavow; you don't need to do it one by one. Checking thousands of links manually is certainly not something anyone wants to do.
Here's how you could do it: export all the linking domains from Webmaster Tools to a file, disavow them all, and then remove the couple of dozen domains you know are strong and valuable before uploading.
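The export-then-prune workflow above can be sketched in a few lines of Python. This is a minimal illustration, not an official tool: the function name, the sample domains, and the keep-list are all hypothetical, and it assumes the Webmaster Tools export has been reduced to one linking domain per line.

```python
# Build disavow-file entries from an exported list of linking domains,
# skipping the domains you have decided to keep.

def build_disavow(all_domains, keep):
    """Return disavow-file lines ("domain:example.com") for every
    exported domain that is not in the keep list."""
    keep_set = {d.strip().lower() for d in keep}
    lines = []
    for d in all_domains:
        d = d.strip().lower()
        if d and d not in keep_set:
            lines.append(f"domain:{d}")
    return lines

# Example: everything exported, minus the domains known to be valuable.
exported = ["spam1.example", "goodsite.example", "spam2.example"]
keep = ["goodsite.example"]

for line in build_disavow(exported, keep):
    print(line)
```

The resulting lines can be saved to a plain-text file and uploaded through the disavow tool; reviewing a short keep-list is far less work than reviewing thousands of individual links.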
Cheers,
Cesare
-
Hi Cesare,
But there are too many backlinks from different domains. How are we going to check thousands of links to decide which ones to disavow? I think that's too hard to be practical.
-
Hi vtmoz,
Simply disavow the links that are spammy: https://support.google.com/webmasters/answer/2648487?hl=en. That's it. By doing that, you tell Google which links not to take into account, and the "good" ones will still benefit your subdomain. There is no need to take the subdomain down.
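For reference, the file the disavow tool accepts is plain text (UTF-8, one entry per line): a `domain:` entry disavows every link from that domain, a bare URL disavows a single page, and lines starting with `#` are comments. A minimal example, with placeholder domains:

```text
# Spammy domains identified in the link audit
domain:spammydomain1.example
domain:spammydomain2.example
# A single spammy page rather than a whole domain
http://blog.spammydomain3.example/spammy-post.html
```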
Hope this helps.
Cheers,
Cesare