Complicated Question: Removing Spam Backlinks that were Not Requested
-
I'm new and seeking help with the following scenario:
1. Main site: domain.com, an established authority-type site
2. Second site: domain.org (robots.txt in place to keep it out of the index), but someone who is clearly not the site owner has run a negative SEO campaign against the .org domain and built spammy links to it. In fact, those are the only links that exist for this second domain, because it is currently used for development purposes only. No one would link to it normally, as it is just a secondary domain used to protect the trademark and for development. When searching for it by domain name, it does not appear on the first page of search results, and checking the link profile, the only links that show for domain.org are spam links.
Have contacted the site(s) where the spam links were placed (no answer).
The main site domain.com and domain.org have the same WHOIS details and are hosted on the same server, as they are owned by the same company.
The main site domain.com still appears first for its own name but has lost some rankings. I am working to fix some technical issues (e.g., duplicate URLs from the CMS), but would like to find out what to do about domain.org, which has clearly been targeted with spammy, unrequested backlinks.
domain.com has a Google Webmaster Tools account, with no messages about unnatural linking in those reports.
I'm not sure whether I should add domain.org to GWT to see if an unnatural-link penalty has been applied, or whether doing so would further connect the two domains through association.
If I could get some feedback/suggestions on my options for making sure that domain.org has a clean link profile, that would be most appreciated. The site owner would also like to begin using domain.org for some unique content in the future, but as it stands cannot, because the domain has been targeted with poor backlinks.
Has anyone else run into a situation where the .org or .net version of a domain was targeted by spammy backlinks even though it was not actively used?
What's the safest way to proceed? a) I'm concerned about a possible co-penalty between the main site domain.com and domain.org. b) How do I resolve the issues with domain.org so that the owner can use it in the future?
Many thanks for your thoughts and help with this one. I appreciate any help or feedback.
-
OK, another tricky situation. First of all, I would ask you to create a Google Webmaster Tools account with a new Gmail ID for domain.org (since you are worried about the association, though I can assure you there is no such thing: a penalized website will not incur a penalty for other websites).
So, you need to create a Google Webmaster Tools account to figure out whether the website has been penalized by the Penguin update. If you have received any message in Google Webmaster Tools regarding a link warning, you need to get all the spammy links removed before filing a reconsideration request. However, if this is an algorithmic penalty, you need to give the Disavow Links option a try.
-
Focus on the main domain and review the backlinks for that domain. If you identify any questionable links, use the Google Disavow tool to submit information about those links.
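For context on the disavow step: the file Google expects is a plain text file uploaded through the Disavow tool, one entry per line, where a `domain:` prefix disavows every link from that domain and lines starting with `#` are comments. A minimal sketch with placeholder domains (not real sites from this thread):

```text
# Spammy domains linking to us; removal requests sent, no reply received
domain:spam-directory-example.com
domain:link-farm-example.net
# A single spammy URL can also be listed on its own
http://spam-blog-example.org/page-with-link.html
```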
Check the .org to ensure that the robots.txt block is properly in place and that none of the pages are getting indexed by Google. Bear in mind that robots.txt only blocks crawling, not indexing, so a noindex directive on the pages themselves is the more reliable way to keep them out of the index.
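To make the crawling-vs-indexing distinction concrete, here is a small sketch using Python's standard-library `urllib.robotparser`, with an assumed robots.txt for the unused domain.org (a placeholder, not the asker's actual file):

```python
from urllib import robotparser

# Assumed robots.txt content for the unused domain.org (placeholder only)
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Every path is blocked from crawling for every user agent...
blocked = not rp.can_fetch("Googlebot", "https://domain.org/any-page")
print(blocked)  # True

# ...but "blocked from crawling" is not "removed from the index": Google
# can still index a blocked URL it discovers via external (spam) links.
# Deindexing requires a noindex directive or a URL removal request.
```

This is why a disallow-everything robots.txt does not, on its own, protect domain.org from showing a spam-shaped link profile or indexed URLs.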
-
I would write to Google explaining specifically why you are doing this and how you think those links came about. My understanding is that the system is set up so that if you're innocent (for example, a competitor did this to you), you will not be penalized.
The best of luck,
Tom