Does my "spam" site affect my other sites on the same IP?
-
I have a link directory called Liberty Resource Directory. It's the main site on my dedicated IP; all my other sites are addon domains on top of it.
While exploring the new Moz spam score I saw that LRD (Liberty Resource Directory) has a spam score of 9/17, and that Google penalizes 71% of sites with a similar score. Fair enough: thin content, a bunch of followed links (there are over 2,000 links by now), no problem. That site isn't for Google, it's for me.
Question: does that site (and linking to my own sites from it) negatively affect my other sites on the same IP? If so, by how much? Does a simple noindex fix those potential issues?
Bonus: how does one go through hundreds of pages with thousands of links, built in raw, plain-text HTML, to change things to nofollow? =/
-
@Tom Roberts, your thinking is about on the same page as mine. I've always been suspicious of "C-blocks" as a ranking signal too. I don't use a CMS for this site; as I said, it's all hand-coded. Does a nofollow tag in the head section have the same effect as a nofollow on individual links? At least PHP could solve that issue pretty easily.
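For context on the head-section question: a page-level `<meta name="robots" content="nofollow">` tells crawlers not to follow any link on that page, which covers all links at once but gives you no per-link control the way `rel="nofollow"` does. Since the site is hand-coded, a script can stamp the tag into every page. Here's a minimal sketch in Python (as an illustration rather than PHP); the `site` directory name is a hypothetical local copy, and it assumes each page contains a literal `<head>` tag:

```python
from pathlib import Path

SITE_DIR = Path("site")  # hypothetical local copy of the site
META = '<meta name="robots" content="nofollow">'

def add_robots_meta(html: str) -> str:
    """Insert a robots meta tag right after <head>, unless one already exists."""
    if 'name="robots"' in html:
        return html  # don't double-stamp pages that already have a robots tag
    return html.replace("<head>", "<head>\n" + META, 1)

if SITE_DIR.is_dir():
    for page in SITE_DIR.rglob("*.html"):
        page.write_text(add_robots_meta(page.read_text()), encoding="utf-8")
```

Swapping `content="nofollow"` for `content="noindex, nofollow"` would also handle the noindex suggestion in one pass.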
-
Hi Ethan
In theory - yes, it could. We know that Google looks at a domain's Class-C IP level (at least for now - since Google is a registrar it may extend to the full IP) when judging its quality. If a site is in a "bad neighbourhood" - i.e. sitting on an IP range with a number of 'spam' sites - then theoretically it could be affected, a kind of guilt by association.
However, in reality I have some doubts as to whether this would happen. The vast majority of the web uses shared hosting (particularly SMEs), so good sites are invariably going to be mixed in with 'bad' sites. And what's stopping me from deliberately making a bad site on your IP in order to 'poison' it? I'm nowhere near that evil, but someone might be.
What I'm getting at here is that it seems extremely unlikely that there is a manageable way to differentiate these sites efficiently - which leads me to believe that having your spam site on the same IP as some 'good' sites shouldn't make a difference.
What you can do to reduce this risk even further is make sure the 'spam' site doesn't link to any properties of yours that you want to protect, to nofollow those links if feasible (I'm not sure what CMS you're using, but WordPress has a few plugins that would do this in bulk) and, if the site doesn't need to be indexed at all, noindexing it would remove the risk almost completely.
Hope this helps!
-
If you don't need the site to rank in Google, you should noindex it just to clear up any potential issues (especially if the domains link together in any way).
- Bonus: How does one go about going through hundreds of pages with thousands of links, built with raw, plain text HTML to change things to nofollow? =/
Download the full site and open all the pages in Notepad++. Find & replace. Save, reupload.
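If find & replace in an editor feels too blunt, the same bulk edit can be scripted. Here's a minimal sketch, assuming a local copy of the downloaded site in a hypothetical `site` directory; the regex match for anchor tags is naive but usually fine for hand-written HTML (a real parser is safer for messy markup):

```python
import re
from pathlib import Path

SITE_DIR = Path("site")  # hypothetical local copy of the downloaded site

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that lacks a rel attribute."""
    def patch(match: re.Match) -> str:
        tag = match.group(0)
        if "rel=" in tag.lower():
            return tag  # leave anchors with an existing rel attribute alone
        return tag[:-1] + ' rel="nofollow">'
    # match opening anchor tags only; closing </a> tags are untouched
    return re.sub(r"<a\b[^>]*>", patch, html, flags=re.IGNORECASE)

if SITE_DIR.is_dir():
    for page in SITE_DIR.rglob("*.html"):
        page.write_text(add_nofollow(page.read_text()), encoding="utf-8")
```

Run it against a backup copy first, then reupload once the output looks right.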
-
Hi Ethan,
I'd say yes and no. Please watch the Matt Cutts video below on this exact issue.
https://www.youtube.com/watch?v=AsSwqo16C8s&noredirect=1
I hope it helps.
Thanks