IP ranges and matching WHOIS
-
I have a client who owns two large websites that sit in the same IP range: XXX.XX.11.124 and XXX.XX.11.126 (obviously the X's match). Furthermore, the WHOIS is the same.
Website A is really old and has millions of pages, but it has some unintentional but spammy subdomains that are duplicate content... we are talking thousands. At this point, addressing those subdomains is not an option. Website A also links millions of times to Website B, but those links are either nofollowed or in JavaScript.
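For context, a minimal sketch of the two link patterns being described (the URLs here are hypothetical): a nofollowed anchor is crawlable but flagged as not endorsed, while a JavaScript-triggered navigation has no href and is often not treated as a link at all:

```html
<!-- Nofollowed link: crawlers see it, but it is marked as not endorsed -->
<a href="https://www.website-b.example/jobs" rel="nofollow">Jobs on Site B</a>

<!-- JavaScript navigation: no href attribute, so classic crawlers see no link -->
<span onclick="window.location='https://www.website-b.example/jobs'">Jobs on Site B</span>
```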
Website A has a ton of duplicate content through job feeds.
Now, Website B is about 4-5 years old and has a great link graph, but it was hit hard by Panda; we recovered, only to dive again. My question is this: Website B is very close to having near-perfect on-page SEO, hierarchy, etc. Could Website A be affecting it in any way?
If we assume that Website A is impacting B, what is the safest solution... change servers and IP addresses? Change hosts entirely? Do I need to worry about WHOIS?
Thank you!
-
I think what you're asking, at a deeper level, is whether the really crappy SEO on Site A can affect Site B through some form of administrative relationship. Is this correct? If there is a lot of questionable linking between the sites, you can see some negative effects. (If there is no linking relationship between the sites, then the answer is almost always no.)
I agree with Alan completely that changing servers, IP addresses, and all that doesn't change the fact that those links still point to your website. Changing the host or other administrative associations of any of these sites would likely have zero impact on any penalties.
First, you want to determine whether Site A really is the problem. Is it Panda? Is it Penguin? Make sure to match the dates of your traffic fluctuations against historical algorithm changes: http://www.seomoz.org/google-algorithm-change
We've seen a lot of site-wide cross-linking with over-optimized anchor text as a key root of many recent Penguin penalties.
All of these penalties have a lot of factors that could be the cause, so I'd make sure to look everywhere, including the links between the sites.
-
If Site A is a cash cow that does not need SEO, then I would block the entire site from search engines via the robots.txt file. Even on separate hosts, all the links pointing to Site B are a big negative due to the sheer volume, given that there's likely a "bad rap" label associated with the SEO on Site A.
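For what it's worth, a site-wide block is only a two-line robots.txt served from Site A's root (a sketch; carve-outs for specific crawlers or paths are up to you):

```
User-agent: *
Disallow: /
```

One caveat: robots.txt stops crawling, but URLs that are already indexed can linger in results until they drop out; if guaranteed removal matters, a noindex directive on the pages themselves is the surer route.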
Duplicate content does not need a "same server" relationship to be a big problem either. All duplicate content is a problem regardless of location.
If a client I represent is doing things that I believe are impeding their success, I personally believe it's important to communicate my concern. However, if they choose to ignore that communication, that's their right to do so.
-
Honestly... Site A is a cash cow that does not need SEO. I would love to clean it up, but there are external forces preventing that at the moment. These forces are not technical.
Site B is in the same industry as A, but with a different model and user base. Both sites are pertinent.
Let me simplify this question....
1. Can duplicate content affect more than just the root domain and spread across to other websites on the same IP range? And WHOIS?
2. Moreover, does an SEO need to worry about what their clients are doing on sites that are out of their control (assuming no blackhat techniques)?
-
The question is this - why would you want to keep Site A, given the current insurmountable challenge you describe?
Do you still hope there's some value in keeping it alive - some SEO value, or traffic it does or will continue to bring that you believe to be valuable?
Because (and this is just my opinion) if you are convinced you cannot or will not (for whatever reason) work to clean up the mess, you'd be better off completely killing off Site A.
If you don't, then even if you migrate Site B to a different server, the links still exist. The footprint remains.
-
If you can't address them, can you create a sitemap or robots directive that doesn't count them?
https://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
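Along those lines, if the goal is to keep the spammy subdomains out of the index without touching their content, one option (a sketch assuming Apache with mod_headers enabled, applied to those subdomains' vhosts or .htaccess) is an X-Robots-Tag response header:

```apache
# Hypothetical fragment for the subdomains in question:
# ask crawlers not to index or follow anything served from here.
<IfModule mod_headers.c>
    Header set X-Robots-Tag "noindex, nofollow"
</IfModule>
```

Unlike a robots.txt block, the pages must remain crawlable for the header to be seen, so don't combine this with a Disallow on the same URLs.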