Multiple Domains on 1 IP Address
-
We have multiple domains on the same C-block IP address. Our main site is an eCommerce site, and we have separate domains for each of the following: our company blog (and other niche blogs), a forum site, an articles site and a corporate site. They are all on the same server and hosted by the same web-hosting company.
They all have unique and different content. Speaking strictly from a technical standpoint, could this be hurting us? Can you recommend best practices for multiple domains like these: should they share an IP address or have separate ones?
Thank you!
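If it helps anyone reviewing a similar setup, a quick way to confirm which domains actually resolve to the same IP is a DNS lookup per domain (the domain names below are placeholders, not the poster's real sites):

dig +short shop.example.com
dig +short blog.example.com
dig +short forum.example.com

Each command prints that domain's A record; identical output means the sites share an IP.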
-
Sorry, I'm confused about the setup. Hosts routinely run multiple sites off of shared IPs, but each domain name resolves as itself. Users and search bots should never see that redirection at all and shouldn't be crawling the IPs. This isn't an SEO issue so much as a setup issue. Likewise, any rel=canonical tags on each site would be tied to that site's specific domain name.
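For illustration, a canonical tag on each site would reference that site's own domain name, something like this (URL reused from the example in the question below):

<link rel="canonical" href="http://www.domainname.com/product-page" />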
-
Hello Peter,
We have three sites hosted on the same server with the same IP address. For SEO reasons (to avoid duplicate content) we need to redirect the IP address to the site, but there are three different sites. If we use rel=canonical tags on the websites, those tags will be duplicated too, because the websites are mirrored as IP-address versions of the same pages, e.g. www.domainname.com/product-page and 23.34.45.99/product-page. What's the best way to solve these duplicate content issues in this case? Many thanks!
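A common way to handle this (a minimal sketch for Apache with mod_rewrite, reusing the example IP and domain from the question; adjust for the actual setup) is to 301-redirect any request that arrives under the bare IP to the canonical domain, so the IP URLs never serve as standalone pages:

RewriteEngine On
# Request came in on the raw IP rather than a hostname...
RewriteCond %{HTTP_HOST} ^23\.34\.45\.99$
# ...so send it permanently to the canonical domain, keeping the path.
RewriteRule ^(.*)$ http://www.domainname.com/$1 [R=301,L]

Since a bare-IP request can't tell the server which of the three sites was intended, it will normally hit the default virtual host; redirecting that traffic to that site's domain, and keeping rel=canonical on every page pointing at the domain version, is usually enough to clear up the duplicate URLs.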
-
I think that situation's a bit different - if you aren't interlinking and the sites are very different (your site vs. customer sites), there's no harm in shared hosting. If you share the IP and one site is hit with a severe penalty, there's a small chance of bleedover, but we don't even see that much these days. Now that we're running out of IPv4 addresses, shared IPs are a lot more common (by necessity).
-
I have something similar. I'm with HostGator on a level 5 VPS. It comes with four IP addresses, and I have about 15 sites, some mine, some customer sites, spread out over those addresses. There is very little interlinking between the sites, but I was concerned too. I have read that add-on sites are bad for SEO, but as long as you aren't building crappy sites and linking them to your main site, you should be fine.
-
I think @cgman and @Nakul are both right, to a point. Technically, it's fine. Google doesn't penalize shared IPs (they're fairly common). If you're cross-linking your sites, though, it's very likely Google will devalue those links. That tactic has just been abused too much, and a shared IP is a dead giveaway.
Now, is it worth splitting all these out to gain a little more link-juice? In most cases, probably not. Google knows you own the sites, and may devalue them anyway. Chances are, they've already been devalued a bit. So, I don't think it's worth hours and hours and thousands of dollars to give them all their own homes, in most cases (it is highly situational, though).
The only other potential problem is if one site were penalized - there have been cases where that impacted sites on the same IP, especially cross-linked sites. It's not common, and you may not be at any risk, but it's not unheard of. As @Nakul said, it's a risk calculation.
-
I am presuming all those domains link to each other, correct?
Are they regular or nofollow links? It boils down to how much authority you have on your main domain as well as on the other domains. If I were you, I would keep the main e-commerce website on one server and everything else, including the niche blogs, on a different server. It's not just an SEO consideration, but a security one as well.
Essentially, to answer your question: it may not be hurting you to have the niche blogs, a forum with user-generated content, the articles site and the corporate site on the same IP/server, but it could help you more if they were on a different server, possibly on different Class C IPs, so that the links from those sites carry more weight. Keep in mind that these links matter to you, and it's worth increasing their value by hosting the sites separately, because they are links your competition can never get. I would also consider making them nofollow, but that's just my view; I prefer lower risk. Again, it depends on your e-commerce website's link profile.
-
There is nothing wrong with having multiple sites/blogs on the same C-block IP address. However, if you're trying to use your blogs to link to your products to boost your rankings, you might want to consider other link-building techniques in addition. Building backlinks from sites on the same IP is okay, but you'll see greater benefit from links on sites hosted on other servers.
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. version over HTTP/2 three years ago. When we review log files, it is clear that, for the home page, GoogleBot continues to access it only over the HTTP/1.1 protocol. The robots file is correct (simply allowing all and referring to the https://www. sitemap). The sitemap references https://www. pages, including the homepage. The hosting provider has confirmed the server is correctly configured to support HTTP/2 and has provided evidence of HTTP/2 access working. 301 redirects are set up for the non-secure and non-www versions of the website, all pointing to the https://www. version. We are not using a CDN or proxy. GSC reports the home page as correctly indexed (with the https://www. version canonicalised), but it still shows the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go through HTTP/1.1 only, not HTTP/2. A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name), despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
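As a quick check (the hostname below is a placeholder), curl can confirm whether the server actually negotiates HTTP/2 for the homepage, independently of what the host says:

curl -sI --http2 -o /dev/null -w '%{http_version}\n' https://www.example.com/

If this prints 2, the server side is fine and the protocol choice is down to the crawler; if it prints 1.1, the HTTP/2 configuration is worth revisiting.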
Technical SEO | AKCAC
-
Redirect multiple domains to 1 domain or not?
Hi there, I have a client who has multiple domains that already have some PA and DA. The problem is that most of the websites have the same content and rank better on different keywords. I want to redirect all the websites to one domain because it's easier to manage and it removes any duplicate content. The question is: if I redirect domain X to domain Y, do domain X's rankings carry over and boost domain Y? Or is it better to keep domain X separate to generate more referral traffic to domain Y? Thanks in advance! Cheers
Technical SEO | Leaf-a-mark
-
Redirect typo domains
Hi, what's the "correct" way of redirecting typo domains? The DNS A record points to the same IP address as the correct domain name, and then 301 redirects for each typo domain are added in the .htaccess. Should subdomains on typo URLs still redirect to www, or should they redirect to the matching subdomain on the correct URL in case that subdomain exists?
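For reference, a minimal .htaccess sketch for that pattern (typo-domain.com and correct-domain.com are placeholders): it catches the typo domain and any of its subdomains and sends visitors to the same path on the correct domain.

RewriteEngine On
# Match the typo domain itself and any subdomain of it...
RewriteCond %{HTTP_HOST} (^|\.)typo-domain\.com$ [NC]
# ...and 301 everything to the equivalent path on the correct www host.
RewriteRule ^(.*)$ http://www.correct-domain.com/$1 [R=301,L]

This collapses every typo-domain subdomain onto www of the correct domain; if a matching subdomain exists on the correct domain and should be preserved, it needs its own rule instead.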
Technical SEO | kuchenchef
-
Robots.txt and Multiple Sitemaps
Hello, I have a hopefully simple question, but I wanted to ask to get a "second opinion" on what to do in this situation. I am working on a client's robots.txt and we have multiple sitemaps. Using Yoast, I have my sitemap_index.xml and I also have a sitemap-image.xml. I do submit them to Google and Bing by hand, but wanted to have them added to the robots.txt for insurance. So my question is: when multiple sitemaps are called out in a robots.txt file, does it matter if one is before the other? From my reading it looks like you can have multiple sitemaps called out, but I wasn't sure of the best practice when writing it up in the file. Example:
User-agent: *
Disallow:
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/
Sitemap: http://sitename.com/sitemap_index.xml
Sitemap: http://sitename.com/sitemap-image.xml
Thanks a ton for the feedback, I really appreciate it! :) J
Technical SEO | allstatetransmission
-
How to increase your Domain Authority
Hi guys, can someone please provide some pointers on how best to increase your Domain Authority? Thanks, Gareth
Technical SEO | GAZ09
-
How much will changing IP addresses impact SEO?
So my company is upgrading its Internet bandwidth. However, apparently the vendor has said that part of the upgrade will involve changing our IP address. I've found two links that indicate some care needs to be taken to make sure our SEO isn't harmed:
http://followmattcutts.com/2011/07/21/protect-your-seo-when-changing-ip-address-and-server/
http://www.v7n.com/forums/google-forum/275513-changing-ip-affect-seo.html
Assuming we don't use an IP address that has been blacklisted by Google for spamming or other black hat tactics, how problematic is it? (Note: The site hasn't really been aggressively optimized yet - I started with the company less than two weeks ago, and just barely got FTP and CMS access yesterday - so honestly I'm not too worried about really messing up the site's optimization, since there isn't a lot to really break.)
Technical SEO | ufmedia
-
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from getting indexed by all search engines? One thing I cannot use is the meta "nofollow" tag. Thanks! - Kyle
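One option often suggested for this (a sketch for Apache with mod_headers, placed in the subdomain's own .htaccess or vhost config; treat it as an illustration rather than the only approach) is to send a noindex directive as an HTTP header for everything the subdomain serves:

# Ask crawlers not to index (or follow links from) anything on this subdomain.
Header set X-Robots-Tag "noindex, nofollow"

Unlike a robots.txt Disallow, this still lets crawlers fetch the pages, so they can actually see the directive and drop the URLs from their indexes.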
Technical SEO | kchandler
-
Why does Google index my IP URL?
Hi guys, a question please. If you search site:112.65.247.14, you can see Google has indexed our website's IP address, which could duplicate the content of our darwinmarketing.com pages. I am not quite sure why Google indexes the IP pages as well as the domain pages; I understand this could be because of backlinks, internal links and so on, but I don't see any obvious issues there. I have also submitted a request to the Google team to remove the IP address from the index, but with no luck so far. Do you have any other suggestions on this? I was trying to do a change of address setting in Google Webmaster Tools, but it wasn't allowed as it said "Restricted to root level domains only". Any ideas? Thank you! Boson
Technical SEO | DarwinChinaSEO