Multiple IPs (load balancing) for same domain
-
Hello,
I'm considering moving our main website to multiple servers, possibly in several different datacenters, and using DNS round-robin load balancing by assigning the domain 4 different IP addresses (probably from 4 different class C subnets).
example:
ourdomain.com A 1.1.1.1
ourdomain.com A 2.2.2.2
ourdomain.com A 3.3.3.3
ourdomain.com A 4.4.4.4
Every time you ping the domain you get a response from a different IP in the group, so search engines will see a different IP each time they crawl the site.
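For illustration, here is a minimal Python sketch (standard library only) of what such a round-robin record set looks like from the client side. The domain and the 1.1.1.1–4.4.4.4 addresses are just the placeholders from the example above, and whether the answer order actually rotates depends on the resolver:

```python
import socket
from collections import Counter

def resolve_all(hostname):
    """Return every IPv4 address currently published for the hostname."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    # Each entry is (family, type, proto, canonname, sockaddr);
    # for IPv4, sockaddr is (address, port).
    return sorted({info[4][0] for info in infos})

def first_answer_counts(hostname, lookups=20):
    """Count which address comes back first over repeated lookups.

    With round-robin DNS the A records are typically rotated, so the
    first answer should vary between lookups; resolver caching and
    client-side address sorting can hide the rotation."""
    counts = Counter()
    for _ in range(lookups):
        infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
        counts[infos[0][4][0]] += 1
    return counts

if __name__ == "__main__":
    host = "ourdomain.com"  # placeholder domain from the example above
    print("Published A records:", resolve_all(host))
    print("First-answer counts:", dict(first_answer_counts(host)))
```

Note that many stub resolvers cache and re-sort answers, so the rotation is often only visible when querying the authoritative name servers directly.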
We have used the same main IP for our website for the past 6 years without changing it. We have quite good SEO in our niche, which I don't want to lose, of course.
My question is: will adding more IPs to the domain affect the ranking in any way? What is the suggested way to do it? What is recommended to do before and after the change?
Thanks for your attention and help in advance.
Dmitry S.
-
No way. Google may care about DNS repoints, but that's something else entirely.
Google barely cares about shared IPs and 'bad neighborhoods' at this point - so I doubt the IP address matters. Plus, we have several clients using hardware load balancers that do exactly what you describe, and it doesn't hurt them.
-
As far as I know (according to my colleagues' opinions), rank is also somehow linked to the IP address, and from what I've been told, it takes some time until Google learns a new IP address and returns the rank to the domain/page.
-
SEO is linked to your domain name, not the IP address. Consider these two scenarios.
1: You move your website to a new hosting company. Your website will obviously get a new IP. Google has stated that this does not affect your SEO.
2: You have your site on shared hosting. There could then be thousands of sites with the same IP, and they clearly don't all share the same SEO value.
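To illustrate scenario 2, here is a hedged Python sketch that resolves a handful of domains and groups them by IP. The domain list is purely hypothetical; on shared hosting many unrelated sites can land in the same group without sharing any ranking signal:

```python
import socket
from collections import defaultdict

def group_by_ip(domains):
    """Resolve each domain and group the names by the IPv4 address they share."""
    groups = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue  # skip domains that fail to resolve
        groups[ip].append(domain)
    return dict(groups)

if __name__ == "__main__":
    # Hypothetical domains used only as an example.
    sample = ["example.com", "example.net", "example.org"]
    for ip, names in group_by_ip(sample).items():
        print(ip, "->", ", ".join(names))
```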
-
We work with load balancing using multiple IPs a lot, and there are no SEO issues. However, you must be certain that those IPs were not associated with spam in the past and are not on any blacklists from previous users, as in that case you can get a red flag for a bad neighborhood.
In my personal experience this is the only downside. If the IPs are clean, there is no reason for concern - only good things can happen.
Hope it helps.
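One practical way to run the blacklist check mentioned above is a DNSBL lookup: reverse the octets of the IP and query them against a blocklist zone, where an NXDOMAIN answer means the address is not listed. Below is a minimal Python sketch that assumes Spamhaus ZEN as the example zone and reuses the placeholder IPs from the question; note that some blocklists refuse queries routed through large public resolvers, so results can vary:

```python
import socket

def is_listed(ip, dnsbl="zen.spamhaus.org"):
    """Return True if the IPv4 address appears on the given DNS blocklist."""
    reversed_ip = ".".join(reversed(ip.split(".")))
    query = "{}.{}".format(reversed_ip, dnsbl)
    try:
        socket.gethostbyname(query)  # any A record back means the IP is listed
        return True
    except socket.gaierror:          # NXDOMAIN / no answer -> not listed
        return False

if __name__ == "__main__":
    # Placeholder IPs from the question above.
    for candidate in ["1.1.1.1", "2.2.2.2", "3.3.3.3", "4.4.4.4"]:
        print(candidate, "listed:", is_listed(candidate))
```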