Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Changing Server IP Addresses. Should I be concerned?
-
Hello Mozers
Our site has been on a dedicated server for about four years now (no other sites, just ours on the server).
For more than one reason, I have made the decision to move it to a much better and faster server than the one we are currently on.
My big fear is that Google will lose trust in my site because of the IP change. At 1and1, IPs stay with the server; they do not follow the website.
So I have done my due diligence: I copied over all code and databases and tested everything to ensure there are no issues when I change the DNS to point to the new server. I made sure 1and1 is giving me an IP that has never been used, and I am keeping the old server on until its cached DNS records expire. (A quick spot-check sketch of that kind of test is below.)
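One way to spot-check a copied site before flipping DNS is to request pages straight from the new server while sending the site's real Host header. A minimal Python sketch of that idea, assuming plain HTTP on port 80 (the IP and hostname below are placeholders, not real values):

```python
import http.client

NEW_SERVER_IP = "203.0.113.10"   # placeholder: the new server's IP
HOSTNAME = "www.example.com"     # placeholder: the site's real hostname

# Connect straight to the new box, but send the real Host header, so the
# response is exactly what visitors will see once DNS points here.
conn = http.client.HTTPConnection(NEW_SERVER_IP, 80, timeout=10)
conn.request("GET", "/", headers={"Host": HOSTNAME})
resp = conn.getresponse()
print(resp.status, resp.reason)
print(resp.read()[:200])  # eyeball the start of the page
conn.close()
```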
Is there anything else I need to do to make sure I do not lose current rankings in Google? I have heard nightmare stories about making these kinds of changes, but at this point there is no turning back; this is a change that must take place for our site.
Any pointers and advice would be much appreciated!
Thanks!
-
Hey Robbie,
Of course you're never entirely sure what Google will do, but if you're only changing host - nothing else - you should have no problem.
Do not:
- Change ownership of the domain;
- Make any major content changes (such as titles);
- Add large chunks of content - keep it to a minimum;
- Make any website template changes.
It's very important that all that's changing is the host. And of course, keep an eye on your rankings while doing the migration. Perhaps use an SEOmoz campaign for that. They also do crawl tests, so that should be good.
Good luck!
-
If you are only changing to a new hosting provider, you had a dedicated server with a dedicated IP, and the content will not be changed, there's not much to worry about at all. Google will not lose any trust in you because of an IP address change if you are changing to a whitelisted (clean) IP. The only ways you could actually hurt your site would be:
1st: If you moved from a dedicated server to a shared server and had a bad neighbor.
"Google recognizes the server's IP address. If the majority of websites are of ill-repute (porn sites are automatically marked as spammers), then unfortunately this law-abiding client gets lumped in with a bad crowd." Read more: http://online-sales-marketing.com/seo-issues-caused-by-bad-neighbors#ixzz22SZ2T5cA (under Creative Commons License: Attribution)
2nd: If you keep both sites up at the same time, you obviously get duplicate content. You want the new site indexed as soon as possible, so inform Google and allow Googlebot to crawl it; that way Google knows you are no longer on your old IP.
3rd: You could move to a slower host. I've noticed this isn't talked about often, but slow DNS and slow web hosting both play a role in how Google ranks your website. I hope that whatever deal you made, you are on a host that can provide the same or better speed at delivering your content. Obviously, if you lost a content delivery network, or happened to add one, those kinds of things matter to Google. You can check with tools like http://tools.pingdom.com/fpt/ or http://www.webpagetest.org. I tend to use the hosts SEOmoz recommends in their Pro Perks; you cannot go wrong with any of them.
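Those tools give the fullest picture, but for a rough before/after comparison you can also time things yourself. A quick sketch using only Python's standard library (the hostname is a placeholder):

```python
import socket
import time
import urllib.request

HOSTNAME = "www.example.com"  # placeholder domain

# Time the DNS lookup by itself.
t0 = time.perf_counter()
ip = socket.gethostbyname(HOSTNAME)
dns_ms = (time.perf_counter() - t0) * 1000

# Time a full page fetch (connect + transfer; DNS may be cached by now).
t0 = time.perf_counter()
with urllib.request.urlopen("http://" + HOSTNAME + "/", timeout=10) as resp:
    resp.read()
fetch_ms = (time.perf_counter() - t0) * 1000

print(f"{HOSTNAME} -> {ip}: DNS {dns_ms:.0f} ms, full fetch {fetch_ms:.0f} ms")
```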
4th: Make sure your new DNS is as good, if not better (it should be better if you're moving); this will keep speed up and problems to a minimum. Here is a list of hosted DNS providers: http://dns.nuvvo.com/lesson/12509-list-of-hosted-dns-providers. I use UltraDNS and Dyn. If you are looking for a provider with anycast DNS that doesn't cost much and still has fantastic speeds, Amazon Route 53 averages a couple of dollars a month and has an excellent reputation: http://aws.amazon.com/route53/
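During the cutover itself, it also helps to watch which IP different resolvers are handing out. Here is a small sketch assuming the third-party dnspython package (the domain and resolver choices are placeholders):

```python
import dns.resolver  # third-party: pip install dnspython

HOSTNAME = "www.example.com"                      # placeholder domain
PUBLIC_RESOLVERS = ["8.8.8.8", "208.67.222.222"]  # e.g. Google DNS, OpenDNS

# Query each resolver directly so local caches can't hide a stale record,
# and print the answer plus its remaining TTL.
for server in PUBLIC_RESOLVERS:
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [server]
    answer = resolver.resolve(HOSTNAME, "A")
    ips = sorted(rdata.address for rdata in answer)
    print(f"{server}: {ips} (TTL {answer.rrset.ttl}s)")
```

Once every resolver returns the new IP and the old TTLs have run out, it's safe to retire the old server.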
I hope I have been of some help. Just remember: people who don't have dedicated IPs rank extremely high regardless of the IP address changing.
Sincerely,
Thomas Zickell
-
Generally speaking, if you transition it correctly and have the exact same site up and running on the new IP before you change the DNS, you should be fine. I did some Googling on the subject, and Mark D. has a much more specific and detailed description of what you should do to make sure you have the exact same site running:
http://malteseo.com/seo/changing-ip-address-without-losing-google-ranking/
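To verify "the exact same site" in that spirit, one rough approach is to fetch the same URLs from both servers and compare the response bodies. A hedged Python sketch (the IPs, hostname, and paths are all placeholders, and dynamic pages with timestamps or session tokens will naturally differ):

```python
import hashlib
import http.client

HOSTNAME = "www.example.com"           # placeholder hostname
OLD_IP = "198.51.100.7"                # placeholder: current server
NEW_IP = "203.0.113.10"                # placeholder: new server
PATHS = ["/", "/about/", "/contact/"]  # sample pages to spot-check

def page_digest(ip, path):
    """Fetch a path from a specific server IP and return a hash of the body."""
    conn = http.client.HTTPConnection(ip, 80, timeout=10)
    conn.request("GET", path, headers={"Host": HOSTNAME})
    body = conn.getresponse().read()
    conn.close()
    return hashlib.sha256(body).hexdigest()

for path in PATHS:
    status = "MATCH" if page_digest(OLD_IP, path) == page_digest(NEW_IP, path) else "DIFFERS"
    print(f"{path}: {status}")
```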
What you do not want to do at this point is change your URL structure, title tags, etc. Those changes alone can impact your rankings, and you don't want to compound the issues. Less change, and more gradual change, is always better.