Moving to the cloud - dynamic or static IP address?
-
We're looking at moving our websites to the cloud. Most services seem to default to providing a dynamic IP address, with static IP addresses being offered as paid extras.
Is there an SEO disadvantage to having a dynamic IP address?
-
Thanks for the quick response Ryan!
We don't use SSL on these sites, and our emails go through an external software provider, so it looks like we can do without a static IP to begin with.
-
Is there an SEO disadvantage to having a dynamic IP address?
No.
If you use SSL, which is required for accepting payments, a static IP has traditionally been required, since each certificate needed its own address (modern servers that support SNI can share one IP across several certificates).
If I were to reach for a corner case where a dynamic IP could be a problem: it's possible you end up with an IP that was previously used by a "bad" site and has been blocked by e-mail filters or networks. Otherwise, there is no inherent disadvantage to a dynamic IP.
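If you want to vet an IP before pointing DNS at it, DNS-based blocklists can be queried with a plain reverse-octet lookup. A minimal sketch using the stdlib (zen.spamhaus.org is a real blocklist zone; the example IP is illustrative, and the actual lookup needs network access):

```python
import socket

def dnsbl_query_name(ip: str, zone: str = "zen.spamhaus.org") -> str:
    """Build the DNSBL query hostname: IPv4 octets reversed, zone appended."""
    return ".".join(reversed(ip.split("."))) + "." + zone

def is_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    """Return True if the blocklist answers with an A record for this IP."""
    try:
        socket.gethostbyname(dnsbl_query_name(ip, zone))
        return True   # any answer means the IP is listed
    except socket.gaierror:
        return False  # NXDOMAIN: not listed (or the lookup failed)
```

Running `is_listed()` against a freshly assigned cloud IP, before go-live, is a cheap way to catch the "inherited a bad reputation" corner case above.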
Related Questions
-
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
The whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that - for the home page - GoogleBot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of HTTP/2 access working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy

GSC reports the home page as correctly indexed (with the https://www. version canonicalised) but still has the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so. We totally understand it can take time to update the index, but we are at a complete loss to understand why GoogleBot continues to go through HTTP/1.1, not HTTP/2. A possibly related issue - and of course what is causing concern - is that new pages of the site seem to index and perform well in the SERPs... except the home page. It never makes it to page 1 (other than for the brand name), despite rating multiples higher in terms of content, speed, etc. than other pages which still get indexed in preference to the home page. Any thoughts, further tests, ideas, or direction will be much appreciated!
Technical SEO | | AKCAC1 -
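One way to double-check what the raw logs say is a small script that tallies the protocol versions Googlebot requests. A sketch assuming combined-format access logs (field layout varies by server configuration, and the user-agent match is illustrative):

```python
import re
from collections import Counter

# Combined log format: ... "GET /path HTTP/1.1" ... "referer" "user agent"
LINE_RE = re.compile(r'"(?:GET|HEAD|POST) \S+ (HTTP/[\d.]+)".*"([^"]*)"\s*$')

def googlebot_protocols(lines):
    """Count request-protocol versions used by Googlebot across log lines."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.search(line)
        if m and "Googlebot" in m.group(2):  # group 2 is the user agent
            counts[m.group(1)] += 1
    return counts
```

Run over a day's access log, this shows at a glance whether the home page's HTTP/1.1 hits are really all Googlebot, or whether other crawlers are mixed in.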
Two websites, one company, one physical address - how to make the best of it in terms of local visibility?
Hello! I have one company which will be operating in two markets: printing and website design / development. I'm planning on building two websites, one for each market. But I'm a bit confused about how to optimize these websites locally. My thought is to use my physical address for one website (build citations, get listed in directories, etc.) and a PO Box for the other. Do you think there is a better idea?
Technical SEO | | VELV1 -
Is this a Risky Blog Move?
I have a client who's thinking of placing their blog on a separate domain because the plug-ins and various other functionality are becoming bulky and slowing things down for the main site. There will be a 'Blog' link on the company's website navigation, just as there is now, that will take people to the blog. As an SEO person, this seems like a bad idea, even if we set up 301s from all the old posts to all the new ones. In my research I came across these two points:
- All backlinks to blog posts contribute directly to a website's OVERALL SEO strength, because those backlinks point to your main domain. Removing them may reduce overall link juice to the site.
- Simply having fewer content pages on the site will cause the entire site to rank lower, because Google loves content-rich authority sites.
Does anyone know this to be true for sure? Thanks,
Technical SEO | | Caro-O
~Caro0 -
Closed Address Google Local
While there are some older conversations pertaining to Google Local/Plus, I am not sure if this issue is a bit different. The company I work for at one time had two locations. Both were brick & mortar, physical locations. The factory closed several years ago. To my surprise, the old location is coming up in a few Google searches as a Google Plus page (I actually just located it toward the end of last week). It is currently unclaimed. There are a handful of citations out on the web as well. To remove the factory listing (the one we don't want, which I am pretty sure is confusing Google), what is the best approach? Remove/update citations for the old listing, and then claim it and suspend it using our Google Places account? It took a while to claim the listing we actually want, and I just want to be sure we handle removing the old one correctly. Any insight or advice is appreciated!
Technical SEO | | SEOSponge0 -
Moving most (not all) content to another domain
Hi there, My company website has 3 main sections; two of those sections (each containing approx. 50 pages) will be moving to a separate website. The new website will also be owned by the same company. The new domain does not yet exist. I read this guide http://www.seomoz.org/blog/seo-guide-how-to-properly-move-domains , it's very good; however, it refers to moving the whole domain to a new URL. Are there any specific differences to consider in my situation for a partial move? Many thanks in advance! Nigel
Technical SEO | | Richard5550 -
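The mechanics of a partial move are the same as a full one, applied only to the moved paths: every URL in the two departing sections needs a one-to-one 301 to its new home, while the remaining section stays untouched. A sketch of that routing decision (the section slugs and new domain are hypothetical placeholders):

```python
# Hypothetical slugs for the two sections that are moving off the main domain.
MOVED_SECTIONS = {"products", "services"}
NEW_DOMAIN = "https://new-site.example"

def partial_move_target(path: str):
    """Return the 301 target for a path in a moved section, else None (page stays)."""
    section = path.strip("/").split("/", 1)[0]
    if section in MOVED_SECTIONS:
        return NEW_DOMAIN + path  # issue a 301 to this URL
    return None
```

In practice the same mapping would be expressed as redirect rules in the web server or CMS; the point is that only the ~100 moved pages redirect, and each to its exact counterpart rather than the new homepage.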
How do we ensure our new dynamic site gets indexed?
Just wondering if you can point me in the right direction. We're building a 'dynamically generated' website, so basically, pages don't technically exist until the visitor types in the URL (or clicks an on-page link); the pages are then created on the fly for the visitor. The major concern I've got is that Google won't be able to index the site, as the pages don't exist until they're 'visited', and to top it off, they're rendered in JSPX, which makes it tricky to ensure the bots can view the content. We're going to build/submit a sitemap.xml to signpost the site for Googlebot, but are there any other options/resources/best practices Mozzers could recommend for ensuring our new dynamic website gets indexed?
Technical SEO | | Hutch_e0 -
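Since the pages only exist once requested, the sitemap becomes Googlebot's main discovery path: it should enumerate every canonical URL the application is able to generate. A minimal stdlib sketch of rendering such a sitemap (the URL list would come from whatever database or catalogue drives the dynamic pages):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Render a minimal sitemap.xml listing each canonical URL once."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for u in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = u
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))
```

Regenerating this on a schedule (and referencing it from robots.txt) gives crawlers a stable entry point to pages that otherwise only materialise on request.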
Is it Helpful to Add a Dynamic Blog Feed to Your Homepage?
Do you think it helps SEO significantly to add a feed on your homepage that shows a snippet of your latest blog posts? If yes, have you seen any results from doing this?
Technical SEO | | ProjectLabs0 -
Is there a penalty for linking to sites that are all hosted on the same IP address?
Hi... We're doing some reciprocal link building, and a gentleman has been kind enough to offer me several additional links for the exchange. All of them (5) are on the same IP address as one of his links to which we have already linked. They are in a related field of endeavor, legal websites. If I make the swap with him, is Google going to disregard, penalize, or otherwise marginalize my efforts? Thanks!
Technical SEO | | hornsbylaw0
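A quick way to see how prospective link partners cluster by host is to resolve each domain and group the results. A sketch (the resolver is injectable so it can be stubbed; the real `socket.gethostbyname` call needs network access, and the domains here are placeholders):

```python
import socket
from collections import defaultdict

def group_by_ip(domains, resolve=socket.gethostbyname):
    """Group domain names by resolved IPv4 address; failures go under None."""
    groups = defaultdict(list)
    for d in domains:
        try:
            groups[resolve(d)].append(d)
        except OSError:
            groups[None].append(d)  # could not resolve
    return dict(groups)
```

Any group with several members flags the shared-hosting situation described above, which is worth knowing before agreeing to the exchange.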