Using GeoDNS across 3 server locations
-
Hi,
I have multiple servers across the UK and USA. I have a website that serves both areas, and I was looking at cloning my sites and using GeoDNS to route visitors to the closest server to improve speed and user experience.
So UK visitors would connect to the UK dedicated server, North American visitors to the New York server, and so on.
Is this a good approach, or would it affect SEO negatively?
Cheers
Keith
-
Hi Keith,
I meant your personal bandwidth, i.e. your time. I probably should have been clearer in a technical forum!
For the architecture, there are a few common setups. What I am in the middle of doing at my company runs on Google Cloud services: duplicating the website app or script (i.e. WordPress, Ghost, Drupal, another CMS, a Python app, a Rails app, etc.) across several servers and using a load balancer to direct each visitor to the fastest server. In the app's configuration I point every server at a single database server, also set up on Google Cloud, so when one server executes a write, it is reflected for all users on all servers. If you cron-sync all the servers you have set up but have no common database, you're going to have integrity issues, with some servers having certain comments or edits and some servers not.
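To make that concrete, here is a minimal sketch of the shared-database idea in Python (the hostname and credentials are hypothetical placeholders, and PyMySQL is just one client library you could use):

```python
# Every app server ships with the same configuration, pointing at one
# central database host instead of a local copy. Hostname and
# credentials below are hypothetical placeholders.
import pymysql

DB_CONFIG = {
    "host": "db.internal.example.com",  # the single shared database server
    "user": "webapp",
    "password": "change-me",
    "database": "site",
}

def get_connection():
    """Open a connection to the shared database.

    Because every app server (UK, New York, ...) talks to the same host,
    a comment posted through one server is immediately visible through
    all of them: no replication lag, no cron-syncing of data.
    """
    return pymysql.connect(**DB_CONFIG)
```

The tradeoff is that app servers far from the database pay that round trip on every query, which is why setups like this often add read replicas near each app server later on.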
-
Hi,
I have quite a lot of servers dotted around the UK and USA, so hosting and bandwidth are no big issue. If I host solely in the UK, ping times to the USA are a whopping 100ms+ and vice versa, so this leads me to hosting in at least both countries, where latency will be 10-20ms and TTFB nice and low.
I like the idea of creating and maintaining one major site, as all of it will be English based, and any backlinks will always point to the .com rather than being split across multiple domains. SEO-wise I'm not too bothered; I will be focusing on speed and entertaining people with info on what they are looking for. To me this is more important than the rest.
All servers are cPanel based, so I will try to find a solution to replicate the sites in real time or at cron-based intervals. This will be the next challenge.
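Something along these lines is what I have in mind; a rough sketch (hostnames and paths are made up, and it assumes SSH key access between the cPanel boxes):

```python
# Cron-driven one-way sync of the site files from the master server.
# Hostnames and paths are made up; assumes rsync is installed and SSH
# key authentication is set up between the cPanel boxes.
import subprocess
import sys

MASTER = "master.example.com"
REMOTE_DOCROOT = "/home/site/public_html/"
LOCAL_DOCROOT = "/home/site/public_html/"

def pull_site():
    """Mirror the master's document root onto this server.

    --archive preserves permissions and timestamps, --delete removes
    files deleted on the master, --compress saves transatlantic
    bandwidth. Note this only syncs files; dynamic data still needs a
    shared database or separate replication.
    """
    result = subprocess.run(
        ["rsync", "--archive", "--delete", "--compress",
         f"{MASTER}:{REMOTE_DOCROOT}", LOCAL_DOCROOT],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        sys.exit(f"rsync failed: {result.stderr}")

if __name__ == "__main__":
    pull_site()
```

Each mirror would then run it from a cron entry every few minutes, e.g. `*/15 * * * * /usr/bin/python3 /home/site/pull_site.py`.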
If I can pull this off, it will be great for other sites I have too.
Regards
Keith
-
Personally, I would use the one domain. And from what you've said, you would prefer it as well.
Thankfully, rankings are on a domain basis and not an IP basis, so there would be no issue in the first scenario. If you are duplicating and synchronizing the servers, you are better off using the one domain because you aren't creating two separate websites with differing content (UK English vs US English).
Do you have the bandwidth or ability to produce separate versions (for each domain) for each area you want to target? If not, you are best off generalizing your website to target all English users instead of en-US, en-GB, etc. You're going to have to evaluate your geotargeting goals and budget.
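If you did produce separate versions per domain, hreflang annotations are the standard way to tell Google which variant targets which audience. A quick sketch that generates the tags (the domains are the hypothetical ones from this thread):

```python
# Generate the hreflang <link> tags each regional variant should carry.
# The domains mirror the hypothetical ones discussed in this thread.
VARIANTS = {
    "en-us": "https://www.xyz.com/",
    "en-gb": "https://www.xyz.co.uk/",
    "x-default": "https://www.xyz.com/",  # fallback for everyone else
}

def hreflang_tags(path=""):
    """Return the <link rel="alternate"> tags for a page at `path`."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{base}{path}" />'
        for lang, base in VARIANTS.items()
    )

print(hreflang_tags("about/"))
```

Each variant's head carries the full set of alternates, including a self-reference and an x-default fallback.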
-
Hi,
Many thanks for your input.
I was planning to use ClouDNS GeoIP to send visitors to the server for their region.
So, option one: have one website, www.xyz.com, duplicated across three servers (locations), so everyone sees the same site. This would maintain the backlinks, and whether Google crawls from the USA or the UK, it will see one domain, just with three IPs in use. Option two: have www.xyz.com and www.xyz.co.uk as duplicates, set this up in Google Webmaster Tools, and set the languages to en-US and en-GB.
Not sure which is the best solution. www.xyz.com has the most backlinks and DA, whereas www.xyz.co.uk has zero and will be new to the world.
I would rather people generate backlinks for the one domain as well
Your thoughts are welcome
Regards
Keith
-
GeoDNS works through one of two methods: split DNS or load balancing. The end result is the same: the user is directed to their closest or fastest available server.
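If you want to sanity-check that routing once it's live, one option is to send test queries with an EDNS Client Subnet (ECS) option so the DNS answers as if the query came from a client in a given region. A sketch using the dnspython library (the domain and probe subnets are hypothetical, and it only works where the resolver and the GeoDNS provider honour ECS):

```python
# Send a DNS query with an EDNS Client Subnet (ECS) option so the
# answer reflects what a client in that network would receive.
# Domain and probe subnets are hypothetical; requires `pip install dnspython`,
# and only works where the resolver and GeoDNS provider honour ECS.
import dns.edns
import dns.message
import dns.query
import dns.rdatatype

PROBES = {
    "UK client": "212.58.244.0",  # example UK-allocated range
    "US client": "8.8.8.0",       # example US-allocated range
}

def resolve_as(qname, client_ip, resolver="8.8.8.8"):
    """Return the A records GeoDNS would hand to a client near client_ip."""
    ecs = dns.edns.ECSOption(client_ip, 24)  # announce a /24 around the probe IP
    query = dns.message.make_query(qname, "A", use_edns=0, options=[ecs])
    response = dns.query.udp(query, resolver, timeout=5)
    return [
        item.address
        for rrset in response.answer
        if rrset.rdtype == dns.rdatatype.A
        for item in rrset
    ]

for label, ip in PROBES.items():
    print(label, "->", resolve_as("www.xyz.com", ip))
```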
Theoretically, this helps achieve a major goal of technical SEO: great site speed.
With this year's Google Core Web Vitals update, site speed and user experience have been further notched up as ranking factors. To get more technical: LCP (Largest Contentful Paint) measures how quickly the largest content element on a page renders, and FCP (First Contentful Paint) measures how quickly the first visible content appears on screen. Both are site-speed signals used in Google's ranking algorithm. By connecting a user to the closest/fastest available server, you bring LCP and FCP down and can thereby improve your rank. The rank change may not be immediately noticeable, depending on the competitiveness of your keywords and industry. You can measure these and other variables here: https://developers.google.com/speed/pagespeed/insights/
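If you'd rather script the before/after comparison across your three locations, the same numbers are exposed through the PageSpeed Insights v5 API. A sketch (the page URL is the hypothetical one from this thread; an API key is only needed for heavier use):

```python
# Fetch lab FCP and LCP for a page from the PageSpeed Insights v5 API.
# The page URL is the hypothetical one from this thread; an API key is
# only needed for heavier use. Requires `pip install requests`.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def core_vitals(url, strategy="mobile"):
    """Return the lab FCP and LCP values that PageSpeed Insights reports."""
    resp = requests.get(
        PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60
    )
    resp.raise_for_status()
    audits = resp.json()["lighthouseResult"]["audits"]
    return {
        "FCP": audits["first-contentful-paint"]["displayValue"],
        "LCP": audits["largest-contentful-paint"]["displayValue"],
    }

print(core_vitals("https://www.xyz.com/"))
```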
In short: No, your SEO won't be negatively impacted, and it will more likely be positively impacted by these optimizations.