Using GeoDNS across 3 server locations
-
Hi,
I have multiple servers across the UK and USA. I have a website that serves both areas, and I was looking at cloning my sites and using GeoDNS to route visitors to the closest server to improve speed and user experience.
So UK visitors would connect to the UK dedicated server, North American visitors to the New York server, and so on.
Is this a good approach, or would it affect SEO negatively?
Cheers
Keith
-
Hi Keith,
I meant the physical bandwidth - i.e. your time. I probably should've been clearer in a technical forum!
For the architecture, there are a few common setups. What I am in the middle of doing at my company uses Google Cloud services: duplicating the website app or script (e.g. WordPress, Ghost, Drupal, another CMS, a Python app, a Rails app) across several servers and using a load balancer to route each visitor to the fastest server. In the app's configuration I point every server at a single database server, also set up on Google Cloud, so when one server executes a write, it is reflected for all users on all servers. If you're cron-jobbing all the servers you have set up but have no common database, you're going to have integrity issues, with some servers having certain comments or edits and other servers not.
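To make the integrity point concrete, here is a toy Python sketch. The `Server` class and comment stores are invented for illustration; no real CMS works this way, but the failure mode is the same:

```python
# Two app servers that each keep their own comment store drift apart
# between sync jobs, while servers sharing one database always agree.

class Server:
    def __init__(self, db):
        self.db = db  # this server's comment store (a shared or private list)

    def add_comment(self, comment):
        self.db.append(comment)

# Separate databases: a comment posted via the UK server is invisible
# on the US server until the next cron sync runs.
uk_db, us_db = [], []
uk, us = Server(uk_db), Server(us_db)
uk.add_comment("hello from London")
print(us.db)  # prints [] -- the integrity gap until the next sync

# Shared database: every server sees the write immediately.
shared_db = []
uk, us = Server(shared_db), Server(shared_db)
uk.add_comment("hello from London")
print(us.db)  # prints ['hello from London']
```

With one database behind every app server, replication only has to cover code and static files, never user data.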
-
Hi,
I have quite a lot of servers dotted around the UK and USA, so hosting and bandwidth are no big issue. If I host solely in the UK, ping times are a whopping 100ms+ from the USA and vice versa, so this leads me to host in at least both countries; latency will be 10-20ms and TTFB nice and low.
I like the idea of creating and maintaining one main site, as it will all be English-based, and any backlinks will always point to the .com rather than being split across multiple domains. SEO-wise I'm not too bothered; I'll be focusing on speed and giving people the info they're looking for. To me this is more important than the rest.
All servers are cPanel-based, so I will try to find a solution to replicate sites in real time or at cron-based intervals. This will be the next challenge.
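For what an interval-based sync has to do, here is a rough Python sketch: build a manifest of file hashes and diff it against the mirror's, so only changed files get pushed. In practice a cron'd `rsync` between the cPanel docroots does essentially this; the paths below and the push step itself are placeholders:

```python
# Hedged sketch of cron-interval replication: find which files differ
# between the primary docroot and a mirror's last-known manifest.
import hashlib
from pathlib import Path

def manifest(root: Path) -> dict:
    """Map each file's relative path to an MD5 digest of its contents."""
    return {
        str(p.relative_to(root)): hashlib.md5(p.read_bytes()).hexdigest()
        for p in root.rglob("*")
        if p.is_file()
    }

def changed_files(primary: dict, mirror: dict) -> list:
    """Files that are new or modified on the primary vs the mirror."""
    return sorted(
        path for path, digest in primary.items()
        if mirror.get(path) != digest
    )

# From cron, you would compute manifest(Path("/home/site/public_html")),
# diff against the mirror's manifest, and push only changed_files(...).
```

Note this only keeps static files and code in step; it does nothing for database content, which is why the shared-database setup discussed above matters.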
If I can pull this off it will be great for other sites I have too
Regards
Keith
-
Personally, I would use the one domain. And from what you've said, you would prefer it as well.
Thankfully, rankings are on a domain basis and not an IP basis, so there would be no issue in the first scenario. If you are duplicating and synchronizing the servers, you are better off using the one domain because you aren't creating two separate websites with differing content (UK English vs US English).
Do you have the bandwidth or ability to produce separate versions (for each domain) for each area you want to target? If not you are best off generalizing your website to target all English users instead of en-US, en-GB, etc. You're going to have to evaluate your geotargeting goals and budget.
-
Hi,
Many thanks for your input.
I was planning to use ClouDNS GeoIP to send visitors to the server of their region.
So option one is having one website - www.xyz.com - duplicated across three servers (locations), so all visitors see the same site. This would maintain the backlinks, and no matter whether Google crawls from the USA or UK it will see it as one domain, just with three IPs in use. Option two is having www.xyz.com and www.xyz.co.uk as duplicates, setting this in Google Webmaster Tools, plus setting the languages en-US and en-GB.
Not sure which is the best solution. www.xyz.com has the most backlinks and DA, whereas www.xyz.co.uk has zero and will be new to the world.
I would rather people generate backlinks for the one domain as well
Your thoughts are welcome
Regards
Keith
-
GeoDNS works through one of two methods, split DNS or load balancing, but the end result is the same: the user is directed to their closest or fastest available server.
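As a minimal sketch of the idea, one hostname maps to several A records and the answer is chosen by the client's region. A provider like ClouDNS does this at the DNS layer, not in your app; the IPs and region table below are invented for illustration:

```python
# Toy GeoDNS resolution: one hostname, region-dependent answers.
# IPs are from documentation ranges and purely illustrative.

SERVERS = {
    "EU": "203.0.113.10",   # UK dedicated server
    "NA": "198.51.100.20",  # New York server
}
DEFAULT = "203.0.113.10"    # fallback when the client's region is unknown

def resolve(hostname: str, client_region: str) -> str:
    """Return the A record a GeoDNS service would hand this client."""
    return SERVERS.get(client_region, DEFAULT)

print(resolve("www.xyz.com", "NA"))  # prints 198.51.100.20
print(resolve("www.xyz.com", "EU"))  # prints 203.0.113.10
```

Crucially, the domain and the content stay identical everywhere; only the answering IP changes, which is why rankings (tracked per domain) are unaffected.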
Theoretically, this helps achieve a major goal of technical SEO: great site speed.
With this year's Google Core Web Vitals update, site speed and user experience have been further emphasized as ranking factors. To get more technical: LCP (Largest Contentful Paint), the time until the largest element on a page renders, and FCP (First Contentful Paint), the time until the first content is painted on screen, are site-speed signals used in Google's ranking algorithm. By connecting a user to the closest/fastest available server, you bring down LCP and FCP and can thereby improve your rankings. The rank change may not be immediately noticeable, depending on the competitiveness of your keywords and industry. You can measure these and other metrics here: https://developers.google.com/speed/pagespeed/insights/
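PageSpeed Insights measures LCP and FCP in a lab browser, but for a quick first-order comparison of your servers you can time time-to-first-byte yourself. A rough sketch (the URLs are placeholders, and TTFB is not a substitute for real LCP/FCP measurement):

```python
# Rough TTFB timer: time from issuing the request until the first
# response byte arrives. Compare the same page served from each location.
import time
import urllib.request

def ttfb_seconds(url: str) -> float:
    """Seconds from request start until the first response byte."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)  # reading one byte forces the wait for the first byte
    return time.monotonic() - start

# e.g. run from a UK vantage point and a US one:
# ttfb_seconds("http://uk-server.example/")
# ttfb_seconds("http://ny-server.example/")
```

Repeating this from vantage points in each target region shows roughly how much the GeoDNS routing is saving versus a single-location host.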
In short: No, your SEO won't be negatively impacted, and it will more likely be positively impacted by these optimizations.