Same server for different client sites?
-
Hi everyone - I have a question about whether it's OK for us to host several of our clients' websites on the same dedicated web server without it causing SEO problems. I know the issues around duplicate content etc., but for background: we provide website services to a particular sector (antiques/auctions). All our clients are distinct and have written their own copy, but because they're all in the same sector, their websites will largely talk about the same types of things - so the content isn't duplicated, but it's similar in topic, I guess. Does anyone feel it would cause a problem if we put several (say about 8) of our clients' websites on the same dedicated web server, or would we be better off spreading the sites over different shared servers? Come to think of it, if we spread those same 8 sites across 4 virtual servers - but all hosted by the same company - presumably Google would know that too?
Thanks in advance for your thoughts on this!
Nikki
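As an aside on the "presumably Google would know" point: the one piece of hosting information that is trivially public is the IP address each domain resolves to. A minimal sketch of checking that yourself (the domain names here are hypothetical placeholders - substitute your clients' actual domains):

```python
import socket

# Hypothetical client domains - replace with the real ones.
domains = [
    "client-antiques-one.example",
    "client-auctions-two.example",
]

for domain in domains:
    try:
        ip = socket.gethostbyname(domain)
        print(f"{domain} -> {ip}")
    except socket.gaierror:
        print(f"{domain} -> could not resolve")
```

If several domains print the same IP, anyone (including a search engine) can see they share a server; spreading them across virtual servers at the same host usually just means different IPs in the same netblock, which is equally visible.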
-
Thinking about this further: Wix, for example, would have multiple sites on one server. The same underlying code runs a Wix website, but the content is different. This is similar to our scenario, although obviously we're not as big as Wix and we'll have fewer sites on the same server. But that's the situation - the same underlying CMS that clients use to 'build' their site, so some of the code/framework is shared, with each client adding their own content. There's no way around the fact that the code is the same, but the content is different - so that should be OK, right?!
-
I think that this is usually fine.
My only concern would be that common page. I would put enough work into it to make sure that it does not have the same structure, the same code, the same anything. Just to be safe.
-
Hi Paul - thanks for your thoughts. Good point about each site being hosted in a separate account.
-
Hi EGOL, thanks for your answer - much appreciated. There is absolutely no linking between the sites, and the sites do have their own substantive content - although certain pages exist on both sites. E.g. clients have a 'listing' of the items for sale, so both sites would feature this page (and the structure of the page would be the same), but obviously different items would be offered for sale. Do you think that sounds OK?
-
Agree with EGOL - very common procedure.
The one practical consideration you must take into account, though, is that each site should be hosted in a separate account on the server, so malware or a hack on one cannot contaminate the others. That's also best if individual clients will need hosting access to their own sites - otherwise, it can be difficult to keep clients away from the back-end of each other's sites.
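One way that per-account isolation is commonly done on an Apache server is a separate system user and document root per client, so a compromised site has no write access to its neighbours. A rough sketch, assuming Apache with the mpm-itk module; the account names, domains, and paths are made-up examples:

```apache
# Each client site runs as its own system user with its own
# document root, so one compromised account can't touch the others.
<VirtualHost *:80>
    ServerName client-one.example
    DocumentRoot /home/clientone/public_html
    AssignUserID clientone clientone
</VirtualHost>

<VirtualHost *:80>
    ServerName client-two.example
    DocumentRoot /home/clienttwo/public_html
    AssignUserID clienttwo clienttwo
</VirtualHost>
```

Control panels like cPanel set up essentially this structure for you when you create one hosting account per client.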
Paul
-
If these sites all have unique, substantive content and are not incestuously interlinking, then it is fine to host them on the same dedicated server.