Shared Hosting vs. VPS Hosting
-
From an SEO perspective, what are the advantages of VPS hosting vs. shared hosting for a local website that has fewer than 200 pages and gets at most 2,000 hits per month?
Is VPS Hosting worth the extra expense for a local Real Estate Website?
-
What is the site built on? If WordPress, check out WP Engine (no affiliation). I just saw this post and wanted to respond, as I had tons of downtime when I migrated a site to a shared environment on another host. The site kept going down no matter what I tried with support. You can imagine the frustration, and how that can hurt the business, not just from a search perspective. I think the question here is how much stability you need. Does shared hosting work? Sure. But if you are putting something that is the core of your business on a $4.95/month host, then you get $4.95 worth of insurance.
-
The 24-48 hour time frames are highly overstated. There are numerous methods of transferring a site with no downtime. Even if you do experience downtime, it is typically a couple of hours (locally) during which some users reach the old site while others reach the new one. If you have a relatively static site serving a local area, such as a local realty site, the issue is minimal.
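To make that concrete, one common zero-downtime approach (one method among several) is to lower your DNS TTL a few days in advance, run identical copies of the site on both the old and new servers in parallel, and then update the name servers. While the change propagates, a small script like the one below - a sketch with hypothetical IP addresses - can tell you which server a given resolver is currently sending visitors to:

```python
import socket

OLD_IP = "203.0.113.10"   # hypothetical IP of the old shared host
NEW_IP = "198.51.100.20"  # hypothetical IP of the new host

def check_propagation(hostname: str) -> str:
    """Resolve the domain and report which server DNS currently points to."""
    resolved = socket.gethostbyname(hostname)
    if resolved == NEW_IP:
        return f"{hostname} -> {resolved} (new host: this resolver has the update)"
    if resolved == OLD_IP:
        return f"{hostname} -> {resolved} (old host: still propagating)"
    return f"{hostname} -> {resolved} (unexpected address: double-check your DNS records)"

if __name__ == "__main__":
    print(check_propagation("example.com"))
```

Because both servers are serving the same content during the overlap, visitors see no outage regardless of which IP their resolver returns.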
Here is the bottom line: your current site is hosted on a shared server so restrictive that you can't even add a 301 redirect. To be candid, a site that has operated under those conditions cannot reasonably be so hyper-sensitive to the short transition period of a simple server change. If I am mistaken, then you need to hire a professional developer to manage the migration on your behalf.
-
I was also wondering, what is your opinion on the rankings impact of changing hosting providers?
If done correctly, there is no negative impact at all. The primary risk is during the switchover itself.
Duplicate content and split links should not be a factor. Those issues can occur when URLs change, and that does not happen if you are simply changing hosts. The same applies to the www vs. non-www and index.html issue. If these problems are not occurring now, and you properly move all files associated with your site, and your new server is properly configured, the items you mentioned will not be considerations.
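If you want to confirm this after the move, a quick check - a sketch assuming the Python requests library and a hypothetical canonical URL - is to request the common duplicate variants and verify that each answers with a 301 to the canonical home page:

```python
import requests

CANONICAL = "https://www.example.com/"  # hypothetical canonical home page

# Variants that commonly raise duplicate-content concerns
variants = [
    "https://example.com/",                # non-www version
    "https://www.example.com/index.html",  # index.html version
]

for url in variants:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location == CANONICAL:
        print(f"OK    {url} -> 301 -> {location}")
    else:
        print(f"CHECK {url}: status {resp.status_code}, Location {location!r}")
```

If every variant redirected correctly before the move, the same results afterward tell you the new server is configured properly.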
-
I have reached out to several hosting companies, and they stated that my site would be down for 24 to 48 hours if I transferred it to their service.
I am sure that having the site down would impact my rankings. They said the downtime is due to updating the domain name servers.
Is there any way to prevent downtime while transferring a site?
-
Thank you for your answer. I was also wondering, what is your opinion on the rankings impact of changing hosting providers?
I ask this because my current web host doesn't allow me to set up a 301 redirect.
I have heard that changing providers can negatively impact search rankings because of the change of IP address.
Note: my concern is duplicate content and split links to my site, from the www and non-www versions of the home page, along with the index.html version.
-
Is VPS Hosting worth the extra expense for a local Real Estate Website?
In short, absolutely. Longer answer below.
Shared hosting typically involves hundreds of websites on a single server; it is the lowest tier of hosting offered. It is very common to have all types of bad sites (porn sites, mail spammers, etc.) on the same server as your real estate site. Server outages are common, along with numerous other issues such as having your mail server flagged for spam.
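You can get a rough sense of this yourself: on shared hosting, many unrelated domains resolve to the same IP address as your site. As a sketch (with hypothetical domain names), something like this will flag suspected neighbors from a list of candidate domains; a reverse-IP lookup service can produce a fuller list:

```python
import socket

MY_SITE = "example.com"  # hypothetical: your own domain
# Hypothetical list of domains you suspect share the server
candidates = ["example.org", "example.net"]

my_ip = socket.gethostbyname(MY_SITE)
print(f"{MY_SITE} resolves to {my_ip}")

for domain in candidates:
    try:
        ip = socket.gethostbyname(domain)
    except socket.gaierror:
        continue  # domain did not resolve; skip it
    if ip == my_ip:
        print(f"{domain} shares the same IP ({ip}) - likely a neighbor on the shared server")
```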
A VPS typically divides a server's resources, like a pizza, into 8-12 "slices". You have dedicated resources assigned to your site, along with more than a 90% reduction in the user population sharing the machine. If you care at all about SEO, which you apparently do based on your presence at SEOmoz, you should consider a VPS at a minimum for site hosting.
I have worked with clients in the past who tried shared hosting with quality hosts. I worked with those hosts to move the sites to other shared servers which they deemed more mature and stable. The sites still experienced monthly outages. Typical site owners are not even aware of these outages, so as part of a solid SEO plan you should use a monitoring service to notify you of any downtime. I use Alertra, but there are many similar services available.
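If you'd rather roll your own for a single small site, a bare-bones uptime check can run from cron every few minutes. This is a sketch, assuming the Python requests library, a local mail relay, and hypothetical addresses:

```python
import smtplib
from email.message import EmailMessage

import requests

SITE = "https://www.example.com/"  # hypothetical: the site to watch
ALERT_TO = "you@example.com"       # hypothetical alert address

def site_is_up(url: str) -> bool:
    """Return True if the site answers with a non-error status within 10 seconds."""
    try:
        return requests.get(url, timeout=10).status_code < 400
    except requests.RequestException:
        return False  # timeout, DNS failure, connection refused, etc.

def send_alert(url: str) -> None:
    """Email a downtime alert via a local SMTP relay (assumed to exist)."""
    msg = EmailMessage()
    msg["Subject"] = f"DOWN: {url}"
    msg["From"] = ALERT_TO
    msg["To"] = ALERT_TO
    msg.set_content(f"{url} failed an uptime check.")
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    if not site_is_up(SITE):
        send_alert(SITE)
```

That said, a hosted service is still the better option in practice: a monitor running on the same box or network as the site won't see failures from the outside.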
-
For a site of that size and traffic volume I think you'll be fine with shared hosting - as long as you choose a reputable, quality host - which can be difficult when it comes to choosing a shared hosting environment!
I hesitate to recommend specific hosting companies because I've had both good and bad experiences with a number of hosts. Do your research and find a host with great support, reputation and standards.