Using a third party server to host site elements
-
Hi guys -
I have a client who has recently been experiencing a great deal more traffic to their site. As a result, their web development agency has given them a server upgrade to cope with the new demand.
One thing they have also done is move all website scripts, CSS files, images and downloadable content (such as PDFs) onto a third-party server (Amazon S3). Apparently this was done so that my client's server now just handles the page requests, while all other elements are grabbed from the Amazon S3 server. So basically, the HTML content and web pages are still hosted on my client's domain, but all other content is served from an Amazon S3 URL.
I'm wondering what SEO implications this will have for my client's domain. While all pages and HTML content are still accessible through their domain name, each page is of course now making many requests to the Amazon S3 server through external URLs (s3.amazonaws.com).
I imagine this means any elements sitting on the Amazon S3 server can no longer contribute value to the client's SEO profile, because that content is no longer physically part of their domain. What I am more concerned about, though, is whether all of these external server calls will have a negative effect on the web pages' value overall. Should I be advising my client to ensure all site elements are hosted on their own server, so that everything is accessible through their domain?
Hope this makes sense (I'm not the best at explaining things!)
-
Hello Zeal Digital,
I use a CDN (Content Delivery Network) for images, CSS and JavaScript.
Doing that adds only about $10 per month in cost for a site that gets around 800,000 pageviews per month.
You have complete control over the images. If there is a problem, you can force the CDN to flush a file and reload it from the source. You add rules to your .htaccess file that tell the CDN how long to store images before flushing them and getting a new copy. It is all automated, and there is generally no work for you to do. I host with softlayer.com and this is part of their service.
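To give an idea of what those .htaccess rules look like, here's a minimal sketch using Apache's mod_expires - the file types and lifetimes are purely illustrative, and it assumes mod_expires is enabled on the origin server. The CDN (and browsers) then honour these cache lifetimes:

```apache
# Illustrative cache lifetimes - adjust per asset type
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 30 days"
  ExpiresByType image/png "access plus 30 days"
  ExpiresByType text/css "access plus 7 days"
  ExpiresByType application/javascript "access plus 7 days"
</IfModule>
```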
The change from self-hosting images, CSS and scripts made a massive improvement on the server - a 16-processor Linux box with twin 15,000 rpm SCSI drives and 12 GB of RAM, so it is quite fast!
Page delivery times improved by 1-2 seconds.
The server is now so lightly loaded that it could be downgraded to save even more money.
It has zero effect on SEO. The CDN is accessed using a CNAME - static.domain.com - so don't worry about it looking like components come from somewhere else.
The CDN has servers all over the world, so no matter where the visitors are, it is only a few hops for them to get most of the content, making it much faster for someone in Australia who would normally pull images from a server in the USA.
Your only problem with Amazon S3 is that it has gone down a few times, but other than that, it is a good thing to do.
I wouldn't advise them to self-host, unless you want to increase their costs, server loading and page delivery times.
-
Great advice, cheers Jeffery!
-
I work with a number of high-traffic sites (terabytes of data each day, tens of millions of page views per month). On many of these sites, we have offloaded static content either to dedicated static content servers (typically cloud-based so we can scale up and down) or to content delivery networks. I've not had anyone report any SEO impact.
If anything, they often see user engagement (page views per user), repeat visitors, and other traffic metrics improve. Users like fast sites. Google apparently likes fast sites too, so while I've not seen it myself, you could actually get a boost in the SERPs from faster-loading pages.
If you break down a modern web page, you will find it requires numerous elements: dozens of images, CSS and JavaScript files, as well as the page itself. Every one of these items requires a request to the web server.
On some graphics-intensive sites, I've seen as much as 95% of all web server requests (HTTP requests) be attributable to static content. By moving those HTTP requests to other systems, you free your primary server to handle the application itself. This provides a better user experience and improves scalability.
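As a simple illustration of what that offloading looks like in the page source (the hostnames here are hypothetical), the page itself still comes from the main domain while every static asset reference points at the static/CDN host:

```html
<!-- Page served from www.example.com; each asset below is requested
     from the static/CDN host instead of the application server -->
<link rel="stylesheet" href="https://static.example.com/css/site.css">
<script src="https://static.example.com/js/app.js"></script>
<img src="https://static.example.com/img/hero.jpg" alt="Hero image">
```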
Content Delivery Networks
I do not use Amazon Web Services, so I do not know specifically what they offer. But here are two CDNs I have used with good success:
Internap: http://www.internap.com/cdn-services-content-delivery-network/
Edgecast:
One feature I look for is called "origin pull." With this method, you do not have to upload files to the CDN; the CDN fetches them automatically from your site as needed. I've found this much easier to manage on sites that have frequent content updates.
-
Hosting images externally has never had any impact in the cases I've had a chance to observe. The only problems I can think of are that you lose some control over loading times, and that somebody might take an image and link to (credit) the image-hosting domain instead of your domain.
-
Couple of notes for you
- There isn't any SEO impact from WHERE the data is loaded from. Look at any major website (especially one that ranks well) and you'll see they openly use content delivery networks (Akamai, Amazon S3/CloudFront, etc.) for static content. This is good business practice because it takes that load off your web server and often places the content closer to the client. Faster content delivery can even help SEO if you currently have a slow server.
- If they're using raw S3 buckets, I would HIGHLY suggest signing up for CloudFront. There are two benefits to doing this. First, you put the content onto Amazon's delivery network, where it is more readily available. Second, you can use domain aliasing to help obscure the source. For instance, let's say you have an images bucket. You could add a CNAME DNS record for images.yourdomain.com and then use that hostname in your source code (a sketch of the record is shown below). Anyone can still see where the DNS points, but it's not obvious to the general public. The cost difference between raw S3 delivery and CloudFront is negligible.
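To make the domain aliasing concrete, the DNS side is a single record. The CloudFront distribution hostname below is hypothetical - you'd use whatever hostname CloudFront assigns to your distribution, and the distribution also needs the alternate domain name added in its settings:

```
; hypothetical zone entry: friendly hostname -> CloudFront distribution
images.yourdomain.com.   IN   CNAME   d111111abcdef8.cloudfront.net.
```

Your page source then references https://images.yourdomain.com/... instead of the raw s3.amazonaws.com URLs.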
Oh, and I use Amazon CloudFront for my delivery. Never had any SEO issues with doing so.
-
I don't recommend putting the resources and the database on a different server from the files. It creates extra traffic between the servers, the resources are harder to load, and the site's optimum speed drops. You also can't compress that content together, so each item is downloaded independently.