Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Does image domain name matter when using a CDN?
-
Has anyone done studies on using a different CDN domain name for images on a site? Here is an example: the CDN provider's default domain, or
http://cdn.mydomain.com/image.jpg
mydomain.com ranks highly and many images show up in Google/Bing image searches. Is there any actual data that says that using your real domain name for the CDN has benefits versus the default domain name provided by the CDN provider? On the surface, it feels like it would, but I haven't experimented with it.
-
No, I understand. Thanks for jumping in. The only reason I said it was an uglier subdomain was to express that even that didn't have an impact on impressions, traffic, or CTR for images in my case.
Note, I get 80k+ UV a month, and I spend more time monitoring traffic sources and doing date comparisons than I should. I have alerts that tell me if I lose x% of traffic on a landing page.
As you can see - a lot of GB - we get a lot of image traffic, and there has been no impact with my ugly URL, which I won't even make an effort to change.
Joseph
-
It might be an ugly URL, but the content is still on your domain. The question above is about the image being on their own domain versus on the CDN's domain.
PS: I didn't mean to hijack this question; I am also very interested to know the answer to the same question.
-
WP Engine is a wonderful host; I use them as well.
A content delivery network will have zero negative effect because of the way the code has changed on your website.
Simply use a CNAME. Or don't; it really doesn't matter.
For instance, you could have www.example.com. In your DNS, create a CNAME pointing cdn.example.com to username-wpengine.domain.com; then everything looks like cdn.example.com.
I believe WP Engine will even change that for you. However, you are worried about things that do not matter: a content delivery network will make your website much faster, and regardless of whether you think the URL is ugly or not, it does not make a difference. Nothing negative will happen; only good things happen when you use a CDN. I strongly suggest you use one on pretty much anything and not worry about the coding.
I have a lot of sites on content delivery networks, and every one of them ranked better after using a CDN than before.
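The CNAME setup described above can be sketched as a BIND-style zone-file entry. This is only an illustration using the example hostnames from this thread, not real records; your DNS provider's interface may label the fields differently.

```
; Serve CDN assets from a branded subdomain:
; cdn.example.com resolves to the hostname the CDN provider gave you.
cdn.example.com.    3600    IN    CNAME    username-wpengine.domain.com.
```

With this in place, image URLs can be written as cdn.example.com/... while the CDN provider's hostname answers the request behind the scenes.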
Sincerely,
Thomas
-
No, I recently switched to WP Engine ( http://moz.com/perks - we get 4 months free ) and they have a CDN. I haven't noticed any impact on my image results/impressions, and the URL for the image is pretty ugly, like: username-wpengine.domain.com/2013/05/image.jpg or something like that.
From what I've seen, I don't think that matters. I hope that helps.
Related Questions
-
Google not Indexing images on CDN.
My URL is: https://bit.ly/2hWAApQ We have set up a CDN on our own domain: https://bit.ly/2KspW3C We have a main XML sitemap: https://bit.ly/2rd2jEb and https://bit.ly/2JMu7GB is one of the sub-sitemaps with images listed within. The image sitemap uses the CDN URLs. We verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: https://bit.ly/2FAWJjk. Yet, GWT still reports that none of our images on the CDN are indexed. I've followed all the steps, and still none of the images are being indexed. My problem seems similar to this ticket https://bit.ly/2FzUnBl, but different because we don't have a separate image sitemap and instead have listed image URLs within the sitemaps themselves. Can anyone help please? I will promptly respond to any queries. Thanks
Technical SEO | May 2, 2018, 12:55 PM | TNZ
Deepinder
-
Image Sitemap
I currently use a program to create our sitemap (XML). It doesn't offer creating an image sitemap. Can someone suggest a program that would create an image sitemap? Thanks.
Technical SEO | Jan 23, 2024, 5:22 AM | Kdruckenbrod
-
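For reference, an image sitemap is a regular XML sitemap with image entries nested under each page URL. A minimal hand-written sketch (the example.com and cdn.example.com URLs are placeholders, not from the question) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the image appears on -->
    <loc>https://example.com/page.html</loc>
    <!-- One image:image element per image on that page -->
    <image:image>
      <image:loc>https://cdn.example.com/2013/05/image.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

A file in this shape can be generated by hand or by script and submitted alongside (or instead of) a dedicated sitemap tool's output.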
English and French under the same domain
A friend of mine runs a B&B and asked me to check his freshly built website to see if it was SEO compliant. The B&B is based in France, and he's targeting a UK and French audience. To do so, he built content in English and French under the same domain: https://www.la-besace.fr/ When I run a crawl through Screaming Frog, only the French content URLs seem to come up, and I am not sure why. Can anyone enlighten me please? To maximise his business's local visibility, my recommendation would be to build two different websites (one .fr and one .co.uk), build content in the respective language on each site, and do all the link building work in the respective country's site. Do you think this is the best approach, or should he stick with his current solution? Many thanks
Technical SEO | Apr 27, 2017, 3:37 PM | coolhandluc
-
Are images stored in Amazon S3 buckets indexable to your domain?
We're storing all our images in an S3 bucket, common practice, but we want these images to drive traffic back to our site -- and credit for that traffic. We've configured the URLs to be s3.owler.com/<image_name>/<image_id>. I've not seen any of these images show in our Webmaster Tools. I am wondering if we're actually not going to get the credit for these images because technically they do sit on another domain.
Technical SEO | May 4, 2016, 8:04 PM | mindofmiller
-
Help: domain name change and Google News
Hi. I work for a regional news source, and our (separate) Spanish-language news publication recently changed its domain name. The publication lost its Google News inclusion. Most of their traffic came from Google News, so traffic tanked. They're trying to get back in. They reapplied but didn't get approved. They're now in the 30-day waiting period to reapply again. The website is run by a third-party company, which handled the domain name change in April (2015). That company has been running their site for a couple of years. Our in-house devs' hands are tied on helping, because we (at the mother company) don't manage their site. This third party has not been responsive. The Spanish pub folks have reached out to me to help them prepare for Round 2 of reapplication. I'm the mothership in-house SEO, but I've never experienced this situation before. Because everything seems to be in order besides the ham-handed changes, my best advice to them so far is: You'll have to wait until Google gets to know you again, unfortunately. Does that sound right? Any pointers out there for bringing their best possible A-game to the next round?
Technical SEO | Jun 16, 2015, 6:13 PM | christyrobinson
-
Umlaut in domain
Hi, My client wants to expand its business to Germany, and logically we need a domain name to match. We've found a great one and registered several variants of it. However, I just found out that in Germany it is possible (while here it's not) to register a domain with an umlaut. My question is: will Google assign more value to schädlinge.de than schadlinge.de when users search for schädlinge? If yes, how large will the difference be? (I will use an umlaut in the title etc.) Kind regards,
Technical SEO | Sep 6, 2012, 2:52 PM | media-surfer
Jason.
-
What is the best method to block a sub-domain, e.g. staging.domain.com/, from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
in fear it might get the www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | Oct 6, 2011, 10:55 PM | fthead9
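As a side note on the question above: robots.txt rules are evaluated per host, so a Disallow served from staging.domain.com's robots.txt has no effect on www.domain.com. A minimal sketch with Python's standard-library urllib.robotparser (the hostnames are the example ones from the question):

```python
from urllib.robotparser import RobotFileParser

# Rules as they would be served from http://staging.domain.com/robots.txt
staging = RobotFileParser()
staging.parse(["User-agent: *", "Disallow: /"])

# The staging host's robots.txt blocks everything on that host...
print(staging.can_fetch("*", "http://staging.domain.com/page.html"))

# ...but www.domain.com serves its own robots.txt (here: empty),
# which the staging host's file cannot override.
www = RobotFileParser()
www.parse([])  # no rules: everything is allowed
print(www.can_fetch("*", "http://www.domain.com/page.html"))
```

Each host's crawler directives come only from that host's own /robots.txt, which is why the staging block cannot leak onto the www site.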