I want to load my ecommerce site's XML sitemap via a CDN
-
Hello Experts.
My ecommerce site - abcd.com
My ecommerce site sitemap - abcd.com/sitemap.xml
My subdomain - xyz.abcd.com (this is a blank page, but it returns status 200 and is served from the CDN)
My ecommerce site sitemap abcd.com/sitemap.xml contains only one link, to the subdomain sitemap xyz.abcd.com/sitemap.xml
And that sitemap, xyz.abcd.com/sitemap.xml, contains all the category and product links of abcd.com
So my query is:
- Is the above configuration okay?
- In Search Console I will add a new property, xyz.abcd.com, and submit the sitemap xyz.abcd.com/sitemap.xml. Will Google then be able to report errors for my website abcd.com?
Purpose - I want to serve my XML sitemap from the CDN, which is why I created the subdomain xyz.abcd.com.
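For reference, my root abcd.com/sitemap.xml is basically a sitemap index with a single entry, something like this (simplified):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://xyz.abcd.com/sitemap.xml</loc>
  </sitemap>
</sitemapindex>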
I hope my query is clear.
Thanks!
-
Hello Micey123,
That sounds good, except you should also put the sitemap reference for xyz.abcd.com within that subdomain's robots.txt file (xyz.abcd.com/robots.txt), as each subdomain should have its own robots.txt file.
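For example, that subdomain robots.txt could be as simple as the following (illustrative only):

# https://xyz.abcd.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://xyz.abcd.com/sitemap.xml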
-
The Google Sitemap protocol enables you to provide details about your pages to search engines; sitemaps provide additional information about site pages beyond just the URLs. Typically, it is best practice to submit XML sitemaps for pages, images and videos. In the case of a CDN, there are additional steps needed to implement XML sitemap submission.
To submit Sitemaps for multiple hosts from a single host, you need to “prove” ownership of the host(s) for which URLs are being submitted in a Sitemap.
Example: To submit Sitemaps for 3 hosts:
www.host1.com with Sitemap file sitemap-host1.xml
www.host2.com with Sitemap file sitemap-host2.xml
www.host3.com with Sitemap file sitemap-host3.xml
Moreover, you want to place all three Sitemaps on a single host: www.sitemaphost.com. So the Sitemap URLs will be:
http://www.sitemaphost.com/sitemap-host1.xml
http://www.sitemaphost.com/sitemap-host2.xml
http://www.sitemaphost.com/sitemap-host3.xml
By default, this will result in a "cross submission" error, since you are trying to submit URLs for www.host1.com through a Sitemap that is hosted on www.sitemaphost.com (and the same for the other two hosts). One way to avoid the error is to prove that you own (i.e. have the authority to modify files on) www.host1.com. You can do this by modifying the robots.txt file on www.host1.com to point to the Sitemap on www.sitemaphost.com.
In this example, the robots.txt file at http://www.host1.com/robots.txt would contain the line “Sitemap: http://www.sitemaphost.com/sitemap-host1.xml”.
By modifying the robots.txt file on www.host1.com and having it point to the Sitemap on www.sitemaphost.com, you have implicitly proven that you own www.host1.com. In other words, whoever controls the robots.txt file on www.host1.com trusts the Sitemap at http://www.sitemaphost.com/sitemap-host1.xml to contain URLs for www.host1.com. The same process can and should be repeated for the other two hosts. Finally, submit the Sitemaps from www.sitemaphost.com.
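Applying that to the setup in the question (my reading of it, so treat the exact hosts as illustrative): the sitemap at xyz.abcd.com/sitemap.xml lists URLs that live on abcd.com, so the robots.txt on abcd.com is the one that should reference it, for example:

# https://abcd.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://xyz.abcd.com/sitemap.xml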
CONCLUSION
Is the above configuration okay? Yes, it will work; just remember to set it up properly in Search Console.
I believe that Google will only follow a sitemap reference like this within the same top-level domain, so subdomains such as cdn.yourdomain.com will work, but a completely different domain will not. With most if not all CDNs you can set up your own domain to work in harmony with the network, so there should be no reason to use another domain.
-
Hello Expert,
Can anyone reply to me please?
Thanks
Related Questions
-
Help Setting Up 301 Redirects from Coldfusion Site to Wordpress Site.
I have created a new website and need to redirect all of the previous pages to the new one. The old website was built in ColdFusion and the new site is built in WordPress. One of the pages I'm trying to redirect is www.norriseal.com/products.cfm to http://norrisealwellmark.com/products/. This is what I have in my .htaccess file:

<IfModule mod_rewrite.c>
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
Redirect 301 /products.cfm http://norrisealwellmark.com/products/
</IfModule>

The result of this redirect is http://norrisealwellmark.com/products.cfm. How do I prevent the .cfm from appending to the destination URL?
Technical SEO | | MarketHubb
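A minimal sketch of one common way to do the redirect described above purely with mod_rewrite, rather than mixing in mod_alias's Redirect directive (domain and path taken from the question; treat it as an untested example, not a definitive fix):

<IfModule mod_rewrite.c>
RewriteEngine On
# Send the old ColdFusion URL to the new WordPress URL with a 301
RewriteRule ^products\.cfm$ http://norrisealwellmark.com/products/ [R=301,L]
</IfModule>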
Duplicate content on job sites
Hi, I have a question regarding job boards. Many job advertisers will upload the same job description to multiple websites e.g. monster, gumtree, etc. This would therefore be viewed as duplicate content. What is the best way to handle this if we want to ensure our particular site ranks well? Thanks in advance for the help. H
Technical SEO | | HiteshP
Why is my XML generator not detecting all my URLs?
Hi Mozzers, After adding 3 new pages to example.com, when generating the XML sitemap I wasn't able to locate those 3 new URLs. This is the first time it is happening. I have checked the meta tags of these pages and they are fine. No meta robots setup! Any thoughts or ideas why this is happening? How do I fix this? Thanks!
Technical SEO | | Ideas-Money-Art
Staging site and "live" site have both been indexed by Google
While creating a site we forgot to password protect the staging site while it was being built. Now that the site has been moved to the new domain, it has come to my attention that both the staging site (site.staging.com) and the "live" site (site.com) are being indexed. What is the best way to solve this problem? I was thinking about adding a 301 redirect from the staging site to the live site via HTACCESS. Any recommendations?
Technical SEO | | melen
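A rough sketch of the host-based 301 the poster above is considering, assuming Apache with mod_rewrite and using site.staging.com / site.com as placeholder hostnames (illustrative, untested):

<IfModule mod_rewrite.c>
RewriteEngine On
# Any request arriving on the staging hostname gets 301-redirected to the live domain
RewriteCond %{HTTP_HOST} ^site\.staging\.com$ [NC]
RewriteRule ^(.*)$ https://site.com/$1 [R=301,L]
</IfModule>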
How to create a sitemap for a large site (ecommerce type) that has thousands, if not hundreds of thousands, of pages.
I know this is kind of a newbie question but I am having an amazing amount of trouble creating a sitemap for our site Bestride.com. We just did a complete redesign (look and feel, functionality, the works) and now I am trying to create a site map. Most of the generators I have used "break" after reaching some number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated! Thanks
Technical SEO | | BestRide
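For context on the question above: the sitemap protocol caps a single sitemap file at 50,000 URLs, so very large sites are normally split into several sitemap files tied together by a sitemap index, roughly like this (domain and file names are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products-2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
</sitemapindex>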
Way to spider Wordpress site
I have an old Wordpress site and I want to move it to a new server and take it off Wordpress (too many hacks). I am trying to spider the site so as to get static, non-Wordpress pages. I am having trouble doing this. When I spider the site, it changes the URLs. For instance, if the URL is www.domain.com/page/ the URL I get out of the spider is /page/index.html, and those are not the URLs in the search engine indices. There are about 2000 pages on this site, so it is not feasible to set up 301 redirects. I tried using these spidering programs: WinHTTrack Website Copier and PageNest. Does anyone know of another method of turning a Wordpress site into a non-Wordpress site?
Technical SEO | | DanCrean
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. There are two sub-domains from the CDN that are being crawled and indexed. A small number of organic search visitors have come through these two sub-domains, so the CDN-based content is out-ranking the root domain in a small number of cases. It's a huge duplicate content issue (tens of thousands of URLs being crawled) - what's the best way to prevent the crawling and indexing of a CDN like this? Exclude via robots.txt? Additionally, the use of relative canonical tags (instead of absolute) appears to be contributing to this problem as well. As I understand it, these canonical tags are telling the SEs that each sub-domain is the "home" of the content/URL. Thanks! Scott
Technical SEO | | Scott-Thomas
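Two fragments of the kind of fix the question above is weighing, with cdn1.example.com standing in for one of the CDN sub-domains (illustrative only): a robots.txt served on the CDN host that blocks crawling, and an absolute rather than relative canonical tag on the pages themselves so every copy points back at the root domain.

# robots.txt served at https://cdn1.example.com/robots.txt
User-agent: *
Disallow: /

<!-- absolute canonical in the page head, pointing at the root domain -->
<link rel="canonical" href="https://www.example.com/some-page/" />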
XML Sitemap without PHP
Is it possible to generate an XML sitemap for a site without PHP? If so, how?
Technical SEO | | jeffreytrull11
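On the question above: a sitemap is just a static XML file, so it can be written by hand or produced by any standalone crawler or generator tool with no PHP involved. A minimal hand-written example (URLs and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about/</loc>
  </url>
</urlset>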