Should sitemap include https pages?
-
Hi guys,
Trying to figure out some onsite issues I've been having. Would appreciate any feedback on the following 2 questions:
My homepage (http://mysite.com) is a 301 redirect to https://mysite.com, which is under SSL. Only 2 pages of my site are https; the rest are http.
1. Should the location of my sitemap be https://mysite.com/sitemap.xml, or should it stay on http (even though the homepage redirects to https)?
2. Should my sitemap include the https pages (only 2 pages) as well as the http pages?
Thanks,
G
-
Hi Federico,
On Google's Sitemap Errors help page, they include the following information:
"You should also check that the URLs all begin with the same domain as your Sitemap location. For instance, if your Sitemap is listed under http://www.example.com/sitemap.xml, the following URLs are not valid for that Sitemap:
http://www.google.com
— it's in the google.com domain rather than the example.com domainhttp://example.com/
— it's missing the initialwww
www.example.com/
— it's missing the protocol (http), and will generate an Invalid URL warninghttps://www.example.com/
— it's using a different protocol (https
rather thanhttp
)
Any URLs in the Sitemap that are not denied are processed normally."
This leads me to understand that Google doesn't want you to put http URLs in an https sitemap, and vice versa. What makes you believe otherwise?
Hoping to get to the bottom of this - thanks for the ongoing feedback.
-
Those suggesting not to add the SSL pages to the HTTP sitemap are relying on information from back in 2007, when Google did show an error on sitemaps listing both HTTP and HTTPS pages, because they were treated as different domains. Those days are long gone. Google has evolved and can now handle sitemaps containing both HTTP and HTTPS pages just fine.
-
Thanks for the input, Federico. I've been receiving various answers to this question.
Most responses have said that we should submit 2 sitemaps: one sitemap listed under http that only includes the http pages of the site (which means we wouldn't include our homepage, since it's under https!),
and one sitemap listed on the https version that only includes the https pages (which is only 2 pages!).
To be honest, I still don't know what to do here. It's really frustrating that there's no clear-cut answer for our situation, which I can't believe is even that unique.
-
G,
It doesn't make any difference whether you serve the sitemap over HTTP or HTTPS. As for including both http and https pages within the same sitemap, that isn't a problem either (see the example sketch at the end of this reply).
The only reason I can find for creating multiple sitemaps is when content such as images or videos requires its own separate sitemap alongside the one for your HTML pages.
Does your site use PHP? If so, I suggest trying xml-sitemaps.com; it will create the full sitemap for you. If you have a dynamic site, then I suggest getting their commercial version. I've been using it for over 7 years, I think, and I always get a copy for each site I create. They also offer lots of extras in case you need them (news sitemaps, etc.).
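To illustrate the mixed-protocol point above, a single sitemap covering both versions might look like the sketch below. The two inner page paths (/about and /contact) are placeholders standing in for the regular http pages, not G's actual URLs:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- HTTPS homepage (reached via the 301 redirect) -->
  <url>
    <loc>https://mysite.com/</loc>
  </url>
  <!-- Regular HTTP pages on the rest of the site (placeholder paths) -->
  <url>
    <loc>http://mysite.com/about</loc>
  </url>
  <url>
    <loc>http://mysite.com/contact</loc>
  </url>
</urlset>
Any sitemap generator (including xml-sitemaps.com mentioned above) will produce this same basic structure; the point is simply that URLs on both protocols can sit in the same urlset.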
-
Hey Federico,
Thanks again for the insight - much appreciated.
So there's no problem with creating a sitemap that has the https homepage and then the rest of the pages in http? From reading previous Q&As on this topic, it seemed as though people felt you shouldn't have https and http pages in the same sitemap - I'm a novice here, so that's why I'm looking for advice.
Is there any reason we would need to keep two sitemaps available? As in, why wouldn't we just remove the old http sitemap (which didn't include the https homepage) and go with the sitemap that does include the https homepage?
I just wanted to make sure I understood your response before we take action.
Cheers,
-G
-
Hey G!
You can serve your sitemap on either version; that won't be a problem and won't trigger any duplicate content issue, so you are safe both ways.
As for the second question: yes, you should, unless you don't want those pages indexed (whether http or https). I think I saw your site before, and if I remember correctly you had your homepage and login script under SSL, right? Then you should definitely include your homepage in the sitemap, but you can leave the login script out, as you don't need it indexed and Google won't index it anyway.
Once you have your sitemap ready, consider including its path in your robots.txt file, like this:
User-agent: *
Sitemap: http://[your website address here]/sitemap.xml
Hope that helps!
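P.S. If you do end up keeping separate http and https sitemaps instead of one combined file, robots.txt accepts more than one Sitemap line, so you could list both. A sketch, with placeholder filenames rather than your real ones:
Sitemap: https://mysite.com/sitemap-https.xml
Sitemap: http://mysite.com/sitemap-http.xml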