Which sitemap to keep: HTTP or HTTPS (or both)?
Hi,
Just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor).
Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from HTTP to HTTPS.
Changed the property in Google Analytics from HTTP to HTTPS and added the HTTPS version in Webmaster Tools. So far, so good.
Now the question is: should I add the HTTPS version of the sitemap to the new HTTPS site in Webmaster Tools, or retain the existing HTTP one? Ideally, switching over completely to the HTTPS version by adding a new sitemap would make more sense, as the HTTP version of the sitemap would now just redirect to HTTPS anyway.
But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the HTTPS sitemap version to the new site, should I delete the old HTTP one, or is there no harm in retaining it?
Hi Ashish,
Add the new version and delete the old. Retaining the old sitemap shouldn't flag you for duplicate content; however, it doesn't make sense to tell Google to crawl those pages if they are just going to redirect to the new pages that are in the new sitemap. That's extra work for Googlebot.
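For reference, the HTTPS sitemap is simply the same file with the secure URLs listed; a minimal sketch (example.com stands in for your own domain):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>

Submit that file under the HTTPS property in Webmaster Tools, and remove the old sitemap from the HTTP property once the new one is picked up.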
Hope this helps!
Related Questions
GoogleBot still crawling HTTP/1.1 years after website moved to HTTP/2
Whole website moved to the https://www. HTTP/2 version 3 years ago. When we review log files, it is clear that, for the home page, Googlebot continues to access only via the HTTP/1.1 protocol.
- Robots file is correct (simply allowing all and referring to the https://www. sitemap)
- Sitemap is referencing https://www. pages, including the homepage
- Hosting provider has confirmed the server is correctly configured to support HTTP/2 and provided evidence of access via HTTP/2 working
- 301 redirects are set up for the non-secure and non-www versions of the website, all to the https://www. version
- Not using a CDN or proxy
- GSC reports the home page as correctly indexed (with the https://www. version canonicalised), but it does still have the non-secure version of the website as the referring page in the Discovery section. GSC also reports the homepage as being crawled every day or so.
Totally understand it can take time to update the index, but we are at a complete loss to understand why Googlebot continues to go only through HTTP/1.1, not HTTP/2 (a quick protocol check with curl is sketched below this question). A possibly related issue, and of course what is causing concern, is that new pages of the site seem to index and perform well in the SERPs... except the home page. This never makes it to page 1 (other than for the brand name) despite rating multiples higher in terms of content, speed etc. than other pages, which still get indexed in preference to the home page. Any thoughts, further tests, ideas, direction or anything will be much appreciated!
Technical SEO | AKCAC1
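One quick way to confirm from the outside whether the server will actually negotiate HTTP/2 (a sketch using curl against a placeholder URL; it needs a curl build with HTTP/2 support):

curl -sI --http2 -o /dev/null -w "negotiated HTTP version: %{http_version}\n" https://www.example.com/

If that prints 2, the server side is fine. For what it's worth, Google has said that crawling over HTTP/2 is applied selectively and is not a ranking signal, so HTTP/1.1 crawling by itself shouldn't explain the home page's rankings.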
Move a WordPress Site to HTTPS with Bluehost
Hi guys, do you think that the following guide is enough to move a Bluehost WordPress site to HTTPS in an SEO best-practice way? https://www.shoutmeloud.com/free-ssl-certificate-bluehost-hosting.html Basically, their steps are:
- Install SSL on the Bluehost panel
- Install the Really Simple SSL WP plugin
- Edit your .htaccess file & add the code for HTTP to HTTPS redirection
- Update all HTTP URLs in the database to HTTPS using a search-and-replace plugin (a WP-CLI alternative is sketched below this question)
- Use the Broken Link Checker plugin & use its redirection module to find links to 3rd-party sites with HTTP that should now be HTTPS
- Last thing to do: submit your new HTTPS site to Google Search Console & submit your sitemap. Update your profile link on Google Analytics. Update your website links on social media profiles & anywhere else they exist. This step you can do in pieces in the coming days. Read this guide to learn more about HTTP to HTTPS migration & fixing mixed content. If you disabled Who.Is guard for your domain name, you can enable it now.
Do you know a better practical guide for WordPress, in terms of useful plugins to handle the migration? Thanks to everyone!
Technical SEO | Dreamrealemedia0
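As an alternative to the search-and-replace plugin in step 4, WP-CLI can do the same database update (a sketch with a placeholder domain; run the --dry-run pass first and take a database backup):

wp search-replace 'http://www.example.com' 'https://www.example.com' --skip-columns=guid --dry-run
wp search-replace 'http://www.example.com' 'https://www.example.com' --skip-columns=guid

The --skip-columns=guid flag leaves post GUIDs untouched, which is the usual recommendation when changing a WordPress site's URL scheme.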
How to change a 302 redirect from HTTP to HTTPS
Hi gang. Our site currently has a 302 redirect from the HTTP version of the homepage to the HTTPS version of the homepage. I understand this really should be changed to a 301 redirect, but I'm having a little trouble figuring out exactly how this should be done. Some places on the internet are telling me I can edit our .htaccess file to specify the type of redirect; however, our .htaccess file seems to be missing some of the information in theirs. Can anyone tell me what needs to be changed in the .htaccess file, or if there's a simpler way to change the 302 to a 301? (A possible rule is sketched below this question.) Many thanks 🙂 Our .htaccess:

# BEGIN WordPress
RewriteEngine On
RewriteBase /
RewriteRule ^index.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
# END WordPress

# EXPIRES CACHING
ExpiresActive On
ExpiresByType image/jpg "access plus 6 months"
ExpiresByType image/jpeg "access plus 6 months"
ExpiresByType image/gif "access plus 6 months"
ExpiresByType image/png "access plus 6 months"
ExpiresByType text/css "access plus 10 days"
ExpiresByType application/pdf "access plus 10 days"
ExpiresByType application/x-shockwave-flash "access plus 10 days"
ExpiresByType image/x-icon "access plus 6 months"
ExpiresDefault "access plus 2 days"
# EXPIRES CACHING
Technical SEO | davedon0
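One common approach is to add a rule like this above the # BEGIN WordPress block (a sketch only; it assumes mod_rewrite, and if the existing 302 is actually coming from a plugin or the host's own config rather than this file, it would need changing there instead):

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]

The R=301 flag is what makes the redirect permanent; without an explicit status code, a redirecting RewriteRule defaults to a 302.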
XML Sitemap and unwanted URL parameters
We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it up would require time and effort which I currently don't have. I also think that having one could help us on Bing. So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong? (A small clean-up script is sketched below this question.) Thanks!
Technical SEO | jfmonfette0
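If the ref= parameter is the only issue, stripping it from the generated file is a fairly small job; a rough sketch in Python (the file name sitemap.xml and the parameter name ref are assumptions taken from the question):

# Remove the ref= tracking parameter from every <loc> URL in an XML sitemap.
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # keep the default sitemap namespace on output

tree = ET.parse("sitemap.xml")
for loc in tree.getroot().iter(f"{{{NS}}}loc"):
    parts = urlsplit(loc.text.strip())
    # keep every query parameter except the tracking one
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True) if k != "ref"]
    loc.text = urlunsplit(parts._replace(query=urlencode(query)))

tree.write("sitemap-clean.xml", xml_declaration=True, encoding="UTF-8")

Note that two entries which differed only by their ref= value become duplicates after this, so it's worth de-duplicating the cleaned file before submitting it.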
My Rankings Keep Going Down - Need some more ideas on why...
Hi everyone, I have been working on the website www.PetsSpark.com for a while now. I used to be on the first page for many of my keywords; now I am lucky if I am on the 2nd or 3rd. I stopped my SEO efforts for about 3 months while making changes to the website, and it's within these 3 months that I started dropping from the first page. The biggest thing I notice is the "type" of website now ranking for my main keywords: mostly forums, blogs, or product review websites. Take, for instance, Dog Tear Stain Remover. I used to rank at about #5 for this keyword and now I rank at #20. I'm not sure if my loss of ranking is down to a combination of things, like the kind of sites Google is letting rank now, or if there is something wrong on my website, etc. Can anyone give me a little insight? Please... I will also be happy to give more information about what has been going on if needed.
Technical SEO | DTOSI0
Hosting sitemap on another server
I was looking into XML sitemap generators, and one that seems to be recommended quite a bit on the forums is xml-sitemaps.com. They have a few versions, though. I'll need more than 500 pages indexed, so it is just a case of whether I go for their paid version and install it on our server, or go for their pro-sitemaps.com offering. For pro-sitemaps.com they say: "We host your sitemap files on our server and ping search engines automatically." My question is: will this be less effective from an SEO perspective than installing it on our server, because the sitemap is no longer on our root domain? (A robots.txt sketch for a cross-hosted sitemap follows below.)
Technical SEO | design_man0
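For what it's worth, the sitemaps.org protocol does allow a sitemap to live on a different host, provided the robots.txt of the domain the sitemap covers points to it; a sketch with placeholder hostnames:

# robots.txt served from https://www.example.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://sitemaps.example-host.net/sitemap-for-example-com.xml

With that reference in place, the remotely hosted file is treated as belonging to example.com, so the hosting location itself shouldn't change how the URLs in it are handled.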
Keep the blog separate or incorporate into main domain?
So my organization currently has both a main site and, on a separate domain and separate host, a WordPress blog (our own domain, not a wordpress.com). The content posted on this blog is local, community-driven, and related to our business, but it is not used in any way as a "sales" tool. It's more for interaction with members and employees. This blog has a lot of content and is updated with new posts very often (it generates traffic from a pretty wide variety of searches, some related, some not). My plan has been to 301 the old domain and move the WordPress blog over to our root domain in a subdirectory such as oursite.com/blog. Does anyone have tips for moving a blog over like this? I'm concerned about any link juice it has dropping off, since it does currently provide some links to our root site (it's basically a separate site). Basically, I'm wondering if it'll be worth the effort or if I should just keep it separate and focus on other content-generation strategies. (A redirect sketch follows below this question.)
Technical SEO | SCFederal0
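If you do fold the blog in, the usual approach is a one-to-one 301 from every old blog URL to its new home under /blog/; a sketch for the old domain's .htaccess (old-blog-domain.com is a placeholder, mod_rewrite assumed):

RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-blog-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.oursite.com/blog/$1 [R=301,L]

Keeping the permalink slugs identical on the new site keeps the mapping one-to-one, which is what preserves most of the existing link equity.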
Is "last modified" time in XML Sitemaps important?
My tech lead is concerned that his use of a script to generate XML sitemaps for some client sites may be causing negative issues for those sites. His concern centers around the fact that the script generates a sitemap which indicates that every URL in the site was last modified at the exact same date and time. I have never heard anything to indicate that this might be a problem, but I do know that the sitemaps I generate for other client sites can use the server response for the modified time or not. What is the best way to generate the sitemap: lastmod from the actual time modified, or all set at one date and time? (A short example follows below.)
Technical SEO | ShaMenz0
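For reference, lastmod is an optional field in the sitemaps.org protocol, and a value that is identical for every URL carries no useful information, so search engines will most likely just ignore it rather than treat it as harmful. If the script can be changed, the more useful output reflects each URL's real modification date, roughly like this (placeholder URLs and dates):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2014-02-03</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/contact/</loc>
    <lastmod>2013-11-18</lastmod>
  </url>
</urlset>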