Landing pages showing up as HTTPS when we haven't made the switch
-
Hi Moz Community,
Recently our tech team has been taking steps to switch our site from http to https. The team has reviewed all the SEO redirect requirements and we're confident about the switch, but we're not planning to roll anything out until a month from now.
However, I recently noticed a few https versions of our landing pages showing up in search. We haven't pushed any changes to production yet, so this shouldn't be happening. Not all of the landing pages are https, only a select few, and I can't see a pattern. This is messing up our GA and Search Console tracking, since we weren't expecting these pages to change and haven't fully set up https tracking yet.
HTTPS has always been supported on our site but never indexed, so it's never shown up in the search results. I looked at our current site and it appears the landing page canonicals are already pointing to their https versions; this may be the problem.
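One way to audit which scheme your canonicals point at is to pull the `rel="canonical"` tag out of each landing page's HTML. A minimal sketch (the page markup and URL here are hypothetical examples, not your actual pages):

```python
import re

def extract_canonical(html):
    """Return the href of the first rel="canonical" link tag, or None."""
    # Scan every <link ...> tag so attribute order doesn't matter.
    for tag in re.findall(r"<link\b[^>]*>", html, flags=re.IGNORECASE):
        if re.search(r'rel=["\']canonical["\']', tag, flags=re.IGNORECASE):
            m = re.search(r'href=["\']([^"\']+)["\']', tag, flags=re.IGNORECASE)
            if m:
                return m.group(1)
    return None

# A landing page served over http whose canonical points at https:
page = '<html><head><link rel="canonical" href="https://www.example.com/landing"></head></html>'
print(extract_canonical(page))  # https://www.example.com/landing
```

Running this against each landing page URL would show exactly which ones carry an https canonical, which should match the set appearing as https in search.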
Anyone have any other ideas?
-
What I would do is the following: change the rel canonical back, remove the https version from Search Console (you'll need to add the https version of the website as a property in Search Console as well), and then fetch and request reindexing of the http version (also from Search Console). So basically, help Google understand this mistake and go back to the http version. Also, check your sitemaps and make sure you aren't including any https links there. Hope this helps.
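The sitemap check above is easy to script. Here's a minimal sketch that flags any https `<loc>` entries in a sitemap file (the sitemap content below is a made-up example):

```python
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def https_urls_in_sitemap(xml_text):
    """Return every <loc> entry in a sitemap that uses the https scheme."""
    root = ET.fromstring(xml_text)
    locs = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]
    return [u for u in locs if u.startswith("https://")]

sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/</loc></url>
  <url><loc>https://www.example.com/landing</loc></url>
</urlset>"""

print(https_urls_in_sitemap(sitemap))  # ['https://www.example.com/landing']
```

Any URLs this returns before the cutover are candidates for why Google is picking up https versions early.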
-
Hi Christian,
Thanks for the reply. HTTPS rel canonicals were added to the live pages; as I suspected, this is why some are showing up in the search results. It's a problem though for GA and Search Console tracking, since we haven't made the switch server-side yet and the http pages don't currently redirect to their https versions. So we're seeing no sessions for our http versions.
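For reference, when the cutover does happen, the missing server-side piece is usually a blanket 301 from http to https. A minimal nginx sketch (the server name is a hypothetical placeholder; your server software and config will differ):

```nginx
# Permanently redirect every http request to its https equivalent.
server {
    listen 80;
    server_name www.example.com;
    return 301 https://$host$request_uri;
}
```

Until something like this is live, http and https remain two separately crawlable versions of every page, which is why the canonical alone is deciding what gets indexed.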
If I change the rel=canonical back to http on the live site, I'm guessing the non-secure pages will show up again after being recrawled?
Thanks!
-
Hi! I don't quite understand the question. Is it that you added an https rel canonical to live pages and are wondering why the https versions are indexed? If so, this is normal behavior, since your website already supports https and you've pointed Google at it. The reason only a few landing pages show up as https so far is probably related to how and when the crawler reached them. I hope I didn't totally misunderstand the question.