Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies. More details here.
Why does Google stubbornly keep indexing my HTTP URLs instead of the HTTPS ones?
-
I moved everything to HTTPS in November, but there are plenty of pages which are still indexed by Google as HTTP instead of HTTPS, and I am wondering why.
Example: http://www.gomme-auto.it/pneumatici/barum correctly redirects permanently to https://www.gomme-auto.it/pneumatici/barum
Nevertheless, if you search for pneumatici barum: https://www.google.it/search?q=pneumatici+barum&oq=pneumatici+barum
The third organic result listed is still HTTP.
Since we moved to HTTPS, Googlebot has visited that page tens of times, most recently two days ago, but it doesn't seem to care to update the protocol in Google's index.
Does anyone know why?
My concern is that when I use APIs like SEMrush and Ahrefs I have to query everything twice, once for HTTP and once for HTTPS; with around 65k URLs that wastes a lot of my quota.
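One way to avoid burning quota twice per page is to collapse each HTTP/HTTPS pair to a single canonical HTTPS URL before querying the APIs. A minimal sketch in Python (the helper names and example list are illustrative, not part of any API):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_https(url: str) -> str:
    """Rewrite an http:// URL to its https:// twin; https URLs pass through unchanged."""
    s = urlsplit(url)
    return urlunsplit(("https", s.netloc, s.path, s.query, s.fragment))

def dedupe(urls):
    """Collapse http/https duplicates into one https entry, preserving order."""
    seen, out = set(), []
    for u in urls:
        c = canonical_https(u)
        if c not in seen:
            seen.add(c)
            out.append(c)
    return out

urls = [
    "http://www.gomme-auto.it/pneumatici/barum",
    "https://www.gomme-auto.it/pneumatici/barum",
]
print(dedupe(urls))
# → ['https://www.gomme-auto.it/pneumatici/barum']
```

Feeding the deduped list to the APIs means each page is queried once instead of twice.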
-
Thanks again, Dirk! In the end I used Xenu Link Sleuth and I am happy with the result.
-
Hi Massimiliano,
In Screaming Frog there is the option Bulk Export > All Inlinks, which generates the full list of all your internal links with both source and destination. In Excel you just have to put a filter on the "Destination" column to show only the URLs starting with "http://" and you get all the info you need. This will probably not solve the issues with the images; for those, the next solution below can be used.
The list can be quite long depending on the total number of URLs on your site. An alternative would be to add a custom filter under Configuration > Custom, only including URLs that contain "http://www.gomme-auto.it" or "http://blog.gomme-auto.it" in the source. In your case this wouldn't be very helpful, though, as all the pages on your site contain this URL in the JavaScript part. If you change the URLs in the JavaScript to HTTPS, this could be used to find references to non-HTTPS images.
If you want to do it manually, that's also an option: in the "Internal" view of the crawler, put "http://" in the search field; this shows you the list of all the http:// URLs. You then have to select the HTTP URLs one by one. For each URL you can select "Inlinks" at the bottom of the screen and see all the URLs linking to the HTTP version. This works for both the HTML and the images.
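The Excel filtering step can also be scripted. A minimal sketch, assuming the All Inlinks export is a CSV with "Source" and "Destination" columns (adjust the header names if your Screaming Frog version uses different ones):

```python
import csv
import io
from collections import defaultdict

def http_inlinks(fh):
    """Group http:// destinations by the source pages linking to them."""
    pages = defaultdict(set)
    for row in csv.DictReader(fh):
        if row["Destination"].startswith("http://"):
            pages[row["Destination"]].add(row["Source"])
    return pages

# Demo with an inline two-row export; with a real file use:
#   with open("all_inlinks.csv", newline="", encoding="utf-8") as fh:
#       report = http_inlinks(fh)
sample = io.StringIO(
    "Source,Destination\n"
    "https://www.gomme-auto.it/pneumatici/continental,http://www.gomme-auto.it/pneumatici/barum\n"
    "https://www.gomme-auto.it/pneumatici/continental,https://www.gomme-auto.it/pneumatici/pirelli\n"
)
for dest, sources in http_inlinks(sample).items():
    print(dest, "<-", sorted(sources))
```

This gives you, for each lingering http:// destination, exactly which pages still need their links edited.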
Hope this helps,
rgds
Dirk
-
Forgot to mention: yes, I checked the scheme of the SERP results for those pages. It's not just Google not displaying it; it really does still have the HTTP version indexed.
-
Hi DC,
In Screaming Frog I can see the old HTTP links. Usually they are manually inserted links and images in WordPress posts, and I am more than eager to edit them; my problem is how to find all the pages containing them. In Screaming Frog I can see the links, but I don't see the referrer, i.e. which page contains them. Is there a way to see that in Screaming Frog, or in some other crawling software?
-
Hi,
First of all, are you sure that Google didn't take the migration into account? I just did a quick check on other HTTPS sites. Example: when I look for "Google Analytics" in Google, the first 3 results all point to the Google Analytics site, yet the https is shown only for the 3rd result, even though all three are HTTPS. So it's possible it is just a display issue rather than a real issue.
Second, I did a quick crawl of your site and I noticed that on some pages you still have links to the HTTP version of your site (they are redirected, but it's better to keep your internal links clean, without redirects).
When I checked one of these pages (https://www.gomme-auto.it/pneumatici/pneumatici-cinesi) I noticed that it has some issues, as it seems to load elements which are not HTTPS; possibly there are others as well.
example: /pneumatici/pneumatici-cinesi:1395 Mixed Content: The page at 'https://www.gomme-auto.it/pneumatici/pneumatici-cinesi' was loaded over HTTPS, but requested an insecure image 'http://www.gomme-auto.it/i/pneumatici-cinesi.jpg'. This content should also be served over HTTPS.
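As a rough automated check for this kind of mixed content, the page source can be scanned for http:// references in resource attributes. A minimal standard-library sketch (it only inspects src/href attributes in the static HTML, so it won't catch resources injected by JavaScript):

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collect http:// resource references that trigger mixed-content warnings on an https page."""
    RESOURCE_TAGS = {"img", "script", "link", "iframe", "source", "audio", "video"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        if tag not in self.RESOURCE_TAGS:
            return
        for name, value in attrs:
            if name in ("src", "href") and value and value.startswith("http://"):
                self.insecure.append((tag, value))

scanner = MixedContentScanner()
scanner.feed(
    '<img src="http://www.gomme-auto.it/i/pneumatici-cinesi.jpg">'
    '<img src="https://www.gomme-auto.it/i/ok.jpg">'
)
print(scanner.insecure)
# → [('img', 'http://www.gomme-auto.it/i/pneumatici-cinesi.jpg')]
```

Running this over each crawled page's HTML flags the insecure images before the browser console has to.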
The page you mention as an example: the HTTP version still receives two internal links, from https://www.gomme-auto.it/blog/pneumatici-barum-gli-economici-che-assicurano-ottime-prestazioni and https://www.gomme-auto.it/pneumatici/continental, with anchor texts 'pneumatici Barmum' and 'Barum'.
I guess Google reasons: if the owner of the site isn't updating his internal links, I'm not going to update my index.
On all your pages there is a part of the source which contains calls to the HTTP version. It's inside a script, so I'm not sure if it really matters, but you could try changing it to HTTPS as well.
My advice would be to crawl your site with Screaming Frog, check where links to HTTP versions exist, and update these links to HTTPS (or use relative links, which is advised by Google: https://support.google.com/webmasters/answer/6073543?hl=en, see the part 'common pitfalls').
rgds
Dirk
-
Mhhh, you are right, theoretically it could be the crawl budget. But if that were the case I should see it in the logs: I should be missing crawler visits to those pages. Instead the crawler is happily visiting them.
By the way, how would you "force" the crawler to parse these pages?
I am going to check the sitemap now to remove that port number and try to split them. Thanks.
-
Darn it, you are right, we added a new site, not a change of address, sorry about that. Apparently my coffee is no longer effective!
-
As far as I know, the change of address doesn't work for HTTP to HTTPS; the protocol is not accepted when you do a change of address. And somewhere I read Google itself saying that when moving to HTTPS you should not do a change of address.
But they suggest adding a new site for the HTTPS version in GWT, which I did, and in fact the traffic slowly transitioned from the HTTP site to the HTTPS site in GWT in the weeks following the move.
-
Are you sure? On https://support.google.com/webmasters/answer/6033080?hl=en&ref_topic=6033084 it says: "No need to submit a change of address if you are only moving your site from HTTP to HTTPS."
I don't think you are given the option to select the same domain for a change of address in GWT.
-
Looks like you are doing everything right (set up 301 redirects, updated all links on the site, updated canonical URLs); you just need to get the crawlers to parse those pages more. Perhaps the crawler is hitting its budget before it gets to recrawl all of your new URLs?
You should also update your sitemap, as it contains a bunch of links that look like this: https://www.gomme-auto.it:443/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72
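Stripping the redundant default port is a simple normalization step when regenerating the sitemap; a minimal sketch (:443 is the default for HTTPS and :80 for HTTP, so both can be dropped safely):

```python
from urllib.parse import urlsplit, urlunsplit

def strip_default_port(url: str) -> str:
    """Remove an explicit :443 (https) or :80 (http) port; leave other ports alone."""
    parts = urlsplit(url)
    default = {"https": 443, "http": 80}.get(parts.scheme)
    if parts.port == default:
        host = parts.hostname or ""
        return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))
    return url

print(strip_default_port(
    "https://www.gomme-auto.it:443/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72"
))
# → https://www.gomme-auto.it/pneumatici/estivi/pirelli/cinturato-p1-verde/145/65/15/h/72
```

Applied to every <loc> entry before the sitemap is written, this keeps the submitted URLs identical to the canonical ones.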
I recommend creating several sitemaps for different sections of the site and seeing how they are indexed via GWT.
-
Did you do a change of address in Google Webmaster Tools? HTTP and HTTPS are considered different URLs, and you will have to do a change of address if you switched to a fully HTTPS site.