Drupal, http/https, canonicals and Google Search Console
-
I’m fairly new in an in-house role and am currently rooting around our Drupal website to improve it as a whole. Right now on my radar is our use of http / https, canonicals, and our use of Google Search Console. Initial issues noticed:
- We serve http and https versions of all our pages
- Our canonical tags just refer back to the URL they sit on (apparently a default Drupal thing, which is not much use)
- We don’t actually have https properties added in Search Console/GA
I’ve spoken with the IT agency that migrated our old site to the current one, and they’ve recommended forcing all pages to https and pointing all canonicals at the https pages. That’s fine in theory, but I don’t think it’s quite as simple as that, right? An old Moz post I found talked about running into issues with images/CSS/JavaScript referencing http – is there anything else to consider, especially from an SEO perspective?
I’m assuming that the appropriate certificates are in place, as the secure version of the site works perfectly well.
And on the last point – am I safe to assume we have just never tracked any traffic for the secure version of the site?
Thanks
John
-
OK, I gotcha now. You can submit the sitemap in all versions of Search Console; it won't hurt anything to have it referenced in multiple profiles of SC.
Another thing you can do to make sure crawlers find your XML is add this line to your robots.txt file:
Sitemap: http://yoursite.com/sitemap.xml
-
Thanks so much, this is so helpful!
About the search console question, I may have confused you. This is what I mean: I have a www and non-www property of the website in Search Console (from before my time), which looks like this:
| Property | Sitemap |
| http://www.mysite.com | http://www.mysite.com/sitemap.xml |
| http://mysite.com | NO SITEMAP LINKED |
(apologies that has not formatted well, I hope you can decipher!)
So there's a sitemap linked to the www property and nothing on the non-www one. The sitemap file itself is located on the non-www version of the site, so I was wondering whether the above scenario has essentially meant we've had no sitemap submissions to date. That said, the sitemap appears to be pulling through despite being at the "wrong" address, so I can only think there are either two separate sitemap files, or the redirect we have set from www to non-www is having an effect?
-
Hi John, always glad to help!
For your Search Console question: Once you get the redirects set up and have committed to your site being all HTTPS, you'll want to move your XML sitemap to https://yoursite.com/sitemap.xml. As Cyrus mentions in that article, don't update the URLs inside the sitemap yet; let search engines hit them as non-secure for a while (I think he recommends 30 days) so they have a chance to learn your new protocol and hit your redirects multiple times.
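To make that concrete, here's a minimal sketch of what the transition-window sitemap might look like (yoursite.com and the page paths are placeholders, not from the thread): the file itself lives at the new https:// address, but the `<loc>` entries still list the http:// URLs so crawlers keep discovering the 301s.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Served from https://yoursite.com/sitemap.xml -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- During the transition window, still list the http:// URLs so
       crawlers hit the 301 redirects; switch these entries to
       https:// once the migration has settled. -->
  <url>
    <loc>http://yoursite.com/</loc>
  </url>
  <url>
    <loc>http://yoursite.com/example-page/</loc>
  </url>
</urlset>
```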
For your www question: There's no difference in SEO value between www and non-www; it's simply a preference. The only thing that matters here is that you pick one and stick with it.
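For illustration (mysite.com and the path are placeholders): whichever form you pick, the self-referring canonical on every page should use that exact form, e.g. non-www over https:

```html
<!-- On https://mysite.com/some-page/ (and on any www/http variant
     that hasn't been redirected yet), point at the one preferred form: -->
<link rel="canonical" href="https://mysite.com/some-page/" />
```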
For your GA question: That is correct; you are seeing traffic from both in GA. GA will collect and report on any page/URL/website that your UA-ID is on. If someone scraped your site and took the GA script with it, you'd start seeing their traffic in your reporting view (that's why appending the hostname is always a good idea). You can specify your protocol in the View Settings of GA.
-
Hi Logan,
Thanks for your quick response, that’s very helpful and the article you provided is great.
I hadn’t thought of the purpose of self-referring canonicals, thanks for clarifying.
Re: Search Console: I’ve just noticed we only have a sitemap linked for the http://www property. Currently, all www traffic is redirected to the non-www version of any given page (forgetting https for a second). Is this an issue in terms of PageRank?
And my last question, I promise! If our UA tag is firing on both the http and https versions of the site, should we be seeing traffic from both in GA, even if the property/view default URL is set to http://? By my understanding, that setting is just a vanity thing for reporting purposes, but I’m not sure where, if anywhere, I need to specify in a particular view that http:// and https:// traffic should be treated as the same thing?
-
Hi John,
For the most part, your IT partner is correct: two of the most important steps are to 301 all HTTP requests to HTTPS and to update your canonicals. I often refer people with questions about HTTPS to this post by Cyrus Shepard, which covers all the bases needed for an SEO-friendly secure migration: https://moz.com/blog/seo-tips-https-ssl.
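Once the redirects are live, it's worth spot-checking a handful of URLs. Here's a minimal sketch (illustrative, not from the post; `yoursite.com` is a placeholder) of the check to run per URL: a clean migration means every http:// URL answers with a single 301 pointing straight at its exact https:// twin.

```python
def is_clean_https_redirect(original_url: str, status: int, location: str) -> bool:
    """Check one redirect hop: an http:// URL should answer with a
    301 whose Location header is the exact https:// equivalent
    (no 302s, no chains, no host or path changes)."""
    if not original_url.startswith("http://"):
        return False
    expected = "https://" + original_url[len("http://"):]
    return status == 301 and location == expected

# In practice you'd feed this with real responses, e.g. (assuming the
# third-party `requests` library):
#   r = requests.get(url, allow_redirects=False)
#   ok = is_clean_https_redirect(url, r.status_code, r.headers.get("Location", ""))

# Illustrative cases:
print(is_clean_https_redirect("http://yoursite.com/page", 301, "https://yoursite.com/page"))  # clean
print(is_clean_https_redirect("http://yoursite.com/page", 302, "https://yoursite.com/page"))  # temporary, not 301
print(is_clean_https_redirect("http://yoursite.com/page", 301, "https://yoursite.com/"))      # host/path changed
```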
Regarding your specific comments:
- We serve http and https versions of all our pages - A 301 redirect rule will correct this
- Our canonical tags just refer back to the URL they sit on (apparently a default Drupal thing, which is not much use) - Self-referring canonicals like this serve plenty of purpose; they just need to match your preferred version (www/non-www, http/https, etc.). Self-referring canonicals help prevent duplicates caused by parameters, case-sensitive URLs, and the aforementioned HTTP/S and www/non-www variations.
- We don’t actually have https properties added in Search Console/GA - You should add another profile for HTTPS; verification should be simple since you've already proven you're the site owner. You want to have both profiles in GSC so you can monitor the shift of indexed URLs from HTTP to HTTPS. It's also good for troubleshooting should you ever see an issue with HTTP indexing down the line.
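For reference, the 301 rule itself is a small piece of server config. A common Apache (.htaccess) sketch is below; Apache is an assumption on my part since Drupal often runs on it, so if you're on Nginx or behind a load balancer the equivalent rule lives there instead:

```apache
# Force every http:// request to its https:// equivalent with a
# single permanent (301) redirect, preserving host and path.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Drupal's default .htaccess ships with commented-out rewrite examples you can adapt; either way, test a handful of deep URLs afterwards to make sure nothing chains through multiple redirect hops.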