Do I need to use a trailing slash on the homepage URL in canonical and hreflang tags?
-
Currently I have a 301 redirect from
https://www.mysite.com/
to
https://www.mysite.com
And in my canonical and hreflang tags, and also in internal site links, I consistently use https://www.mysite.com without a trailing slash. Is this OK? Or do I need to add a trailing slash?
-
Hi there,
Your intention is for https://www.mysite.com to be the principal page, so you've done everything just fine. What's more, you've used https://www.mysite.com consistently, which is exactly right in my opinion. Hope it helps.
GR
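As a concrete illustration of the consistency described in the answer above, here is a minimal, hypothetical sketch of homepage head markup (the hreflang values are placeholders, not taken from the asker's actual site):

```html
<!-- Canonical and hreflang all reference the same URL form, with no trailing slash -->
<link rel="canonical" href="https://www.mysite.com" />
<link rel="alternate" hreflang="en" href="https://www.mysite.com" />
<link rel="alternate" hreflang="x-default" href="https://www.mysite.com" />
```

Note that for a bare domain the two forms are equivalent anyway (browsers and crawlers normalise https://www.mysite.com to https://www.mysite.com/); the slash consistency matters most for deeper paths.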
Related Questions
-
Breadcrumbs markup need to be fixed
Hi there, Google is up to mass spamming; the latest one refers to an Enhancements > Breadcrumbs report. The message is: "...Google systems show that your site is affected by 24 instances of Breadcrumbs markup issues. This means that your Breadcrumbs pages might not appear as rich results in Google Search. Search Console has created a new report just for this rich result type..." I've used their Structured Data Testing Tool and no errors were highlighted. Can anyone fathom out what they're referring to, please?
Intermediate & Advanced SEO | jasongmcmahon
-
Do I need to worry about sub-domains?
Hi Moz community, Our website's rankings were good but have dropped over the past couple of months. We have around 10 sub-domains, and I suspect they may be hurting us. It's said all over the SEO industry that sub-domains are treated as completely different websites; will they hurt us if they are not well optimised? We also have many links from our sub-domains to the website's top pages; is this wrong in Google's eyes? How do we maintain the sub-domains well? Do I need to worry about them? Thanks
Intermediate & Advanced SEO | vtmoz
-
Only homepage is ranking after site re-launch
We've been moving all our sites over to a new platform (Demandware) this year. In the process, they've all gotten updated designs (from the same template), on-page optimizations, etc. Since they're all on the same platform and are essentially copies from one template, any technical issues found have been fixed across all sites. The problem I'm seeing is there are a few sites that haven't really seen much/any recovery from the site launch, and these are sites that were done 4-5 months ago. There's one in particular that's especially concerning, since it's showing issues that none of the other sites seem to have. In my Moz reports, it looks like of all the keywords that are ranking, they're only ranking the https version of the homepage (and from what I'm seeing, the https version wasn't picked up and ranked until the beginning of October, which was also the time that WMT shows a huge drop in clicks and impressions). I've crawled the site (ScreamingFrog), done a site search in Google (all pages look to be indexed), etc. and I haven't come across any specific problems there that would suggest a technical issue. We're wondering if it might be a link authority problem, since this site had the most dramatic change in navigation. The navigation used to be product based (Boots, Shoes, etc.) and is now broken up by gender. I've noticed that a few other pages that are ranking are dual gender pages that also existed on the old site, whereas all of these new categories aren't ranking at all and I'm not seeing this happen with any of our other sites. I've gone down a bunch of different paths trying to figure this out, but I haven't come up with any concrete answers as to why this is happening and how to fix it. Any thoughts as to what else I can look into or try for this?
Intermediate & Advanced SEO | WWWSEO
-
Site Redesign Inconsistent Trailing Slash Issue
I'm looking at a site that has implemented trailing slashes inconsistently across multiple pages. For instance:
Intermediate & Advanced SEO | GrouchyKids
http://www.examplesite.co.uk/ (WITH)
http://www.examplesite.co.uk/product-range (WITHOUT)
http://www.examplesite.co.uk/product (WITHOUT)
http://www.examplesite.co.uk/blog/ (WITH)
http://www.examplesite.co.uk/blog/blog-article/ (WITH)
The blog was created later in Wordpress, which is one of the reasons why this issue exists. Looking at the inbound links, unsurprisingly the lion's share go to the home page, but lots of other pages have links as well, particularly the product pages; not so many to the blog pages. This pattern is similar in terms of which pages rank: the home page ranks well for a variety of phrases, and the product pages also do quite well. I know that ideally the URLs should be identical on the new site, or failing that you should 301 redirect old to new. The client wants to switch the whole site over to Wordpress, which will by default implement a consistent URL structure across the board, thereby changing at least some of the URLs no matter what I do. I remember a Matt Cutts video that stated that even a 301 redirect will lose a click's worth of link juice, see: https://www.youtube.com/watch?v=Filv4pP-1nw The existing site has a poor UX compared to the new proposed design, so this should help us. Has anyone got any experience with a similar issue or any advice about how best to proceed?
-
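As one illustrative (hypothetical, untested against this site) way to standardise on trailing slashes at the redirect level in Apache, assuming mod_rewrite is available:

```apache
# 301-redirect any URL missing a trailing slash to the slashed form,
# skipping requests for real files (images, CSS, etc.).
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

Whichever form is chosen, the canonical tags and internal links should be updated to match it, so the 301s act as a safety net rather than the normal path.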
Using disavow tool for 404s
Hey Community, Got a question about the disavow tool for you. My site is getting thousands of 404 errors from old blog/coupon/you-name-it sites linking to our old URL structure (which used underscores and ended in .jsp). It seems like the webmasters of these sites aren't answering back or haven't updated their sites in ages, so the links return 404 errors. If I disavow these domains and/or links, will it clear out these 404 errors in Google? I read the GWT help page on it, but it didn't seem to answer this question. Feel free to ask any questions that may help you understand the issue more. Thanks for your help,
-Reed
Intermediate & Advanced SEO | IceIcebaby
-
HTTP Header Canonical Tags
I want to be able to add canonical tags to the HTTP headers of individual URLs using .htaccess, but I can't find any examples of how to do this. The only example I found was one specifying a file: http://www.seomoz.org/blog/how-to-advanced-relcanonical-http-headers N.B. It's not possible to add regular canonical tags to the <head> of my pages as they're dynamically generated. I was trying to add the following to the .htaccess in order to add a canonical tag in the header of the page http://frugal-father.com/is-finance-in-the-uk-too-london-centric/, but I've checked with Live HTTP Headers and the canonical line isn't showing: <Files "is-finance-in-the-uk-too-london-centric">Header add Link '<http://frugal-father.com/>; rel="canonical"'</Files> Any ideas?
Intermediate & Advanced SEO | AndrewAkesson
-
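For what it's worth, a hedged .htaccess sketch of the <Files> approach from the question above (mod_headers must be enabled; the filename and canonical target are just the question's example values, and quoting is the usual stumbling block):

```apache
# Requires mod_headers. Single quotes around the value keep the inner
# double quotes in rel="canonical" intact.
<Files "is-finance-in-the-uk-too-london-centric">
    Header add Link '<http://frugal-father.com/>; rel="canonical"'
</Files>
```

If the URL is dynamically generated rather than served from a real file, <Files> won't match it, and a conditional Header directive keyed off an environment variable set by mod_rewrite may be needed instead.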
Using a 302 instead of a 301
I am trying to figure out the best way to garner the most link value. We have an app that lives on a sub-domain. For the purposes of this question, let's call it app.mydomain.com. We provide a service with this app that requires clients (with very high-ranking websites) to link into the app located on the sub-domain. Would I garner more authority if I had the high-ranking client websites link into a URL that wasn't a sub-domain and redirect it using a 302? For example: what if I created a 302 at www.mydomain.com/app and had it redirect to the sub-domain version at app.mydomain.com? Additionally, am I correct to assume that a 301 would merely pass that value to the sub-domain and NOT provide much value to the root?
Intermediate & Advanced SEO | NextGenEDU
-
Use rel=canonical to save otherwise squandered link juice?
Oftentimes my site has content which I'm not really interested in having included in search engine results. Examples might be a "view cart" or "checkout" page, or old products in the catalog that are no longer available in our system. In the past, I'd blocked those pages from being indexed by using robots.txt or nofollowed links. However, it seems like there is potential link juice being lost by removing these from search engine indexes. What if, instead of keeping these pages out of the index completely, I use rel=canonical to reference the home page (http://www.mydomain.com) of the business? That way, even if the pages I don't care about accumulate a few links around the Internet, I'll be capturing the link juice behind the scenes without impacting the customer experience as they browse our site. Is there any downside to doing this, or am I missing any potential reasons why this wouldn't work as expected?
Intermediate & Advanced SEO | cadenzajon