Base HREF set without HTTP. Will this cause search issues?
-
The base href has been set in the following format:
<base href="//www.example.com/">
I am working on a project where many on the programming team don't believe that SEO has an impact on the website, so we often see some strange things. Recently, they rolled out an update to the website template that includes the base href I listed above. I found out about it when some of our tools, such as the Xenu link checker, suddenly stopped working.
Google appears to be indexing the pages fine and following the links without any issue - but I wonder if there are any long-term SEO considerations to building the internal links in this manner?
Thanks!
-
Thanks for the comment. I was able to get them to make the changes, but I think I have made some new enemies. Oh well, I will move on in a few months anyhow.
Thanks again,
Joe
-
The W3C standards do allow omitting the protocol: "//www.example.com/" is a scheme-relative (protocol-relative) URL, and the browser fills in the scheme of the current page. So it's technically valid, but it's an unusual choice for a base href, and it can cause minor issues in some older browsers and in tools that don't resolve it correctly.
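For what it's worth, you can see how a scheme-relative reference resolves with Python's standard-library `urljoin`. This is just a quick illustration of the resolution rules, not part of the original discussion, and the hostnames are made up:

```python
from urllib.parse import urljoin

# A scheme-relative reference (a "network-path reference" in RFC 3986)
# inherits the scheme of the page it appears on.
base = "//www.example.com/"

# The same reference, resolved against an http page and an https page:
print(urljoin("http://shop.example.com/cart", base))   # http://www.example.com/
print(urljoin("https://shop.example.com/cart", base))  # https://www.example.com/

# With <base href="//www.example.com/">, a relative link like "about.html"
# resolves in two steps: page URL -> base href, then base href -> link.
resolved_base = urljoin("https://shop.example.com/cart", base)
print(urljoin(resolved_base, "about.html"))            # https://www.example.com/about.html
```

This is also why tools can choke on it: a crawler that naively treats `//www.example.com/` as a path rather than resolving the scheme first will build broken URLs.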
Does it matter for SEO? Well, that's a bit trickier. Google tends to ignore base href unless it needs it to resolve ambiguous relative URLs - for example, a relative canonical tag. Practically speaking, it's probably not a huge problem, but it is possible for it to cause issues down the road.
Either way, if it's on a sitewide template, it's a 5-minute job, and what they have is fragile. I'm not one to knock devs (I've been a dev and I've managed devs), but they need to stop arguing and just fix it.
-
They have put it on every page. The programming manager is quick to point out that, according to the W3C, neither http nor https is required for proper links. I have just never seen anyone purposely make all internal links begin with double slashes (//). It certainly makes Xenu die, but I am not sure if there is any downside beyond Xenu and a few other tools not working.
Thanks!
-
Is that code on every page or just the homepage?