Few pages without SSL
Hi,
A website is not fully secured with an SSL certificate.
Approximately 97% of the pages on the website are secured. A few pages are unfortunately not secured with an SSL certificate, because otherwise some functions on those pages do not work.
It's a website where you can play online games, and these games do not work over an SSL connection.
Is there anything we have to consider or optimize?
For example, when we click on the lock icon in the browser, the following notice appears: "Your connection to this site is not fully secured."
Can this harm the Google ranking?
Regards,
Tom
-
It may potentially affect the rankings on:
- pages without SSL
- pages linking to pages without SSL
At first, not drastically - but you'll find that you fall further and further behind until you wish you had just embraced HTTPS.
The exception to this, of course, is if no one competing over the same keywords is fully embracing SSL. If the majority of the query-space's ranking sites are insecure, even though Google frowns upon that, there's not much they can do (they can't just rank no one!)
So you need to do some legwork. See if your competitors suffer from the same issue. If they all do, maybe don't be so concerned at this point. If they're all showing signs of fully moving over to HTTPS, be more worried.
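If it helps, here's a minimal Python sketch for that legwork - the competitor domains are placeholders, and it assumes the `requests` package is installed. It fetches each homepage over plain HTTP and reports whether the site redirects you to HTTPS.

```python
# Rough spot-check: does each competitor force HTTP traffic over to HTTPS?
# The domain list is hypothetical - replace it with the sites ranking for
# your keywords. Assumes the third-party `requests` package is installed.
import requests

competitor_domains = ["example-games.com", "example-arcade.net"]  # placeholders

for domain in competitor_domains:
    try:
        # Start from the insecure URL and follow any redirects.
        response = requests.get(f"http://{domain}/", timeout=10, allow_redirects=True)
        if response.url.startswith("https://"):
            print(f"{domain}: redirects to HTTPS ({response.url})")
        else:
            print(f"{domain}: still served over plain HTTP ({response.url})")
    except requests.RequestException as exc:
        print(f"{domain}: request failed ({exc})")
```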
-
Just to be sure, I would secure every page with an SSL certificate. When Google finds out that not every page is secure, it may raise some eyebrows and even affect the whole site.
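If you do go that route, a quick way to audit it is to check that every URL in your sitemap actually responds over HTTPS. A rough sketch - the sitemap URL is a placeholder, and it assumes a standard XML sitemap plus the `requests` package:

```python
# Rough audit: can every URL listed in the XML sitemap be served over HTTPS?
# SITEMAP_URL is a placeholder; assumes a standard sitemap.org-format sitemap.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=10)
urls = [loc.text for loc in ET.fromstring(sitemap.content).findall(".//sm:loc", NS)]

for url in urls:
    https_url = url.replace("http://", "https://", 1)  # force the secure scheme
    try:
        status = requests.get(https_url, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"failed ({exc})"
    print(f"{https_url}: {status}")
```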
-
Yes, that can hurt Google rankings. Insecure pages tend to rank less well, and over time that trend is only set to increase (with Google becoming less and less accepting of insecure pages, they will eventually probably be labelled a 'bad neighborhood', like gambling and porn sites). Additionally, URLs which link out to insecure pages (which are not on HTTPS) can also see adverse ranking effects (as Google knows that those pages are likely to direct users to insecure areas of the web).
At the moment, you can probably get by with some concessions. Those concessions would be accepting that the insecure URLs probably won't rank very well compared with pages offering the same entertainment / functionality which have fully embraced secure browsing (which are on HTTPS, which are still responsive, and which don't link to insecure addresses).
If you're confident that the functionality you are offering fundamentally can't be offered through HTTPS, then that may be only a minor concern (as all your competitors are bound by the same restrictions). If you're wrong, though, you're gonna have a bad time. Being 'wrong' now may be more appealing than being 'dead wrong' later.
Google will not remove the warnings your pages have unless you play ball. If you think that won't bother your users, or that your competition is fundamentally incapable of a better, more secure integration - fair enough. Google is set to take more and more action on this over time.
P.S.: if your main ranking pages are secure and they don't directly link to this small subset of insecure pages, then you'll probably be OK (at least in the short term).
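To check that last point, here's a hedged little sketch that lists any plain-HTTP links on your main ranking pages. The page URLs are placeholders, and it assumes the `requests` and `beautifulsoup4` packages are installed.

```python
# List outbound plain-HTTP links on key pages, so you can see whether your
# main ranking URLs point at the insecure game pages.
# Page URLs are placeholders; assumes `requests` and `beautifulsoup4` are installed.
import requests
from bs4 import BeautifulSoup

ranking_pages = ["https://www.example.com/", "https://www.example.com/games/"]  # placeholders

for page in ranking_pages:
    html = requests.get(page, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    insecure_links = sorted({a["href"] for a in soup.find_all("a", href=True)
                             if a["href"].startswith("http://")})
    print(page)
    if insecure_links:
        for link in insecure_links:
            print(f"  links to insecure URL: {link}")
    else:
        print("  no plain-HTTP links found")
```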