IO Error - what does this mean?
-
I did a quick check on https://validator.w3.org
I got this error: IO Error -
java.security.cert.CertificateException: Certificates do not conform to algorithm constraints
What does this mean?
-
Hi, it's hard to tell because I don't know the URL of the site, so I can't really look at anything. It appears to be a security certificate error and is likely tied to an SSL handshake exception: the exception comes from Java (which the validator's HTTP client runs on), and it generally means the site's certificate, or something in its chain, uses a signature algorithm or key size that Java's security policy has disabled, such as an MD5 signature or a very short RSA key.
I would ask the developer to make sure the certificate is issued with a current algorithm (a SHA-256 signature and an RSA key of at least 2048 bits) and that the site is configured to use it correctly.
Here is a link to a Stack Overflow thread on that error message. If you do a Google search for the error, you'll see it can be caused by many different things. I hope this helps.
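For reference, that exception can be reproduced outside the validator. Below is a minimal Java sketch, with a placeholder URL since the site wasn't shared, that attempts a TLS handshake and surfaces the same constraint failure if the certificate chain uses a disabled algorithm:

import javax.net.ssl.HttpsURLConnection;
import javax.net.ssl.SSLHandshakeException;
import java.net.URL;

public class CertAlgorithmCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute the site that failed validation.
        URL url = new URL("https://example.com/");
        HttpsURLConnection conn = (HttpsURLConnection) url.openConnection();
        try {
            // connect() performs the TLS handshake, where the certificate
            // chain is checked against Java's algorithm constraints
            // (jdk.certpath.disabledAlgorithms in the java.security file).
            conn.connect();
            System.out.println("Handshake OK, cipher suite: " + conn.getCipherSuite());
        } catch (SSLHandshakeException e) {
            // A chain signed with a disabled algorithm (e.g. MD5) or using
            // too-short keys fails here, wrapping the CertificateException
            // "Certificates do not conform to algorithm constraints".
            System.out.println("Handshake failed: " + e);
        } finally {
            conn.disconnect();
        }
    }
}

If running that against the affected site prints the same CertificateException, reissuing the certificate with a modern signature algorithm should clear the validator error.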
Best Regards
Related Questions
-
"No Information Available" Error for Homepage in Google
Hi Everyone, I've been racking my brain around this one. Not sure why it is happening. Basically, Google is showing the "www" version of the homepage, when 99% of the site is "non-www". It also says "No Information Available". I have tried submitting it through GSC, but it is telling me it is blocked by the robots.txt file. I don't see anything in there that would block it. Any ideas? shorturl.at/bkpyG I would like to get it to change to the regular "non-www" version and actually show information.
Intermediate & Advanced SEO | vetofunk
-
Google Detecting Real Page as Soft 404 Error
We migrated our site from HTTP to HTTPS in Sep 2017, but I noticed soft 404s gradually increasing after the migration. Example of a soft 404 page: https://bit.ly/2xBjy4J These soft 404 pages are real pages, but Google still detects them as soft 404s. When I check the Google cache, it shows me a cached copy, but of the HTTP page. We've tried all possible solutions but are unable to figure out why Google is still indexing the HTTP pages and detecting the HTTPS pages as soft 404 errors. Can someone please suggest a solution or a possible cause for this issue, or share if they've seen the same issue in the past?
Intermediate & Advanced SEO | bheard
-
Search engine blocked by robots-crawl error by moz & GWT
Hello everyone. For my site I am getting Error Code 605: Page Banned by robots.txt, X-Robots-Tag HTTP Header, or Meta Robots Tag. Google Webmaster Tools is also not able to fetch my site. My site is tajsigma.com. Can any expert help, please? Thanks
Intermediate & Advanced SEO | falguniinnovative
-
Why are "noindex" pages access denied errors in GWT and should I worry about it?
GWT calls pages that have "noindex, follow" tags "access denied errors." How is it an "error" to say, "hey, don't include these in your index, but go ahead and crawl them." These pages are thin content/duplicate content/overly templated pages I inherited and the noindex, follow tags are an effort to not crap up Google's view of this site. The reason I ask is that GWT's detection of a rash of these access restricted errors coincides with a drop in organic traffic. Of course, coincidence is not necessarily cause. Should I worry about it and do something or not? Thanks... Darcy
Intermediate & Advanced SEO | 94501
-
Should we get our W3 Validation Errors Fixed for SEO? How important is it?
Hi All, We implement most things on our website that are recommended, and most recently we did Schema.org. However, one area we haven't addressed is fixing our W3 validation errors. My developer thinks they are not that important and that it's more about ticking boxes, but does anyone have any experience whereby fixing all these actually had an SEO/ranking benefit? Most of our URLs are indexed and Google recrawls regularly, so I am not sure of its importance. Also, we have a mobile responsive version, so I wasn't sure if it is more important because of this. From what I read, I can't see any benefit from fixing it all, but I just wanted some other opinions. Thanks Pete
Intermediate & Advanced SEO | PeteC12
-
404 Errors on Blog Pages That Look Like They Are Loading Fine
There was recently a huge increase in 404 errors in Yandex Webmaster, corresponding with a drop in rankings. Most of the pages seem to be from my blog (which was updated around the same time). When I click on the links from Yandex, the page looks like it is loading normally, except that it has the following messages from the Facebook plugin I am using for commenting. Any ideas about what the problem is or how to fix it?
Critical Errors That Must Be Fixed
Bad Response Code: URL returned a bad HTTP response code.
Open Graph Warnings That Should Be Fixed
Inferred Property: The 'og:url' property should be explicitly provided, even if a value can be inferred from other tags.
Inferred Property: The 'og:title' property should be explicitly provided, even if a value can be inferred from other tags.
Small og:image: All the images referenced by og:image should be at least 200px in both dimensions. Please check all the images with the og:image tag in the given URL and ensure that they meet the recommended specification.
Intermediate & Advanced SEO | theLotter
-
Wordpress error
On our Google Webmaster Tools I'm getting a Severe Health Warning regarding our robots.txt file, which reads:
User-agent: *
Crawl-delay: 20
User-agent: 008
Disallow: /
I'm wondering how I can fix this and stop it happening again. The site was hacked about 4 months ago, but I thought we'd managed to clear things up. Colin
Intermediate & Advanced SEO | NileCruises
-
Does Google penalize for having a bunch of Error 404s?
If a site removes thousands of pages in one day, without any redirects, is there reason to think Google will penalize the site for this? I have thousands of subcategory index pages. I've figured out a way to reduce the number, but it won't be easy to put in redirects for the ones I'm deleting. They will just disappear. There's no link juice issue. These pages are only linked internally, and indexed in Google. Nobody else links to them. Does anyone think it would be better to remove the pages gradually over time instead of all at once? Thanks!
Intermediate & Advanced SEO | Interesting.com