Frasier wrote: I do believe encrypted traffic to be useful. How is encrypted traffic useful if the cert errors are just going to be ignored? In this particular instance, I would not term this a 'minor cert error.' Did you follow satrow's link?
Just to be clear, I am not talking about the site posted by the OP. The OP's question does raise a broader one, though, and since my reason for coming to the forum concerns the same behavior, I didn't think it productive to open a new topic, which may be my bad. That being said, the broader question remains: is it appropriate for browser code to have the final say on whether or not a user can look at a web site? This is the crux of the matter.
Encrypted traffic from unknown sites, over which the user has no control or internal knowledge, should, as you state, be suspect. That is a good reason for browser warnings that make the user question the web site they are attempting to access, and those warnings should not be ignored. But once the reason for the error is known and identified, and users give informed consent to proceed, then be it on their heads and not an authoritarian browser's.
In the situation referenced above, I own the web site, I program the web site, and I know its every internal working. It is mainly for internal development and is rarely used on the internet at large, except when I go to my favorite watering hole for lunch and want to keep working, or give a presentation at a meeting. In this case I am the sole user, I really don't care if anyone else in the world wants to look at the information on the server, and I know why the cert fails. So I ask a simple thing: let me tell my browser to override what would be unwise in the broader world.
This site runs FreeBSD 12, which does not use SSLv2; it uses TLS and supports TLS 1.3. The problem is in the Apache config files, because I have only one public-facing <VirtualHost> defined, and I only got one or two certs from Let's Encrypt. As it is a playground, I access it from my internal nets, which, being a development system and network architecture, change from time to time. If I had to change the Apache config files every time I dinked with a network router, it would increase the time it takes to test new or novel approaches to solving problems, like using IPv6.
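For what it's worth, one way to avoid touching the Apache config every time the internal addressing changes is to issue a throwaway self-signed cert whose SAN list covers all the internal names and IPs at once. A minimal sketch follows; `dev.internal` and the IP addresses are made-up examples, and `-addext` needs OpenSSL 1.1.1 or newer:

```shell
# Generate a self-signed cert (and key) whose SANs cover every internal
# name/IP the dev box answers on, so the browser's hostname check passes.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout dev.key -out dev.crt \
  -subj "/CN=dev.internal" \
  -addext "subjectAltName=DNS:dev.internal,IP:192.168.1.10,IP:10.0.0.10"

# Verify the SANs the browser will actually check:
openssl x509 -in dev.crt -noout -ext subjectAltName
```

You still have to add a one-time exception in the browser for the self-signed cert, which is exactly the informed-consent override being argued for here.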
I came to NewMoon because Mozilla decided to be the thought/development/access police and all-around Big Brother. These draconian moves by Mozilla forced out a cert maker in Australia and forced a lot of people to either pay danegeld to the Verisigns of the world or drop HTTPS altogether. That's fine if you're eBay or Amazon, but not so fine if you're an R&D firm on the fringes, or a hobbyist.
I think all people are asking is this: if I determine it is OK to proceed to a website protected by possibly flawed HTTPS, even if the bonehead who set it up (in this case, me) didn't do it perfectly, then let me instruct my browser to do just that. Otherwise we will have to go to Chrome and let Google/Alphabet be ever more snoopy in our affairs. Is that so much to ask?
PS.
Frasier wrote: Did you follow satrow's link?
Yes, I did. I ran my own webserver against the link (not the one in the OP, which gave rise to this thread). It passed with a B on its main public-facing page, the B only because of the forward secrecy (key exchange) issue and the fact that I hadn't set the server to prefer ECDHE. I have modified my server and all is good for now. But I maintain that browsers should not censor websites. Users should be warned, but have the final say.
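For anyone else chasing the same B grade: on Apache the fix is typically a couple of mod_ssl directives that disable old protocols and put ECDHE (forward-secret) suites first. This is only a sketch, not my exact config; the cipher names available and the right config file location depend on your Apache and OpenSSL versions:

```
# Drop legacy protocols; keep TLS 1.2/1.3
SSLProtocol         all -SSLv3 -TLSv1 -TLSv1.1
# Use the server's preference order, not the client's
SSLHonorCipherOrder on
# ECDHE suites first, so forward secrecy is negotiated when possible
SSLCipherSuite      ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384
```

After a config change like this, re-running the server against the test should show the key exchange score recover.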