I’m just so annoyed at fighting this all the time.
If I can’t figure this out I’m going to disable all the https redirecting and turn all certificate errors off so I can have some peace
For the certificate errors, just add a root CA of your own making.
Disabling auto-https, no idea. Maybe fix the source?
Does not sound like a good idea. Your own CA can sign certs for any other sites too, and it’s dangerous.
I would say it’s even more dangerous if you just think “nah, it’ll be fine”
What do you mean?
Of course their own CA can sign certificates for whatever the fuck it wants, but it’s their CA so why would they do that?
You obviously shouldn’t trust anyone else’s CA unless you actually trust it. But if you don’t trust your own CA what’s the point of having a CA?
P.S. I’m guessing OP doesn’t actually have a CA and is just using simple self signed certificates without any private CA that has signed them.
P.S. I’m guessing OP doesn’t actually have a CA and is just using simple self signed certificates without any private CA that has signed them.
You’re right. I’m talking about making a certificate using
gpg
and storing it on your system. Then adding it to the root CA list and signing all your local SSL stuff with it.
but it’s their CA so why would they do that?
I don’t mean them specifically, but to me managing access to such a CA cert’s keys is a security nightmare, because if I somehow get an infection and it finds the cert file and the private key, it’ll be much easier for it to make itself more persistent than I’d want.
But if you don’t trust your own CA what’s the point of having a CA?
That’s the point. I don’t recommend having one. I recommend self signed certs that are
- limited to a lan (sub)domain or a wildcard of it
- you verified by the fingerprint (firefox can show this)
- you only allowed for those of your internal services the cert was intended for (a rough sketch follows after this comment)
Or if you don’t want to deal with self signed certs, buy a domain and do lets encrypt with the DNS challenge.
That’s also more secure, but can be more of a hassle, though I guess it depends on preference. But then I would use this latter one too if I had opened any services to the internet, which I haven’t because I don’t need to.
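For what it’s worth, a minimal openssl sketch of that self-signed approach (names and lifetimes are just examples; -addext needs a reasonably recent openssl):

```
# self-signed cert restricted to one internal wildcard name (example names)
openssl req -x509 -newkey rsa:2048 -sha256 -days 730 -nodes \
  -keyout home.lan.key -out home.lan.crt \
  -subj "/CN=*.home.lan" -addext "subjectAltName=DNS:*.home.lan,DNS:home.lan"

# print the fingerprint so you can compare it against what Firefox shows
openssl x509 -in home.lan.crt -noout -fingerprint -sha256
```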
I don’t mean them specifically, but to me managing access to such a CA cert’s keys is a security nightmare, because if I somehow get an infection and it finds the cert file and the private key, it’ll be much easier for it to make itself more persistent than I’d want.
If you can’t resist installing random shit on your CA server then sure. No attacker will really try to compromise a home CA so you really only have to worry about viruses which should be kept extremely far from the CA anyways. And obviously follow all other security precautions like good passwords or even passwordless with certificate login (remember that you have a CA server so you can easily issue authentication certificates and enroll them on a smart card or Yubikey)
The private key should also be in a TPM (or an HSM like we do at work, but that’s a bit extreme for home use) and be non-exportable. Managing access to the private key isn’t really that hard: it should just never, ever leave the CA server and you are pretty much good to go.
You can also do a two tier PKI with an offline CA and an issuing CA like I’m planning to do for an AD DS, AD CS, AD FS lab.
Personally I think wildcard certificates sound like a bigger security problem than a CA, since that certificate will likely be placed on a lot of servers, and if just a single one gets compromised the attacker can impersonate whatever subdomain they feel like. With a CA server you could issue individual certificates to each server/service.
Private CA servers are very common and are actually a security positive. I’m not saying that everyone needs one at home, but you shouldn’t be afraid to set one up if you want to.
I’m in a home environment. I don’t have a TPM*, I don’t have yubikeys. And no, certificates won’t be placed on a lot of servers, as
- I have only one, 2 if you count the raspberry
- both of them use a wildcard for their own subdomain, so other servers wouldn’t be affected anyway
Yeah I was about to say, just do https? It’s not like getting a certificate is still a big deal in modern times, hasn’t been for years.
My router doesn’t have an HTTPS control page.
Sometimes frustrating.
you should only need to allow this once for each domain/subdomain, surely that can’t be that much of a pain.
yes that has to be repeated when the certificate changes, but make it with a 2-5 year expiration and it’ll be safer than attempting to disable these security measures for all domains, which would be just very silly and careless
Once for every VM I create, in every VM I create, on every computer in the house.
Every single time it happens, it’s a frustrating interruption with a probability of losing my train of thought.
Also this has to work fully offline and without infrastructure.
My prognosis based on this thread: the only solution is to disable all https certificate checking and http-to-https redirecting; everything else is a multi-weekend-long cast into the fire.
You can get rid of the certificate errors by adding your CA to Firefox. Just make sure you keep the private key secure.
Set
browser.fixup.fallback-to-https
to false
to stop Firefox from trying https if http doesn’t work.
worth repeating the KEEP YOUR PRIVATE KEY SECURE part if you’re trusting a root - if you trust a root, it may be able to issue a TRUSTED cert for other domains - mybank.com, etc and leave you open to attack
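If you’d rather persist that browser.fixup.fallback-to-https change than flip it in about:config each time, a minimal sketch (the profile directory name is hypothetical; check about:profiles for yours):

```
# append the pref to user.js in your Firefox profile (hypothetical profile name)
PROFILE="$HOME/.mozilla/firefox/abcd1234.default-release"
echo 'user_pref("browser.fixup.fallback-to-https", false);' >> "$PROFILE/user.js"
```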
But honestly, you shouldn’t need to do this, you can just use LetsEncrypt to get a real cert. Here’s what I do:
- route external traffic to your devices - I use a VPS w/ a VPN because I’m behind CGNAT, but if you have a publicly routable address, you can probably just use your router
- configure LetsEncrypt for your services
- configure the DNS your router provides to swap the public IP (i.e. the one for your VPS if you have it) to your LAN address, and have all of your devices use that DNS name (one possible sketch follows after this comment)
Boom, you get all the benefits of a proper TLS setup, along with all of the benefits of local traffic. You can even turn off external access to the services between cert renewals.
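If your router or local resolver happens to run dnsmasq, that DNS override might look something like this (name, address, and file path are hypothetical):

```
# /etc/dnsmasq.d/local-overrides.conf (hypothetical)
# answer queries for the service's public name with its LAN address instead
address=/service.example.com/192.168.1.10
```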
Install valid certs. Problem solved.
Pretty sure there’s not a per-domain setting for that. If you have HTTPS-Only Mode turned on in the settings it will always try to use HTTPS first and present a warning before switching to HTTP.
If you want to continue using HTTPS you can setup your own CA certificate to sign certificates for your .LAN domain names. All you need to do then is add the CA certificate to your trusted certificates in Firefox and the signed certificate to the device hosting the HTTPS service.
EDIT: TIL there’s an exclusion feature. Neat. I didn’t see this on Firefox for Android though. https://support.mozilla.org/en-US/kb/https-only-prefs
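For reference, a minimal openssl sketch of that private-CA approach (all names are examples; -addext needs a reasonably recent openssl):

```
# 1) create the CA key and a self-signed CA certificate
openssl req -x509 -newkey rsa:4096 -sha256 -days 1825 -nodes \
  -keyout lan-ca.key -out lan-ca.crt -subj "/CN=My LAN CA"

# 2) key + CSR for one service, with its .lan name as a SAN
openssl req -newkey rsa:2048 -nodes -keyout nas.lan.key -out nas.lan.csr \
  -subj "/CN=nas.lan" -addext "subjectAltName=DNS:nas.lan"

# 3) sign the CSR with the CA, carrying the SAN through via an extension file
printf "subjectAltName=DNS:nas.lan\n" > nas.lan.ext
openssl x509 -req -in nas.lan.csr -CA lan-ca.crt -CAkey lan-ca.key \
  -CAcreateserial -days 825 -extfile nas.lan.ext -out nas.lan.crt
```

lan-ca.crt is what you import into Firefox; lan-ca.key never needs to leave the machine doing the signing.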
Install the cert using settings->privacy->certs. Use the server option to download and install the cert.
For https/http as default, either ask Google for the settings for the preferred protocol, type :80 behind the address, or specify the protocol. I believe they changed it to default to https if none is specified a while back
IMO it’s easiest to just use a real domain for your local network. For example, I use subdomains of
int.example.com
, where example.com
is my blog. Then, you can get Let’s Encrypt or ZeroSSL certificates for all the hosts. Systems do not need to be accessible over the internet - you can use an ACME DNS challenge instead of an HTTP one. Use something like certbot or acme.sh and renewals will be automated.
The only cost is for one domain, and some TLDs are less than $5/year. Check tld-list.com and sort by renewal price, not registration price (as some are only cheap for the first year).
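A rough certbot sketch of that DNS-01 flow (domain names are just examples; for hands-off renewals you’d use a DNS provider plugin or acme.sh’s DNS hooks rather than --manual):

```
# one-off, interactive DNS-01 issuance for an internal host name (example domain)
sudo certbot certonly --manual --preferred-challenges dns -d nas.int.example.com

# or a wildcard covering the whole internal zone
sudo certbot certonly --manual --preferred-challenges dns -d '*.int.example.com'
```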
This is the way to do it - actual valid certs, with actual working TLS.
OP’s issue is that they don’t understand how SSL works and are fighting Firefox, which is actually trying to protect them and steer them in the right direction.
So you get a wildcard cert for the public domain, and only go one level deep on your LAN, reusing the wildcard cert? That’s a pretty cool trick.
I use a wildcard cert in some places, but most of them are individual certs. You can have multiple ACME DNS challenges on a single domain, for example
_acme-challenge.first.int.example.com
and _acme-challenge.second.int.example.com
for first.int.example.com
and second.int.example.com
respectively. The DNS challenge just makes you create a TXT record at that
_acme-challenge
subdomain. Let’s Encrypt follows CNAMES and supports IPv6-only DNS servers, so I’m using some software called “acme-dns” to run a DNS server specifically for ACME DNS challenges.
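In zone-file terms, that delegation might look something like this (BIND-style records, all names hypothetical): each challenge label is a CNAME into the zone that the acme-dns instance serves, so only that zone ever has to hold the TXT records.

```
; hypothetical records in the int.example.com zone
_acme-challenge.first.int.example.com.   IN CNAME  first-token.acme.example.org.
_acme-challenge.second.int.example.com.  IN CNAME  second-token.acme.example.org.
```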
.LAN is not an official top level domain. So I assume this is either your home network or work network? In any case your problem has nothing to do with the .LAN domain.
Maybe you have “https everywhere” activated. If so, Firefox will always default to https unless you specify http in the URL. Again, unrelated to .LAN.
For the certificate: what do you mean “when available”? A self signed cert is a self signed cert. There is no “available” or not. You can import the certificate into the Firefox trust store so Firefox will trust that one specific cert, but any other self signed cert will cause an error. That is expected and safe behaviour (and unrelated to .LAN).
I don’t think you should disable self signed warnings. It would be better to import those certs than to disable the warning, as it is a very important warning.
As for disabling https only mode for certain URLs, I don’t know how, but it would be a useful feature. Some of my corporate stuff oddly redirects to HTTPS but just gives a blank screen rather than a connection refused or something. Not sure what it is. Probably something is misconfigured somewhere but it’s not something in my control. I didn’t have time to really inspect it so I just disabled https only mode for my work laptop.