This just in: threat actors compromising your devices is bad. More at 11.
Obviously the keys could be stored more securely, but if you’ve got malware on your machine that can exploit this you’ve already got bigger problems.
That’s not how this works.
This sort of “dismissive security through ignorance” is how we get so many damn security breaches these days.
I see this every day with software engineers, a group you would think would be above the bar on security. Unfortunately, a little bit of knowledge results in a mountain of confidence (see the Dunning-Kruger effect). They are just confident in bad choices instead.
“We don’t need to use encryption at rest because if the database is compromised we have bigger problems” really did a lot to protect the last few thousand companies from preventable data exfiltration that was, in fact, the largest problem they had.
Turns out that having read access to the underlying storage for the database doesn’t necessarily mean that the database and all of your internal systems are also compromised. It just means that the decision makers were making poor decisions based on a lack of risk-modeling knowledge.
Are you so confident in your omnipotence that you can enumerate all risks and attack vectors that could result in data being exfiltrated from a device?
If not, then why comment as if you are?
Does anyone know how iMessage handles this on desktop (on Macs) as they (as far as I know) upgraded their encryption recently?
I have three things to say:
- Everyone, please make sure you’ve set up sound disk encryption
- That’s not a surprise (for me at least)
- It’s not much different on mobile (the db is unencrypted) - check out Molly (a Signal fork) if you want to encrypt it. However, an encrypted db means no messages until you decrypt it.
Sure, I was aware. You have the same problem with SSH keys, GPG keys and many other things.
However, you can save encrypted SSH and GPG keys and store that encryption key in the OS keyring.
Is it possible to integrate that seamlessly, so that when something requests those keys you get a prompt?
With SSH at least you can password protect the key itself so that you always get a prompt.
Nice, didn’t know, I’ll look into it
Yes, but you STILL need to enter the password on every reboot.
You are telling me this has been going on for almost a decade now, and no one ever noticed?
So we trust open source apps under the premise that if malicious code gets added, at least one person will notice? Here it shows that years can pass before anyone notices, and millions of people’s communications could have been compromised by the world’s most trusted messaging app.
I don’t know which app to trust after this, if any?
Everyone knew that already tbh
Why is this a shock? Someone would need to have already compromised your device. Even if it was encrypted with a password, they could still install a keylogger.
It is easier to compromise a device than to try and compromise encrypted communications.
Matrix. You can host any version you want, and when you have to update, just do a version diff between your current and latest versions and check it yourself.
Summary:
- Signal’s desktop app stores encryption keys for chat history in plaintext, making them accessible to any process on the system
- Researchers were able to clone a user’s entire Signal session by copying the local storage directory, allowing them to access the chat history on a separate device
- This issue was previously highlighted in 2018, but Signal has not addressed it, stating that at-rest encryption is not something the desktop app currently provides
- Some argue this is not a major issue for the “average user”, as other apps also have similar security shortcomings, and users concerned about security should take more extreme measures
- However, others believe this is a significant security flaw that undermines Signal’s core promise of end-to-end encryption
- A pull request was made in April 2023 to implement Electron’s safeStorage API to address this problem, but there has been no follow-up from Signal
Oh wow that’s quite a red flag ngl
If your system is compromised to such an extent, it really doesn’t make much difference how the keys are stored at rest.
If the keys are accessible to any process, your system doesn’t need to be compromised. All it takes is an app that you “trust” to break that trust and snatch everything up. Meta has already been caught fucking around with other social media apps on-device. They even intercepted Snapchat traffic on some users’ devices in order to collect that data. It could be as simple as: you installed WhatsApp and they went and pillaged your Signal files.
All it takes is an app that you “trust” to break that trust
I get what you’re trying to say, but that’s something I’d classify as “compromised” as well.
For sure, just suggesting that “compromised” doesn’t necessarily mean you got hacked by someone because they tricked you into giving a password, or they scraped it from another website, or you installed something sketchy. It could be as simple as Microsoft scans all your files with AI, or Meta snoops other social media (which it has been caught doing).
So you’re saying that the OS itself is compromised? Gee, good luck protecting your processes from the fucking OS, no matter how you do it.
But my system is not compromised?
How do you know?
Because I check my system and I don’t even use Signal?
“Checking” does not prevent anything bad from happening. And if that file were read by a malicious actor, it would likely be immediate and you’d never even notice.
Did you see that I said “I don’t use Signal”?
Did you read the article?
Why? They would need access to the device
Thanks ChatGPT.
Ah yes, another prime example that demonstrates that Lemmy is no different than Reddit. Everyone thinks they are a professional online.
Nothing sensitive should ever lack encryption, especially in the hands of a third-party company managing your data while claiming you are safe and your privacy is protected.
No one is invincible and it’s okay to criticize the apps we hold in high regard. If you are pissed people are shitting on Signal, you should be pissed Signal gave people a reason to shit on them.
especially in the hands of a third-party company managing your data while claiming you are safe and your privacy is protected.
Yeah, especially in this specific situation that isn’t relevant to this situation.
I presume keys are already sort of encrypted?
Nope. Your presumption is wrong.
👍
Where are you going to store the encryption key? At the end of the day the local machine is effectively pwned anyway.
If your device gets compromised, it’s no longer the company’s problem.
Bruh, Windows and Linux have a secrets vault (Credential Manager and the keyring respectively, iirc) for this exact purpose.
Even Discord uses it on both OSs no problem
The real problem is that the security model for apps on mobile is much better than that for apps on desktop. Desktop apps should all have private storage that no other non-root app can access. And while we’re at it, they should have to ask permission before activating the mic or camera.
Firejail and bwrap. Flatpaks. There are already ways to do this, but I only know of one distro that separates apps by default like Android does (a separate user per app), which is the brand-new “EasyOS”.
macOS has nailed it*, even though it’s still not as good as iOS or Android, but it’s leaps and bounds better than Windows and especially Linux.
Edit: *the sandboxing/permission system
What does Windows do? Genuine question; I’ve not used it since the Windows 7 days. Regarding Linux, that’s true for stuff installed through regular package managers and whatnot, but Flatpak is pushing a more sandboxed, permission-oriented system, akin to Android.
You have granular control over Universal Windows apps (i.e. Windows 8+ apps), one global lock over all desktop apps (non-UWP), and one global lock over everything. It’s pretty solid considering how little control Microsoft has and its wonderful fetish for compatibility.
Tl;dr: basically the same as Linux, except app distribution on Linux was bad enough for so long that more stuff is in the new restricted format, while Windows still has tons of things which will never go away and aren’t in the sandbox. I think not finding a way to sandbox all desktop apps was a mistake.
What’s wrong with the Flatpak permissions system on Linux?
It’s a joke. Apps have defined permissions already allowed on install, and some of them have too many things set to allow, like home or host access. Also, changing any permission requires restarting the app. It’s heading in the right direction, but it has a looooong way to go to catch up with macOS, let alone Android and iOS.
They sort of do, with Flatpak.
How in the fuck are people actually defending signal for this, and with stupid arguments such as windows is compromised out of the box?
You. Don’t. Store. Secrets. In. Plaintext.
There is no circumstance where an app should store its secrets in plaintext, and there is no secret which should be stored in plaintext. Especially since this is not some random dude’s random project, but a messenger claiming to be secure.
All your session cookies are stored in plaintext.
How in the fuck are people actually defending signal for this
Probably because Android (at least) already uses file-based encryption, and the files stored by apps are not readable by other apps anyways.
And if people had to type in a password every time they started the app, they just wouldn’t use it.
AFAIK Android encrypts the entire fs with one key. And an ACL is not encryption.
Popular encrypted messaging app Signal is facing criticism over a security issue in its desktop application.
Emphasis mine.
I think the point is the developers might have just migrated the code without adjustments, since that is how it was implemented before. Similar to how PC game ports sometimes run like shit because they are a close 1:1 of the original, which is not always the most optimized or ideal, but the quickest to output.
It’s been a few days since I used Electron, but AFAIK Electron can’t be used as a wrapper for Android apps, or can it? Or is their Android app a web app wrapped into a “native” Android app too?
Also, since this has apparently been an issue since 2018, 6 years should be plenty to rewrite it to use native secure storage…
You. Don’t. Store. Secrets. In. Plaintext.
SSH stores the secret keys in plaintext too, in a home dir accessible only by the owning user.
I won’t speak about Windows, but on Linux and other Unix systems the presumption is that if your home dir is compromised you’re fucked anyway. Effort should be spent on actually protecting access to the personal files in your home, not on security theater.
Not true, SSH keys need their passphrase to be used. If you don’t set one, that’s on you.
Come on, 95% of users don’t set passwords on their ssh keys
Where are these stats from lmao.
Counting my friends
You can count me too
Well yes, but also: how would users react if they had to type in their passphrase every time they open the app? This is also exactly what we give up everywhere else by clicking “remember this device”.
If someone gets access they can delete your keys, or set up something that can intercept your keys in other ways.
The security of data at rest is just one piece of the puzzle. In many systems the access to the data is considered much more important than whether the data itself is encrypted in one particular scenario.
SSH has encrypted keys
Kinda expected the SSH key argument. The difference is the average user group.
The average dude with an SSH key that’s used for more than their RPi knows a bit about security, encryption and opsec. They would have a passphrase and/or hardening mechanisms for their system and network in place. They know their risks and potential attack vectors.
The average dude who downloads a desktop app for a messenger that advertises itself as secure and E2EE probably won’t assume that any process might just wiretap their whole “encrypted” communications.
Let’s not forget that the threat model has changed a lot in the last years, and a lot of effort went into providing additional security measures and best practices. Using a secure credential store, additional encryption and not storing plaintext secrets are a few simple ones of those. And sure, on Linux the SSH key is still a plaintext file. But it’s your deliberate decision to keep it in plaintext. You can at least encrypt it with a passphrase. You can use the actual working file-permission model of Linux, and SSH will refuse to use your key with loose permissions. On Windows and Mac you would do the same and use a credential store and an agent to securely store and use your keys.
Just because your SSH key is a plaintext file under the presumption of a secure home dir, you still wouldn’t keep a ~/passwords.txt.
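The loose-permissions refusal mentioned above can be sketched in a few lines of Python (an illustrative approximation of OpenSSH's strict-modes check, not its actual code; `demo_key` is a made-up filename):

```python
import os
import stat

def check_key_permissions(path: str) -> None:
    """Refuse to use a private key that group/other users could access,
    mimicking OpenSSH's strict-modes behavior (illustrative only)."""
    mode = os.stat(path).st_mode
    if mode & (stat.S_IRWXG | stat.S_IRWXO):
        raise PermissionError(
            f"Permissions {oct(mode & 0o777)} for '{path}' are too open"
        )

# A key readable by group/others is rejected:
with open("demo_key", "w") as f:
    f.write("dummy key material")
os.chmod("demo_key", 0o644)
try:
    check_key_permissions("demo_key")
    print("accepted")
except PermissionError:
    print("rejected")   # 0o644 is too open

os.chmod("demo_key", 0o600)
check_key_permissions("demo_key")
print("accepted")       # 0o600 is fine
os.remove("demo_key")
```

Note the limit of this model: it keeps *other users* out, but any process running as the key's owner still passes the check.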
If someone has access to your machine you are screwed anyway. You need to store the encryption key somewhere
Yes: in your head, and in your second factor if possible; keeping keys always encrypted at rest, decrypting at the latest possible moment, and not holding decrypted secrets in memory for longer than absolutely necessary.
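A minimal sketch of that decrypt-late pattern, deriving the key from a passphrase that lives only "in your head". The XOR keystream here is a toy stand-in for a real authenticated cipher (e.g. AES-GCM from a vetted library); only the key derivation via PBKDF2 is the real thing:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # Derive a 32-byte key from the passphrase; only the random salt
    # needs to be stored on disk alongside the ciphertext.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000)

def xor_keystream(key: bytes, nonce: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration ONLY; real code should use an
    # authenticated cipher such as AES-GCM from a vetted library.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

salt, nonce = os.urandom(16), os.urandom(16)
key = derive_key("correct horse battery staple", salt)
ciphertext = xor_keystream(key, nonce, b"my chat history")

# Only salt, nonce and ciphertext ever touch the disk; the key exists
# in memory just long enough to encrypt/decrypt, then is discarded.
assert xor_keystream(key, nonce, ciphertext) == b"my chat history"
```

Without the passphrase, an attacker who copies the files gets salt, nonce and ciphertext, none of which decrypts anything on its own.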
You. Don’t. Store. Secrets. In. Plaintext.
Ok. Enter password at every launch.
While it would certainly be nice to see this addressed, I don’t recall Signal ever claiming their desktop app provided encryption at rest. I would also think that anyone worried about that level of privacy would be using disappearing messages and/or regularly wiping their history.
That said, this is just one of the many reasons why whole disk encryption should be the default for all mainstream operating systems today, and why per-app permissions and storage are increasingly important too.
Does encrypting your disks change anything for the end user in day-to-day usage? Honest question; I’ve never used encrypted disks in my life.
No, the average user will never know the difference. I couldn’t tell you exactly what the current performance impact is for hardware encryption, but it’s likely around 1-4% depending on the platform (I use LUKS under Linux).
For gamers, it’s likely a 1-5 FPS loss, depending on your hardware, which is negligible in my experience. I play mostly first and third person shooter-style games at 1440p/120hz, targeting 60-90 FPS, and there’s no noticeable impact (Ryzen 5600 / RX 6800XT).
If it has to go to disk for immediate loading of assets while playing a video game, you’re losing more than 1-5 FPS.
Yeah, I’m sure there are a lot of variables there. I can only say that in my experience, I noticed zero impact to gaming performance when I started encrypting everything about 10 years ago. No stuttering or noticeable frame loss. It was a seamless experience and brings real peace of mind knowing that our financial info, photos, and other sensitive files are safely locked away.
Maybe, but not every frame while you’re playing. No game is loading gigs of data every frame. That would be the only way most encryption algorithms would slow you down.
You’re more likely going to get stuttering or asset streaming issues which are going to have more impact than losing a few fps.
For gamers, it’s likely a 1-5 FPS loss
I highly doubt it… I would love to see some hard data on that. Most algorithms used for disk encryption these days can decrypt far faster than the disk can deliver data, and most games are not reading gigabytes per second from disk every frame during gameplay, so it should never matter.
Whole disk encryption wouldn’t change your daily usage, no. It just means that when you boot your PC you have to enter your passphrase. And if your device becomes unbootable for whatever reason, and you want to access your drive, you’ll just have to decrypt it first to be able to read it/write to it, e.g. if you want to rescue files from a bricked computer. But there’s no reason not to encrypt your drive. I can’t think of any downsides.
If any part of the data gets corrupted you lose the whole thing. Recovery tools can’t work with partially corrupted encrypted data.
I don’t think that’s a big deal with Signal data. You can log back into your account, you’d just lose your messages. idk how most people use Signal but I have disappearing messages on for everything anyway, and if a message is that important to you then back it up.
It’s basically transparent for the end user, but it protects the laptop when you’re out and about, at least if someone steals the computer. As long as it was properly shut down.
Define properly shut down. Do your thieves usually ask first?
I think they’re just referring to an outdated concept of OSes with non-journaling filesystems that can cause data corruption if the disk is shut off abruptly, which in theory could corrupt the entire disk at once if it was encrypted at a device level. But FDE was never used in the time of such filesystems anyways.
If you suspend the laptop when moving locations instead of shutting down or hibernating to disk then disk encryption is useless.
Most operating systems will require your desktop password upon resume, and most thieves are low-functioning drug users who are not about to go Hacker Man on your laptop. They will most likely just wipe the system and install something else; if they can even figure that out.
It depends on how you set it up. I think the default in some cases (like Windows Bitlocker) is to store the key in TPM, so everything becomes transparent to the user at that point, although many disagree with this method for privacy/security reasons.
The other method is to provide a password or keyfile during bootup, which does change something for the end user somewhat.
Exactly.
I’ll admit to being lazy and not enabling encryption on my Windows laptops. But if I deployed something for someone, it would be encrypted.
Full disk encryption doesn’t help with this threat model at all. A rogue program running on the same machine can still access all the files.
It does help greatly in general though, because all of your data will be encrypted when the device is at rest. Theft and B&Es will no longer present a risk to your privacy.
Per-app permissions address this specific threat model directly. Containerized apps, such as those provided by Flatpak, can ensure that apps remain sandboxed and unable to access data without explicit authorization.
I don’t recall Signal ever claiming their desktop app provided encryption at rest.
I’m not sure if they’ve claimed that, but the desktop app does encrypt the database using SQLCipher; the issue is that the SQLCipher key itself is stored in plaintext.
E2EE is not supposed to protect you if the device gets compromised.
Plaintext should never be used in any application that deals with security, ever.
Oh no, tell that to SSH.
It doesn’t use plaintext. It is end-to-end encrypted, but that isn’t what this “issue” is about.
Unless you’re reading ciphertext yourself, this doesn’t make sense.
Indeed, End-to-End Encryption protects data between those ends, not ends themselves. If ends are compromised, no math will help you.
Intrinsically/semantically, no, but the expectation is that the texts are encrypted at rest and the keys are password- and/or TPM+biometric-protected. That’s just how this works at this point. Also, that’s the government standard for literally everything from handheld devices to satellites (yes, actually).
At this point one of the most likely threat vectors is someone just taking your shit. Things like border crossings, rubber stamped search warrants, cops raid your house because your roommate pissed them off, protests, needing to go home from work near a protest, on and on.
If your device is turned on and you are logged in, your data is no longer at rest.
Signal data will be encrypted if your disk is also encrypted.
If your device’s storage is not encrypted, and you don’t have any type of verified boot process, then that’s on you, not Signal.
Signal data will be encrypted if your disk is also encrypted.
True.
and you don’t have any type of verified boot process
How would the motherboard refusing to boot from another drive protect anything?
It’s more about protecting your boot process from malware.
Well, yes. By refusing to boot. It can’t do anything if the motherboard is replaced.
That’s correct. That’s one of the many perks.
EDIT: s/do anything/prevent booting/
That’s not how this works.
If the stored data from Signal is encrypted but the keys are not protected, then that is a security risk, and it can be mitigated using common tools that every operating system provides.
You’re defending Signal from a point of ignorance. This is a textbook risk just waiting for a series of latent failures to allow leaks or access to your “private” messages.
There are many ways attackers can dump files without actually having privileged access to write to or read from memory. However, that’s a moot point as neither you nor I are capable of enumerating all potential attack vectors and risks. So instead of waiting for a known failure to happen because you are personally “confident” in your level of technological omnipotence, we should instead not be so blatantly arrogant and fill the hole waiting to be used.
Also, this is a common enough problem that frameworks provide solutions for it:
https://www.electronjs.org/docs/latest/api/safe-storage
This is such a common problem that it has been abstracted into APIs in most major desktop frameworks. And every major operating system provides a keyring-like service for this purpose.
Because this is a common hole in your security model.
Having Signal fill in gaps for what the OS should be protecting is just going to stretch Signal more than it already does. I would agree that if Signal can properly support that kind of protection on EVERY OS that its built for, go for it. But this should be an OS level protection that can be offered to Signal as an app, not the other way around.
Having Signal fill in gaps for what the OS should be protecting is just going to stretch Signal more than it already does. I would agree that if Signal can properly support that kind of protection on EVERY OS that its built for, go for it. But this should be an OS level protection that can be offered to Signal as an app, not the other way around.
Damn reading literacy has gone downhill these days.
Please reread my post.
But this should be an OS level protection that can be offered to Signal as an app, not the other way around.
- OSs provide keyring features already
- The framework Signal uses (Electron) has a built-in API for this EXACT NEED
Cmon, you can do better than this, this is just embarrassing.
TPM isn’t all that reliable. You will have people upgrading their PC, or a Windows update updating their BIOS, or any number of other things resetting their TPM keys, and currently nothing happens when they do. In effect, people would see Signal completely break and lose all their data, often seemingly for no reason.
Talking to Windows, or through it to the TPM, also seems sketchy.
In the current state of Windows, the sensible choice is to leave hardware-based encryption to the OS in the form of disk encryption, unfortunate as it is. The great number of people who lose data, or have to recover their backup disk-encryption key from their Microsoft account, tells you how easily that system is disturbed (and that Microsoft has the decryption keys for your encrypted data).
Mfw end to end can be compromised at the end.
That said, they should fix this anyway
One could argue that Windows is compromised right out of the box.
Source:
“The computer” decides when to install updates and which ones to install.
Microsoft is integrating adware and spyware straight into the OS.
Source:
Try setting up a fresh Windows 11 system.
I don’t understand how that would prove anything.
A lot of trackers and spyware are already mentioned during setup. And without a bypass you cannot complete setup without a Microsoft account.
Source: 93% of ransomware is Windows-based.
Correlation is not causation.
Causation was never stated nor implied
99% of people in France are French
BUT WHAT OF QUEBEC
Are they in France?
The backlash is extremely idiotic. The only two options are to store it in plaintext or to have the user enter the decryption key every time they open it. They opted for the more user-friendly option, and that is perfectly okay.
If you are worried about an outsider extracting it from your computer, then just use full disk encryption. If you are worried about malware, they can just keylog you when you enter the decryption key anyways.
A better thing to be worried about IMO is that Signal contains proprietary code. Also, to my knowledge nobody is publicly verifying the supposed “reproducible builds”, if they even still exist.
The third option is to use the native secret vault. macOS has its Keychain, Windows has DPAPI, and Linux has non-standardized options available depending on your distro and setup.
Full disk encryption does not help you against data exfil, it only helps if an attacker gains physical access to your drive without your decryption key (e.g. stolen device or attempt to access it without your presence).
Even assuming that your device is compromised by an attacker, using safer storage mechanisms at least gives you time to react to the attack.
Linux has the Secret Service API, which has been a freedesktop.org standard for 15 years.
Secret service API. Damn. That’s how FSB knows what it knows.
The alternative is safeStorage, which uses the operating system’s credential-management facility if available. On macOS, and sometimes on Linux, this means another process running in the user’s account is prevented from accessing it. Windows doesn’t have protection against that, but all three systems do protect the credentials if someone copies the data offline.
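The pattern safeStorage implements can be sketched like this. The `OsVault` class below is a hypothetical stand-in for Keychain/DPAPI/Secret Service (not a real API), with a toy XOR "cipher" standing in for the OS-held key; what matters is the shape: encrypt via the OS facility when it exists, and make any plaintext fallback loud rather than silent:

```python
import base64

class OsVault:
    """Hypothetical stand-in for an OS credential store; real backends
    would be Keychain (macOS), DPAPI (Windows) or Secret Service (Linux)."""
    def __init__(self, available: bool):
        self.available = available
        self._key = 0x5A  # toy "OS-held" key, never written by the app
    def is_encryption_available(self) -> bool:
        return self.available
    def encrypt(self, data: bytes) -> bytes:
        return bytes(b ^ self._key for b in data)
    def decrypt(self, data: bytes) -> bytes:
        return bytes(b ^ self._key for b in data)

def store_secret(vault: OsVault, secret: bytes) -> str:
    if vault.is_encryption_available():
        return "v1:" + base64.b64encode(vault.encrypt(secret)).decode()
    # Fall back, but make the downgrade visible instead of silent.
    print("WARNING: no OS credential store; secret stored in plaintext")
    return "v0:" + base64.b64encode(secret).decode()

def load_secret(vault: OsVault, blob: str) -> bytes:
    version, _, payload = blob.partition(":")
    raw = base64.b64decode(payload)
    return vault.decrypt(raw) if version == "v1" else raw
```

With Electron, `safeStorage.isEncryptionAvailable()`, `encryptString()` and `decryptString()` do this negotiation with the OS for you.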
Signal should change this, but it isn’t a major security flaw. If an attacker can copy your home directory or run arbitrary code on your device, you’re already in big trouble.
Whatever it stores and however it stores it doesn’t matter to me: I moved its storage to my ~/.Private encrypted directory. Same thing for my browser: I don’t use a master password or rely on its encryption, because I set it up so it too saves my profile in the ~/.Private directory.
See here for more information. You can essentially secure any data saved by any app with eCryptfs - at least when you’re logged out.
Linux-only of course. In Windows… well, Windows.
Or ext4 encryption, which is overpowered: you can have different keys for different files and directories.
That applies to pretty much all desktop apps; your browser profile can be copied to get access to all your already-logged-in cookie sessions, for example.
IIRC this is how those Elon Musk crypto livestream hacks worked on YouTube back in the day. I think the bad actors got hold of cached session tokens and gave themselves access to whatever account they were targeting. Linus Tech Tips had a good bit about it in a WAN Show episode.
And there are ways to mitigate this attack (essentially the same as AiTM or pass-the-cookie attacks, so look those up), thus rendering your argument invalid.
Just because “something else might be insecure”, doesn’t in any way imply “everything else should also be insecure as well”.
It is the same concept. Once your machine is pwned you are screwed.
That all hinges on the assumption that your computer is pwned, which is wrong.
You don’t necessarily need privileged access to read files or exfiltrate information.
That point doesn’t matter anyway, though, because you’re completely ignoring the risk here. Please Google “Swiss cheese model”. Your comment is a classic example of non-security thinking… it’s the same comment made 100x in this thread with different words.
Unless you can list out all possible risks and exploits which may affect this issue, then you are not capable of making judgement calls on the risk itself.
You act as though you somehow have more knowledge than everyone else. The problem is that you don’t understand encryption and permissions. You can’t just magically make something unreadable by programs with the same permission level. If you encrypt it, there will need to be a key to decrypt it. That key could conceivably be encrypted with a password, but that would require someone to enter a password. If they don’t enter a password, the key will be stored in plaintext, so anyone could easily decrypt your messages. Programs running as a user have the same permissions as that user. Does that make sense? You can’t just make something selectively unreadable with the current security model. I guess you could try to implement some sort of privileged daemon, but that would open up more issues than it solved.
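The "same permission level" point is easy to demonstrate: a key file locked down to 0600 is still readable by every other process running as that user (a sketch; `app_key` is a made-up filename):

```python
import os
import subprocess
import sys

# Write a "key" the way an app might: readable only by the owning user.
with open("app_key", "w") as f:
    f.write("hunter2")
os.chmod("app_key", 0o600)

# A completely unrelated process running as the same user reads it fine:
out = subprocess.run(
    [sys.executable, "-c", "print(open('app_key').read())"],
    capture_output=True, text=True,
)
print(out.stdout.strip())  # -> hunter2
os.remove("app_key")
```

File permissions separate *users* from each other; they do nothing to separate two programs running as the same user, which is exactly the gap OS keyrings and app sandboxes exist to fill.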
I would have a problem if Signal claimed that the desktop messages were encrypted at rest. However, they don’t make any such claim. If you are concerned about security I would recommend running everything in virtual machines and flatpaks. This way the chances of something misbehaving in a way that causes harm is minimized.
I’m not claiming some grand level of knowledge here. I also cannot enumerate all risks. The difference is that I know that I don’t know, and I know the danger that poses in terms of cognitive biases, false confidence, and a lack of effective risk management. I’m a professional in an adjacent field, midway into pivoting into cybersecurity. I used to think the same way; that’s why I’m so passionate here. It’s painful to see arguments and thought processes counter to the fundamentals of security and safety that I’ve been learning the past few years. So, yeah, I’m gonna call it out and try to inform.
All that crap said:
And you are right, the problem gets moved. However, that’s the point; that’s how standardization works, and how it’s supposed to work. It’s a force multiplier; it smooths out the implementation. Moving the problem to the OS level means that EVERYONE benefits from advances in Windows/macOS/Linux. Automatically.
It’s not Signal’s responsibility, and it shouldn’t be, unless that’s a problem they specifically aim to solve. They have the tools available to them already: Electron has a standardized API for this, safeStorage, which handles the OS interop for them. I’m not arguing that Signal needs to roll their own here. The expectation is that they at least utilize the OS-provided features made available to their software.