mirror of
https://github.com/kusti8/proton-native.git
synced 2026-05-15 14:15:50 -06:00
[GH-ISSUE #188] Verify code integrity #123
Originally created by @kontrollanten on GitHub (Dec 2, 2018).
Original GitHub issue: https://github.com/kusti8/proton-native/issues/188
Really interesting project! I'm thinking about the security issues of using JS in desktop applications. A hacker who gains access to a computer with a Proton Native (or Electron) app installed can manipulate the source code without the user's knowledge. NW.js has solved this with snapshots, and Electron has chosen not to solve it (for now).
Would it be a solution to generate a file content hash during the build process? This hash would then be published together with the release assets (like Electron publishes app-update.yml), and each time the Proton Native application starts (with an internet connection) it would check the hash to validate that the source code hasn't been manipulated. The hash verification would be done in a binary file.
It's not bulletproof, but it would at least add an extra security layer and could be a good starting point. To make it even more secure, it might be possible to isolate the internet connection within the Proton Native application until the hash has been validated.
As I understand it, the application doesn't build any binaries today. But I think it'd be pretty simple to write a binary that acts as a wrapper to handle this: the binary verifies the hashes and then starts the application.
@manashathparia commented on GitHub (Dec 3, 2018):
asar would also be OK, since it is a read-only archive: no one can modify it without unpacking. Sadly, Proton Native has no support for it yet, but it could be ported from Electron!
@kontrollanten commented on GitHub (Dec 3, 2018):
What do you mean by "read-only"? It's possible to just edit the content of the asar file and the application's behaviour will change. See this example: https://www.howtogeek.com/368976/how-to-install-the-unofficial-dark-mode-for-slack/
@manashathparia commented on GitHub (Dec 3, 2018):
See the path they mentioned, "resources/app.asar.unpacked/src/static/": it's not "app.asar", it's "app.asar.unpacked", which means it is unpacked. And by read-only I mean you cannot change files without unpacking the asar file.
@mischnic commented on GitHub (Dec 3, 2018):
First of all, asar won't help at all: you can just unpack it, modify some files, and repack it.
As for the first suggestion, adding a wrapper could be done without any changes to Proton itself. If it's native code, adding it to Proton would make building significantly more complex.
At least on macOS, it's possible to sign an application regardless of its architecture; the same goes for AppImage (though a costly certificate is required). That doesn't seem possible on Windows.
It's really hard to make your application secure: a potential attacker could just replace the wrapper that verifies the hashes. Or they could download malware and run it, with no need to infect a Proton app directly.
Just don't start the app itself until the hashes have been verified.
I have a feeling there's an issue regarding this topic already, but I can't find it now.
@manashathparia commented on GitHub (Dec 3, 2018):
Yeah, that's right, but it still involves a few more steps than just editing the JS file.
@RobertZenz commented on GitHub (Dec 3, 2018):
At that point you've already lost, though. Nothing hinders the cracker from swapping out the executable completely, or changing the PATH to override system defaults, etc. Checking a hash online would also mean that there is a server that is always aware of whenever the user starts an application. And, as said, an attacker could at that point simply swap out the whole application.
Having signed applications would be interesting, similar to how jars are signed. But as said, the moment the attacker can change the application archive, they can also overwrite that file. At that point you've already lost.
@kontrollanten commented on GitHub (Dec 3, 2018):
If the process is just downloading a txt file from GitHub (or a similar provider), then I can't see that it's a problem?
Sure, but then the hacker needs to rewrite the whole application to trick the user into thinking it's the real application, right? If it's possible to add a security layer in a binary file (which can be code-signed?), then it's at least a bit harder?
I'm not a security expert, though, so I don't know where it's worth putting the effort. It just feels unsafe that it's so easy to manipulate the application code.
@kusti8 commented on GitHub (Dec 3, 2018):
Downloading a hash from online introduces a lot more issues, the biggest being that the app always needs to be online, similar to DRM.
If an attacker already has administrative access and can change the files, then they can do anything they want. The same is true for other non-compiled languages, such as Python. I'm pretty sure electron-builder allows you to sign your applications if you want that.
@mischnic commented on GitHub (Dec 3, 2018):
The only real solution that is more or less secure is OS-level code signing. On macOS, this will cost you $100/year; for Windows, even more (at least $150). electron-builder supports that on macOS and Windows.
@kontrollanten commented on GitHub (Dec 3, 2018):
An idea that came to me: the hash validation should be possible to do from the compiled file, so no network connection is needed?
Yes, but it's harder.
Code signing won't help if you change the content of the asar files? Isn't it just the installation file that gets signed (at least on macOS)?
I've changed my Slack application in seconds without repackaging, and I can run it as normal. I may be missing something? If we could have a process with code-signed binaries that validate the hashes of the non-compiled files, wouldn't that make it a lot harder for an attacker? Or do you mean it isn't worth the effort?
Yes, it costs money to buy code signing certificates, but then at least the developers/users of Proton Native have an option to make it safer. I'm pretty sure there are a lot of people who think it's worth it.
@RobertZenz commented on GitHub (Dec 3, 2018):
If the machine is under control of an attacker, can you still trust network traffic?
No, not really. Replacing parts of the file or the whole file doesn't matter at that point anymore. The best you can do is something similar to jar signing: the whole jar is signed (with all files in it), and if one file changes, Java refuses to execute the jar. That still leaves a lot of possibilities, for example removing the signing from the jar entirely (which will make it start as an unsigned jar) or signing it yourself (which will make it seem signed, but under a different name, so not "Awesome Company Ltd." but "Avvesome Company Ltd.").
If the cracker has write access to the machine, you've lost. Full stop. You can't stop the attacker at that point anymore. You can only make sure that the user has the ability to detect the attack, for example by the jar signing mechanism above. But nothing hinders the attacker from spoofing any of that.
Even if you have the full application signed, with a starter that checks that signature, there still isn't a guarantee that the file wasn't modified earlier (as stated above). Or, worse, the starter itself may have been modified.
So I say it again: if the attacker has gained write access to the machine, it's over. Just over. At that point you can't even trust things you see on the screen anymore. What you can do is make sure that the application arrives at the user's machine without having been tampered with.