Any piece of software as complex as a browser, which constantly processes unsanitized information from the internet, will eventually be compromised. But how often compromises happen and how much damage they cause can both be mitigated through careful design. Decreasing the damage that a compromise can inflict is discussed on the Core Privacy Principles page, under the heading about minimizing the amount of information stored on the device. The purpose of this post is to discuss one of the methods of minimizing how often compromises happen.
An app’s attack surface comprises all the parts of the code that process information from an outside source. A flaw in any one of those pieces of code can allow the app to be compromised. Privacy Browser is composed of code from a number of sources:
- Code that I have written or reviewed myself. This code is hosted at gitweb.stoutner.com and is released under the GPLv3+ license.
- Google’s AndroidX and Material Maven repositories, which are released under the Apache License 2.0. These libraries backport features from newer versions of Android to older ones. The specific libraries in use can be seen in the build.gradle file. The code from these libraries is added to Privacy Browser when it is compiled.
- The Kotlin standard library, which is also released under the Apache License 2.0. This library enables programming in the Kotlin language on Android.
In addition to the above code that is compiled into the APK, Privacy Browser also uses a number of standard Android OS system libraries. In the context of Privacy Browser, WebView is probably the most important. But there are a large number of system libraries required for an app to function correctly, including Menu, TextView, Bundle, Activity, and many others. The import section of each Java file shows which system components are used by code in that class. The actual source code of these system components varies based on the version of Android that is installed on your device, which is one of the reasons why it is important to use a system with up-to-date security patches.
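As an illustration, the import section of an activity class might look something like the following. This is a hypothetical excerpt (the actual imports vary from class to class), but each of these is a real Android OS system component of the kind described above:

```java
// Hypothetical import section of an activity class, showing which
// Android OS system components the class depends on.
import android.app.Activity;     // the base activity lifecycle class
import android.os.Bundle;        // passes saved state between lifecycle calls
import android.view.Menu;        // builds the options menu
import android.webkit.WebView;   // renders web content
import android.widget.TextView;  // displays text in the UI
```

Because these classes ship with the OS rather than being compiled into the APK, their actual implementation is whatever version of Android is installed on the device.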
Each additional piece of code that is included with Privacy Browser increases the attack surface. I do not personally have the time to read through all of Android’s source code to verify that it does what it claims to do and that it doesn’t contain any vulnerabilities. However, because I only use open source components that are commonly used by many projects, there are likely a number of people who have taken the time to do so. And at a fundamental level, a program must trust the operating system it runs under.
From time to time it is tempting to include a library from another source that implements a useful feature. For example, when adding OpenKeychain support for exporting and importing settings, I noticed that there were two ways to do so. One involved compiling a library of OpenKeychain’s source code into Privacy Browser. This would allow OpenPGP cryptographic operations without requiring the user to have OpenKeychain installed. The other was to communicate with the OpenKeychain app using Android’s built-in intent mechanism. This method, of course, requires that the user have OpenKeychain installed. It also requires additional user interaction to accomplish each function.
Now, I have nothing but the utmost respect for the team that builds OpenKeychain. I have no reason to suspect that their code is compromised in any way. But I am also not able to verify myself that their code is not compromised, as I do not have the necessary background to audit cryptographic source code. And even if I were able to certify a particular version of their library, I would not have the time to re-audit each new release. This is not just an academic concern; malware infections of common libraries, either through the addition of malicious code by a contributor or via a compromise of the platform where they are hosted, have happened many times in the past.
Even if there is no intentionally malicious code in a library, from time to time there will almost inevitably be security bugs. If the code is compiled into Privacy Browser, an attacker who exploits such a bug can run any command and access any data in Privacy Browser’s context. But if the code runs in a separate context (like inside the OpenKeychain app), the attacker can only access that app’s data. Thus, keeping the code in a separate app leverages Android’s built-in sandboxing to decrease the negative impact of any successful exploit.