This guidance contains recommendations for the secure development, procurement and deployment of Android applications. Please familiarise yourself with the generic application development guidance before continuing.
Contents
1. Secure Android application development
2. Questions for application developers
3. Secure deployment of Android applications
4. Application wrappers
1. Secure Android application development
1.1 Datastore hardening
Android, by default, provides each application on a device with access to a private directory to store its files. This protection is implemented using Linux user and group permissions. The security model is further enforced by applying Security-Enhanced Linux mandatory access control policies and leveraging a seccomp system call filter.
As of version 7.0, Android provides two storage locations on devices with file-based encryption (FBE) and Direct Boot: Device Encrypted (DE) storage, which is available during Direct Boot before the user unlocks the device, and Credential Encrypted (CE) storage, which only becomes available after unlock. On FBE-enabled devices, developers should only store sensitive data in Credential Encrypted (CE) storage.
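For illustration, the Kotlin sketch below shows how an application on an FBE device might keep genuinely boot-time data in the device-protected (DE) context while leaving anything sensitive in the default (CE) context. The file names are hypothetical.

```kotlin
import android.content.Context

// Sketch: choosing between Credential Encrypted (CE) and Device Encrypted (DE)
// storage on a device with file-based encryption. Sensitive data stays in the
// default (CE) context; only data genuinely needed before first unlock goes in
// the device-protected (DE) context.
fun writeAppData(context: Context, sensitive: ByteArray, bootConfig: ByteArray) {
    // Default context -> Credential Encrypted storage, available once the user
    // has unlocked the device.
    context.openFileOutput("sensitive.bin", Context.MODE_PRIVATE).use {
        it.write(sensitive)
    }

    // Device-protected context -> Device Encrypted storage, available during
    // Direct Boot. Never place sensitive data here.
    val deContext = context.createDeviceProtectedStorageContext()
    deContext.openFileOutput("boot_config.bin", Context.MODE_PRIVATE).use {
        it.write(bootConfig)
    }
}
```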
Applications are able to access other areas of the device, such as contacts and SMS, by requesting permission from the user at runtime. The user can choose to permit the application access to areas such as the device’s calendar and phonebook, as well as features such as making phone calls or reading the current location. Once permitted, the application may use these features without further interaction from the user.
Despite protection offered by Android’s sandboxing, it remains the responsibility of the application to store its data securely and to not undermine any protections that are in place by (for instance):
- writing data to publicly readable locations such as the external storage
- handling intents that can be sent by any other application on the same device
- creating files with world readable/writable permissions
Remember that a process running on the device with sufficient permissions will always be able to read and write any data in any application’s sandbox. We strongly recommend that applications holding sensitive data build upon the sandbox with more secure functionality, for example by leveraging the hardware-backed KeyStore.
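As an illustration of building on the sandbox with the hardware-backed KeyStore, the following is a minimal Kotlin sketch that generates an AES key whose material never leaves the KeyStore. The alias and parameters are illustrative rather than prescriptive.

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import javax.crypto.KeyGenerator
import javax.crypto.SecretKey

// Sketch: generate an AES-256 key inside the Android KeyStore. The key material
// is not extractable by the application, so a compromise of the app sandbox does
// not directly expose it.
fun generateAppKey(alias: String = "app_data_key"): SecretKey {
    val keyGenerator = KeyGenerator.getInstance(
        KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore"
    )
    val spec = KeyGenParameterSpec.Builder(
        alias,
        KeyProperties.PURPOSE_ENCRYPT or KeyProperties.PURPOSE_DECRYPT
    )
        .setBlockModes(KeyProperties.BLOCK_MODE_GCM)
        .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_NONE)
        .setKeySize(256)
        .setUserAuthenticationRequired(false) // consider requiring user authentication for sensitive keys
        .build()
    keyGenerator.init(spec)
    return keyGenerator.generateKey()
}
```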
Ultimately, it is not possible to guarantee the security of data on a device. You should assume that if a user continues to use a device after it has been compromised, malware will be able to access the data. Android provides an API called SafetyNet for assessing the health and safety of the device. This API examines both hardware and software information about the device to help determine whether it has been tampered with or otherwise modified. We recommend that application developers use the API and send the signed SafetyNet API results to their own servers for validation, rather than validating them on the device. SafetyNet should be used as a means to gain confidence in the integrity of the device, but it is not guaranteed to detect a compromise.
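A minimal Kotlin sketch of requesting a SafetyNet attestation and forwarding the signed result for server-side validation is shown below. The API key, nonce handling and sendToServer callback are placeholders; in practice the nonce should be issued by (or bound to a request to) your own server.

```kotlin
import android.content.Context
import com.google.android.gms.safetynet.SafetyNet

private const val API_KEY = "YOUR_SAFETYNET_API_KEY" // placeholder

// Sketch: request a SafetyNet attestation and hand the signed JWS result to the
// app's own backend for validation. Validate the signature and contents (nonce,
// package name, integrity verdicts) on the server, never on the device.
fun requestAttestation(context: Context, nonce: ByteArray, sendToServer: (String) -> Unit) {
    SafetyNet.getClient(context)
        .attest(nonce, API_KEY)
        .addOnSuccessListener { response ->
            sendToServer(response.jwsResult)
        }
        .addOnFailureListener {
            // Treat failure as "device health unknown", not as proof of compromise.
        }
}
```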
1.2 Network protection
The diagram below, taken from the EUD Security Guidance for Android, illustrates the recommended network configuration for Android devices which handle sensitive information. In summary, a VPN is used to bring device traffic back to the enterprise. Access to internal services is brokered through a reverse proxy server, which protects the internal network from attack.
To prevent unauthorised applications from accessing sensitive internal resources, it is important that the reverse proxy server authenticates any requests from devices. This means that applications on the device which are trusted to access sensitive data must provide authentication with each request so that the reverse proxy can validate it. Stored credentials must be accessible only to the trusted applications that use those resources.
Internet requests from the application should be routed via the standard corporate internet gateway, to permit traffic inspection.
1.3 Secure application development
The following section contains recommendations that an Android application should conform to in order to store, handle or process sensitive data. Many of these recommendations are general good-practice behaviours for applications on Android. A number of documented code snippets and examples are available on the Android developer portal.
Secure data storage
In order to store sensitive data in a secure manner, Android applications should conform to the following:
- Applications should minimise the amount of data stored on the device. When needed, data should be retrieved from the server over a secure connection, and erased when it is no longer required.
- Sensitive information, if required, should only be stored in the hardware-backed KeyStore.
- The device’s external storage (for example the SD card) should not be used by the application to store sensitive data.
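Building on the KeyStore key sketched earlier, the hedged Kotlin example below shows one way a small amount of data could be encrypted with AES-GCM before being written to the application's private storage. The helper names are illustrative; the IV must be persisted alongside the ciphertext.

```kotlin
import java.security.KeyStore
import javax.crypto.Cipher
import javax.crypto.SecretKey
import javax.crypto.spec.GCMParameterSpec

// Sketch: encrypt and decrypt a small payload with a key held in the Android
// KeyStore. The IV produced by the cipher must be stored with the ciphertext.
fun encrypt(alias: String, plaintext: ByteArray): Pair<ByteArray, ByteArray> {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.ENCRYPT_MODE, loadKey(alias))
    return cipher.iv to cipher.doFinal(plaintext)
}

fun decrypt(alias: String, iv: ByteArray, ciphertext: ByteArray): ByteArray {
    val cipher = Cipher.getInstance("AES/GCM/NoPadding")
    cipher.init(Cipher.DECRYPT_MODE, loadKey(alias), GCMParameterSpec(128, iv))
    return cipher.doFinal(ciphertext)
}

private fun loadKey(alias: String): SecretKey {
    val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
    return keyStore.getKey(alias, null) as SecretKey
}
```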
Server-side controls
Applications which store credentials should have robust server-side control procedures in place to revoke the credential, should the device or data be compromised.
Secure data transmission
In order to transmit sensitive data securely, Android applications should conform to the following:
- All off-device communications handling sensitive data should take place over a mutually-authenticated, cryptographically protected, connection.
- For TLS connections, the application should perform certificate pinning to a known endpoint. This process should leverage the Network Security Configuration. For more information refer to the NCSC’s TLS documentation.
- Certificates used by the application should be stored on the device using the Android KeyStore provider.
Note that at present there is no API on Android to check the status of the VPN. To securely check the status of the VPN, the internal service with which the application is communicating must be authenticated. The recommended way of performing this authentication is TLS with a pinned certificate. If mutual authentication is required to the internal service, mutual TLS with pinned certificates should be used.
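Where the declarative Network Security Configuration cannot be used, for example because connections are made through a client library that manages its own trust decisions, pinning can also be expressed in code. The Kotlin sketch below uses OkHttp's CertificatePinner purely as an illustration; the hostname and pin value are placeholders to be replaced with the real endpoint and certificate hash.

```kotlin
import okhttp3.CertificatePinner
import okhttp3.OkHttpClient

// Sketch: pin the expected public key hash for an internal endpoint so that TLS
// connections fail if an unexpected certificate chain is presented.
// "service.internal.example" and the sha256 value are placeholders.
val pinnedClient: OkHttpClient = OkHttpClient.Builder()
    .certificatePinner(
        CertificatePinner.Builder()
            .add("service.internal.example", "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
            .build()
    )
    .build()
```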
Application security
To hinder the exploitation of any potential memory corruption vulnerabilities and protect against reverse engineering, the following recommendations should be followed:
- The application should be compiled using the latest supported compiler security flags.
- The application should not be compiled with the debug flag enabled.
- The application should not use any private APIs.
- The application should be compiled in release mode with all debug information stripped from the binaries.
- The application should be compiled using obfuscation tools to make analysis harder.
- If Android Studio is used, it should be configured to shrink, optimise, and obfuscate Java code.
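For example, in a module-level build.gradle.kts (assuming a recent Android Gradle Plugin), the release build type can enable R8 shrinking, optimisation and obfuscation along the following lines; this is a sketch of the relevant fragment, not a complete build script.

```kotlin
// build.gradle.kts (module) - sketch of a release build type with R8 shrinking,
// optimisation and obfuscation enabled, and debugging disabled.
android {
    buildTypes {
        getByName("release") {
            isMinifyEnabled = true       // shrink and obfuscate with R8
            isShrinkResources = true     // remove unused resources
            isDebuggable = false         // never ship a debuggable release
            proguardFiles(
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }
}
```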
Security recommendations
The behaviours listed below can increase the overall security of an application.
- Any data that is deemed necessary to store on the device should be encrypted either with keys that are not stored on the device, or that are stored in the Android KeyStore. Furthermore, key attestation should be used when hardware-backed key storage is available.
- Where possible, applications should sanitise in-memory buffers of sensitive data after use (if the data is no longer required for operation).
- Applications that require authentication on launch should also request this authentication credential when returning to the foreground after being backgrounded by the user, allowing for a small grace period.
- As the standard Android clipboard is shared between all applications on the device, do not use it when handling sensitive data. A private clipboard can be implemented if required by the application.
- The application should disable both manual and automatic screenshots within activities that display sensitive data, by setting the secure flag on the relevant window (see the sketch after this list).
- Applications that use a shared UID will share the same sandbox. This means that if one application was compromised, all data in any application with a shared UID would also be compromised. Developers should share functionality between applications using intents, restricted by permissions.
- Intents created for IPC between trusted applications should use signature permissions to restrict access by other applications on the device.
- Applications that use Web Views should limit the features and capabilities to the minimum functionality required.
- JavaScript and local file access should be disabled unless specifically required.
- Caching should be disabled to prevent unnecessary exposure of sensitive data.
- The application should ensure that debugging output has been removed and sensitive information prevented from appearing within the device log files.
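As referenced in the screenshot recommendation above, the secure window flag can be set before any sensitive content is drawn. A minimal Kotlin sketch:

```kotlin
import android.app.Activity
import android.os.Bundle
import android.view.WindowManager

// Sketch: mark the window as secure so its content is excluded from screenshots,
// screen recording and the recent-apps task switcher preview.
class SensitiveActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        window.setFlags(
            WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE
        )
        // setContentView(...) as normal, after the flag is set
    }
}
```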
2. Questions for application developers
For anyone procuring an application built by a third party, you can ask developers the example questions below. Their answers will help you gain more (or less) confidence about the security of their products.
The most thorough way to assess an application before deploying it would be to conduct a full source code review to ensure it meets the security recommendations and contains no malicious or unwanted functionality. Unfortunately, for the majority of third party applications, this will be infeasible or impossible. However, the responses from the third party should help provide confidence that the application is well written and likely to protect information properly.
2.1 Secure data storage
The following questions will help you establish how confident you can be that an Android application stores sensitive data securely.
| Questions | What to look for in answers |
| --- | --- |
| What is the flow of data through the application – source(s), storage, processing, and transmission? | Answers should cover all forms of input including data entered by the user, network requests, and inter-process communication. |
| How is sensitive data stored on the device? | Data should be stored in a location that cannot be accessed by other applications on the device. Data should not be accessible to other applications on the device through inter-process communication provided by the application, e.g. content providers. Data should be encrypted when stored on the device. Encryption of sensitive data should be performed using a key that is not stored on the device; the key should either be derived from user input or returned from a server following authentication. |
| What device or user credentials are being stored? Are these stored in the Android KeyStore? What key is used? | If certificates are stored on the device, they should be stored using the Android KeyStore. |
| Are cloud services used by the app? What data is stored there? How is it protected in transit? | Sensitive data should not be transmitted to unassured cloud services. If non-sensitive data is transmitted, or data is transmitted to accredited data centres, then questions should be asked about data-in-transit protection. |
2.2 Secure data transmission
The following questions will help you gain confidence in how Android applications transmit sensitive data securely:
| Questions | What to look for in answers |
| --- | --- |
| Is transmitted and received data protected by cryptographic means, using well-defined protocols? If not, why not? | Mutually authenticated TLS, Secure Chorus, or another mutually authenticated secure transport should be used to protect information as it travels between the device and other resources. This should be answered specifically for each service the application communicates with to send and receive sensitive information. |
2.3 IPC mechanisms
The following questions will help you gain confidence in how Android applications share sensitive data securely.
| Questions | What to look for in answers |
| --- | --- |
| Are any URL schemes or exported intents declared or handled? What actions can these invoke? | Answers should cover any instances of use and suitably explain how and why they are used, rather than a ‘yes’ or ‘no’ reply. Any inability to explain the rationale should be taken as a concern about the application. |
| Are there any exported content or file providers? Is any sensitive data accessible? Are there any bespoke implementations for querying, updating, or deleting data in the provider? What custom actions are performed? Are prepared statements used for querying the provider? | Answers should cover any instances of use and suitably explain how and why they are used, rather than a ‘yes’ or ‘no’ reply. Any inability to explain the rationale should be taken as a concern about the application. |
| Is input from statically or dynamically registered broadcast receivers treated as untrusted? | Answers should cover any instances of use and suitably explain how and why they are used, rather than a ‘yes’ or ‘no’ reply. Any inability to explain the rationale should be taken as a concern about the application. |
| Can other applications cause the application to perform a malicious action on its behalf, or request access to sensitive data? | Interactions with other applications should be limited to only those essential for the application to function. |
2.4 Binary protection
The following questions will help you gain confidence in how Android applications protect their data within a binary.
| Questions | What to look for in answers |
| --- | --- |
| Is the application compiled in release mode, and has all debug information been removed from the binary? | All debugging information should have been removed from the application. |
| Does the application make use of obfuscated code or other protections against reverse engineering? | Code obfuscation should be used to help hinder reverse engineering of applications. |
| Does the application make use of binary anti-tamper or anti-hooking protections? | These protections improve the security of the application; if they are not used, reasons should be given to suitably explain why. |
| Is the application compiled to be debuggable? | The debuggable flag in the manifest file should be set to false. |
2.5 Server side controls
The following questions will help you gain confidence in how Android applications protect their data on the server side.
| Questions | What to look for in answers |
| --- | --- |
| If the application connects to remote services to access sensitive data, how can that access be revoked? How long does that revocation take? What is the window of opportunity for theft? | Assessors should be aware of the length of the window of opportunity for data theft. |
2.6 Client side controls
The following questions will help you gain confidence in how Android applications protect their data on the client side.
| Questions | What to look for in answers |
| --- | --- |
| Does the application make use of Web Views? Is JavaScript enabled within the Web View? | Web Views support JavaScript. Where it is not needed, we recommend that it is explicitly disabled to protect against malicious JavaScript being injected into the Web View. |
| Are there any JavaScript bridges implemented within the application? | JavaScript bridges may significantly increase the overall attack surface of the application if implemented without care. |
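Where a Web View is unavoidable, its capabilities can be restricted in code. The Kotlin sketch below is illustrative; only enable features the content genuinely requires.

```kotlin
import android.webkit.WebView

// Sketch: restrict a WebView to the minimum capabilities required.
fun hardenWebView(webView: WebView) {
    webView.settings.apply {
        javaScriptEnabled = false      // only enable if the content genuinely requires it
        allowFileAccess = false        // block access to local files
        allowContentAccess = false     // block access via content:// URLs
    }
}
```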
2.7 Other
The following set of additional questions will help you gain confidence in how Android applications protect themselves.
| Questions | What to look for in answers |
| --- | --- |
| Does the application implement root detection, and if so how? | Root detection can never be completely protected against, but the more advanced the detection, the more effort is required to bypass it. Google’s SafetyNet Attestation API can be used to help detect rooting. |
| Are applications allowed to run in an Android Emulator when in production build? | Applications that can run in the emulator are easier to reverse engineer. The sandboxed directory of the application can be inspected and manipulated, allowing for a greater understanding of the security state of the application. Additionally, the emulator may contain additional testing functionality or have more verbose logging. |
| How is session timeout managed? | The application should include a timeout following inactivity by the user. |
| How is copy and paste managed? | Whichever solution is used to manage copy and paste, the appropriate risk owner should understand and accept how data could leak. |
| Is potentially sensitive data displayed within the screenshot when the application is backgrounded? Are there configuration options which may cause the security of the solution to be weakened or disabled? | Backgrounding should be handled so that sensitive data is not leaked in the screenshots taken of apps for task switching. |
| What configuration options are available to end users, and what is the impact on the solution’s security if the user were to change those settings? | Configuration options that users are able to change should not prevent the application from working correctly (or affect the application’s security features). |
| Are there debug messages logged to the console output? | No debugging information should be logged to the console. |
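To keep debugging output out of release builds, logging can be gated on the build type. A minimal Kotlin sketch, assuming the module's generated BuildConfig class; the AppLog wrapper and package name are hypothetical.

```kotlin
import android.util.Log
import com.example.app.BuildConfig // hypothetical application package

// Sketch: only emit debug logs in debug builds, so release builds write nothing
// sensitive to logcat. The message lambda is never evaluated in release builds.
object AppLog {
    fun d(tag: String, message: () -> String) {
        if (BuildConfig.DEBUG) Log.d(tag, message())
    }
}
```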
3. Secure deployment of Android applications
This section recommends how to securely deploy an application, whether it is produced by a third party organisation or developed in-house.
3.1 Third party app store applications
Android supports a number of methods to install new third party applications. The following section divides these into two categories, trusted and untrusted:
Untrusted third party applications
Untrusted applications are those that have been produced by developers that your organisation does not have a relationship with. This includes applications hosted by both Google Play and on third party application stores. In these instances you should assume that the third party application may have unwanted functionality. While this functionality may not necessarily be malicious, these applications should be viewed as potential sources of leakage for sensitive data. You should evaluate whether or not an application should be permitted to run on the device.
Network architecture components such as the reverse proxy can be used to help restrict third party applications from accessing corporate infrastructure. However, these features should be regarded as techniques that help mitigate the potential threat posed by the installation of third party applications; they cannot guarantee complete protection.
The ideal method of mitigation is to not allow any third party applications to be installed on the device, though in reality this must be taken on a ‘per application’ basis. Where possible, developers of the application should be consulted in order to understand better the limitations and restrictions of the application. To help your evaluation, you can use the questions given above (feel free to ask more, these represent the minimum you should find out).
Trusted third party applications
You should learn as much as possible about the security posture of an application, so that the risks of deploying it can be understood and managed wherever possible. Your organisation should, ideally, establish a relationship with developers and work with them to understand how their product does (or does not) meet the security posture expected of it.
You should assess third party applications to decide whether the risk of having their code executing on your devices is outweighed by the benefits that the application brings to your organisation. If third party applications are to be permitted on devices with sensitive data, then the following steps should be taken:
- Ensure that the applications holding sensitive data do not permit third party application access to the data, for instance making sure that the third party application is not included as one that the user can choose to open sensitive documents with.
- Ensure that sensitive data would remain secure if the third party application were compromised. For instance, the data should not be accessible due to it being stored in a world-readable location on the device.
Where a third party application is being considered to manage sensitive data, you may also wish to consider commissioning independent assurance. This is particularly true if the application implements its own protection technologies (such as a VPN or data-at-rest encryption), and does not use the native protections provided by Android. Many enterprise applications feature server-side components and, where present, these should be considered as part of the wider risk assessment.
Private enterprise application catalogues can be created and managed using MDM solutions, allowing organisations to build a set of accepted third party and in-house applications that can either be installed on to every organisational device, or made available for employees to browse and choose to install manually.
Security considerations
When deploying third party applications, the primary concern for an organisation is determining whether these applications could affect the security of the enterprise network, or access data held in a sensitive datastore.
Malware and application-level vulnerabilities are of particular concern when developing secure applications for Android. If third party applications are permitted on the same device, secure applications must therefore take particular care to protect data both in storage on the device and in transit.
You should also consider the security features of the devices that will host your application. A number of manufacturers offer custom security features to protect corporate data from other applications. If the application will only be used on these devices, then permitting third party applications on the same device may be deemed acceptable.
Security requirements
Best practice when using third party applications is as follows:
- Server-side components such as a reverse proxy should be used to restrict enterprise network access to trusted applications.
- The developers should be contacted in order to better understand the security posture of the application. Use the Questions for Application Developers section as your starting point.
- Data should be protected from third party applications by restricting their access to sensitive data and functionality.
3.2 In-house applications
In-house applications are those designed and commissioned by an organisation to fulfil a particular business requirement. The organisation can stipulate the functional and security requirements of the application, and enforce these contractually if the development work is subcontracted.
The intention when securing these applications is to minimise the opportunity for data leakage from these applications and to harden them against physical and network-level attacks. For the purposes of this document, these applications are assumed to access, store, and process sensitive data.
Security considerations
Regardless of whether the application is developed by an internal development team, or under contract by an external developer, you should ensure that supplied binaries match the version which you were expecting to receive. Applications should then be installed onto managed devices through an MDM server or in-house enterprise application catalogue front-end, to gain the benefits of an application being enterprise managed.
Security requirements
Both in-house and third party applications should be deployed directly to devices through an in-house enterprise application catalogue. This means they can be remotely managed, and kept separate from third party applications installed by the user.
4. Application wrappers
This section covers the different types of application wrappers, giving descriptions and the security considerations of each.
4.1 Security considerations
A variety of ‘application wrapping’ technologies exist on the market today. Whilst these technologies ostensibly come in a variety of forms which provide different end-user benefits, on most platforms (including Android) they essentially work in one of three ways.
Category 1: These provide a remote view of an enterprise service, for example a Remote Desktop view of a set of internal applications that are running at a remote location, or an HTML-based web application. Multiple applications may appear to be contained within a single application container, or may live separately in multiple containers to simulate the appearance of multiple native applications. Usually, only temporary cached data and/or a credential is persistent on the device itself.
Category 2: These are added to an application binary after compilation and dynamically modify the behaviour of the running application (for example to run the application within another sandbox and intercept and modify platform API calls) in an attempt to enforce data protection.
Category 3: The source code to the surrogate application is modified to incorporate a Software Development Kit (SDK) provided by the technology vendor. This SDK modifies the behaviour of standard API calls to instead call the SDK’s API. The developer of the surrogate application will normally need to be involved in the wrapping process.
4.2 Security requirements
Category 1 technologies are essentially normal platform applications, but which store and process minimal information, deferring processing and storage to a central location. The development requirements for these applications are identical to other native platform applications. Developers should follow the guidelines given above.
Category 2 and category 3 wrapping technologies are frequently used to provide enterprise management to applications via the MDM server that the device is managed by. SDKs are integrated into these MDM solutions and can be used to configure settings in the application or to modify its behaviour. For example, the application could be modified to always encrypt all data or not use certain API calls.
On Android, both category 2 and category 3 wrapping technologies require the surrogate developer’s co-operation to wrap the application into a signed package for deployment onto an Android device. As such, normally only custom developed in-house applications, and sometimes trusted third party applications (with co-operation) can use these technologies. As the robustness of these wrapping technologies cannot be asserted in the general case, they should not be used with an untrusted application; they should only be used to modify the behaviour of trusted applications, or for ease of management of the wrapped applications.
In-house applications should be developed specifically against the previously described security recommendations wherever possible. App-wrapping technologies should only be used as a less favourable alternative means of meeting those recommendations, where meeting them natively is not possible.
Ultimately, it is more challenging to gain confidence in an application whose behaviour has been modified by a category 2 technology. It is difficult to assert that dynamic application wrapping can cover all the possible ways an application may attempt to store, access and modify data. It is also difficult to make any general assertions about how any given wrapped application will behave. As such, the NCSC cannot give any assurances about category 2 technologies or wrapped applications in general, and hence cannot recommend their use as a security barrier at this time.
However, category 3 technologies are essentially an SDK or library which developers use as they would any other library or SDK. In the same way that the NCSC does not assure any standalone cryptographic libraries, we do not provide assurance in SDKs which wrap applications. The developer using the SDK should be confident of its functionality, as they would be with any other library.
Source: NCSC