|Nov / 10 / 2017|
Recently, we have been working on some advanced Android malware, and to test a few ideas we opted to create custom samples. Our approach was motivated by the fact that many 3rd party app stores offer applications that have been infected with malware (https://www.welivesecurity.com/2017/07/25/malware-found-lurking-behind-every-app-alternative-android-store/). Many of the offerings on 3rd party app stores are paid apps from Google’s Play Store, which the 3rd party stores promise for free – and, as it turns out, many of these free copies also bring malware along. We decided to model such a malicious 3rd party app store and to evaluate the scenario of a ‘backdoored’ legitimate application.
In order to get started, we created an Android application that implements the following features.
We compiled the application using Android Studio, extracted the .smali bytecode from it, and implanted the modules into an existing, legitimate application downloaded from the Play Store. Our application of choice was Facebook Lite (https://play.google.com/store/apps/details?id=com.facebook.lite). We targeted the onCreate() method of the host application’s main activity. After patching, the backdoored application retains the expected legitimate behaviour and features, while carrying out its malicious actions in the background, unnoticed by the user.
The patching process affects the main activity. In order to instantiate our backdoor object, we add a few lines of smali code right after the super.onCreate() Java call (which is practically always present in a GUI application). The smali code of the Backdoor class itself is added to the set of smali sources of the application. Having re-assembled the application with smali, we signed the resulting .apk with our own developer certificate.
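In smali terms, the injected instantiation might look like the following sketch (the class and method names here are our own illustration, not those of the actual sample):

```smali
.method protected onCreate(Landroid/os/Bundle;)V
    .locals 1    # bumped if the original method had no free register

    invoke-super {p0, p1}, Landroid/support/v4/app/FragmentActivity;->onCreate(Landroid/os/Bundle;)V

    # --- injected lines: instantiate the backdoor, passing the activity as Context ---
    new-instance v0, Lcom/example/Backdoor;
    invoke-direct {v0, p0}, Lcom/example/Backdoor;-><init>(Landroid/content/Context;)V
    # --- end of injected lines; original method body continues below ---
```

Note that the injector may need to raise the method’s .locals count to obtain a scratch register for the new instance.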
In order to mimic a popular approach in Android malware, we also developed a more advanced version of our trojanizer framework, which adds another feature: the advanced version of the trojanized app registers itself as a Device Administrator application. Should the user promote the app to Device Administrator (which can easily be achieved with a bit of social engineering), it may become very hard to remove, as a Device Admin application is able to intervene in the app removal process. For an analysis of an in-the-wild piece of malware implementing this behaviour, refer to our previous blog post.
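On Android, an app declares its Device Administrator capability by registering a DeviceAdminReceiver in its manifest; a minimal config sketch follows (the receiver class and policy resource names are illustrative):

```xml
<receiver
    android:name=".AdminReceiver"
    android:permission="android.permission.BIND_DEVICE_ADMIN">
    <!-- points to an XML resource listing the requested admin policies -->
    <meta-data
        android:name="android.app.device_admin"
        android:resource="@xml/device_admin_policies" />
    <intent-filter>
        <action android:name="android.app.action.DEVICE_ADMIN_ENABLED" />
    </intent-filter>
</receiver>
```

Registration alone is not enough: the user must confirm activation in a system dialog, which is exactly the step the social engineering targets.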
As a result, we have two versions of the legitimate Facebook Lite application.
The resulting applications can clearly be considered malicious: even though the package names are legitimate and match the original app, the signing certificate and the signature are different. At the time of testing, our malicious application had not been uploaded to any apk stores and had not been encountered by any AV engine. Our custom sample can therefore be considered an appropriate model of a ‘0-day implantation’ malware variant. For a detailed description of the patching process, refer to http://www.syssec-project.eu/m/page-media/158/syssec-summer-school-Android-Code-Injection.pdf
We tested our sample with a number of Android AV applications. We found that the tested AV engines chose one of the following two approaches.
As it turned out, neither approach provides the expected level of protection (namely, that the AV reliably notifies the user whenever a malicious application is being installed on their device). In practice, none of the tested AV engines spotted the patched Facebook Lite applications.
The main issue with AVs that apply a blacklist-based approach is that the hash/signature of the sample is not on any AV vendor’s known-bad list, so reputation checks cannot yield reliable results. The other approach – warning whenever an ‘unknown’ developer certificate is encountered – is also ineffective. Users who install .apk files from 3rd party app stores are conditioned to accept a nondescript ‘suspicious app’ warning with every new .apk, so no real protection is provided.
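Why the blacklist fails can be shown with a toy hash lookup (the byte strings and the blacklist are made up for the example): repackaging and re-signing change the APK’s bytes, so its digest no longer matches any known-bad entry.

```python
import hashlib

# toy blacklist of SHA-256 digests of known-bad APKs (illustrative values only)
KNOWN_BAD = {hashlib.sha256(b"original malicious apk bytes").hexdigest()}

def is_blacklisted(apk_bytes: bytes) -> bool:
    """Reputation check reduced to a plain hash lookup."""
    return hashlib.sha256(apk_bytes).hexdigest() in KNOWN_BAD

# the original sample matches; the repackaged, re-signed copy does not,
# because even a one-byte difference yields an entirely different digest
print(is_blacklisted(b"original malicious apk bytes"))     # True
print(is_blacklisted(b"repackaged, re-signed apk bytes"))  # False
```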
As a result, none of the tested AV engines prevented our backdoored application from executing, or displayed a warning about the actual danger. Yet our sample had some telltale traits that could have been used to detect its malicious intent.
At the end of the first round, we came to the conclusion that providing the expected level of protection on an Android device seems to be a hard task for a regular AV. Some of the reasons regarding local detection have been listed in our previous blog post on the topic.
As a second step, we decided to take a different approach. We assumed that our samples would be detected in a sandboxed environment that closely monitors every action of the uploaded application. As we did not attempt to hide our malicious additions, and they kick in whenever the main activity is started, we expected a dynamic analysis engine to have a better chance of successful detection. To check our suspicion, we used the popular Joe Sandbox.
We submitted the backdoored Facebook Lite applications to this behaviour-based analyser engine.
Version 1.0 received a score of 64/100 and was marked as Malicious.
Joe Sandbox detected the suspicious signing certificate, the automatically delivered SMS message and the data leak channel, properly revealing the operation of our addition.
Version 2.0 received a score of 60/100 and was also marked as Malicious.
A dynamic analysis framework thus has a much wider set of opportunities to detect a backdoored benign application. These traits, however, could also be detected by a cloud-based AV scanner.
For our investigation, we considered a ‘3rd party app store providing backdoored applications’ scenario and evaluated the level of protection Android AVs provide. We concluded that even though Android AVs do not provide the expected level of protection, there are a couple of clues that could be used to achieve better performance.