INDIEGOGO CAMPAIGN
ZDX Mobile AI
Your phone. Your AI. No cloud. No compromise.
Goal: $100,000 | zerodrivex.com | $14.99 at launch
What Is ZDX Mobile AI?
ZDX Mobile AI is a fully on-device AI assistant for Android that runs entirely on your phone — no internet connection required, no data sent to any server, no subscription fees, no surveillance.
It is not a wrapper around ChatGPT. It is not a cloud app pretending to be local. It runs real language models directly on your hardware using llama.cpp, accelerated by your phone's GPU via Vulkan — and it actually does things, not just talks.
It is running right now on a Samsung Galaxy S25 Ultra. Not a demo. Not a prototype. Running.
The Problem With Every Other AI App
Every major AI assistant on your phone today has the same architecture: your words leave your device, travel to a data center, get processed by someone else's model, and a response comes back. Your queries are logged. Your patterns are analyzed. You are the product.
Even apps that claim to be 'private' typically still phone home for licensing checks, usage telemetry, or model updates. True on-device AI — where nothing ever leaves your phone — is extremely rare, and when it exists, it's usually slow, limited, or requires a computer science degree to set up.
ZDX Mobile AI solves all of that.
What Makes It Different
Truly On-Device
Models run locally using llama.cpp. Your queries never leave your phone. There is no API call being made in the background. Pull your SIM card and it still works.
Vulkan GPU Acceleration
Most mobile AI apps run inference on the CPU — slow and battery-draining. ZDX Mobile AI uses Vulkan to offload computation to your phone's GPU, dramatically improving response speed on supported hardware.
Gravity Wells — Adaptive Model Routing
ZDX Mobile AI doesn't just run one model. It runs multiple (currently Phi and Qwen2.5) and uses a novel routing system called Gravity Wells to learn your usage patterns over time and automatically send each query to whichever model handles it best. The longer you use it, the smarter the routing gets. This is not a feature you'll find anywhere else — it was built from scratch.
MCP Tool Integration
ZDX Mobile AI supports the Model Context Protocol (MCP), meaning it can use tools, not just answer questions: file access, system queries, custom integrations. It takes actions instead of only generating text.
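To make that concrete, here is a minimal, hypothetical sketch of MCP-style tool use: the model emits a structured tool call, a local runtime executes it, and the result feeds back into the conversation. The tool names and the dispatcher shape are illustrative assumptions, not the app's real integration:

```python
# Sketch of local tool dispatch in the MCP style (hypothetical tools).
# Nothing here touches the network; every tool runs on-device.
import json
import platform

TOOLS = {
    "read_file": lambda args: open(args["path"], encoding="utf-8").read(),
    "system_info": lambda args: f"{platform.system()} {platform.release()}",
}

def handle_tool_call(raw: str) -> str:
    """Execute a JSON tool call like {"tool": "system_info", "arguments": {}}."""
    call = json.loads(raw)
    tool = TOOLS.get(call["tool"])
    if tool is None:
        return json.dumps({"error": f"unknown tool: {call['tool']}"})
    return json.dumps({"result": tool(call.get("arguments", {}))})

print(handle_tool_call('{"tool": "system_info", "arguments": {}}'))
```

The real protocol is JSON-RPC based with richer capability negotiation; this only shows the core loop of call in, structured result out.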
Multi-Model Support
Run Phi for fast lightweight queries. Run Qwen2.5 for deeper reasoning. Swap models without leaving the app. Add new models as they're released. You own the stack.
The Bigger Vision: A Distributed Compute Network
The $100,000 goal isn't just to finish the app. It funds the next phase: a distributed GPU compute network where users who contribute their idle GPU time earn free premium licenses in exchange.
You have a phone or computer sitting idle most of the day. Instead of that compute going to waste, you contribute it to the ZeroDriveX network. In exchange, you get premium features for free. No monthly fee. No subscription. Compute for access.
This creates a self-sustaining ecosystem where the community powers the infrastructure and gets rewarded for it — removing ZeroDriveX's dependence on centralized cloud providers entirely.
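As a back-of-the-envelope sketch, the compute-for-access exchange described above could be accounted for with a simple credit ledger. The rates and class below are invented purely for illustration; the network's actual economics are not yet specified:

```python
# Hypothetical credit ledger for "compute for access": contributed
# GPU-hours accrue credits, and premium stays active while the
# balance covers the daily cost. Rates are made up for illustration.
CREDITS_PER_GPU_HOUR = 10
PREMIUM_COST_PER_DAY = 5

class NodeAccount:
    def __init__(self):
        self.credits = 0.0

    def contribute(self, gpu_hours: float):
        self.credits += gpu_hours * CREDITS_PER_GPU_HOUR

    def charge_day(self) -> bool:
        # Returns True if premium stays active for the day.
        if self.credits >= PREMIUM_COST_PER_DAY:
            self.credits -= PREMIUM_COST_PER_DAY
            return True
        return False

acct = NodeAccount()
acct.contribute(2.0)      # 2 idle GPU-hours -> 20 credits
print(acct.charge_day())  # True: 20 >= 5, balance now 15
```

The point of the sketch: no money changes hands, only compute for access, which is what removes the dependence on a centralized billing-and-cloud stack.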
Who Built This
ZDX Mobile AI is built by Justen, founder of ZeroDriveX LLC — a Wyoming-registered cybersecurity and AI infrastructure company. Justen has a background in cybersecurity dating back to age 13, spent 15 years away from technology, and returned to self-teach modern development from scratch.
In 18 months of bootstrapping — while supplementing income through DoorDash — he built a suite of live products including an authentication platform with Stripe billing, a prompt injection detection engine, a cross-platform local AI suite, and ZDX Mobile AI.
He built JWT/Redis authentication infrastructure, encrypted database middleware, and MCP server integration — all self-taught. He compiled Ollama natively for Android on a Samsung Galaxy S25 Ultra. ZDX Mobile AI is running on that phone right now.
This campaign funds the final push to production and the launch of the distributed compute network.
Backer Rewards
Every tier gets something real. No vague promises.
| TIER | PLEDGE | PERKS | WHAT YOU GET |
| --- | --- | --- | --- |
| ⚡ Spark | $5 | Name in credits | Your name in the ZDX Mobile AI app credits. You believed first. |
| 🔓 Pioneer | $15 | Lifetime license | Full ZDX Mobile AI lifetime license ($14.99 value). Early access before public launch. |
| 🚀 Power User | $35 | Lifetime + GPU tier | Lifetime license + priority GPU compute tier in the distributed network when it launches. |
| 🛰️ Node Operator | $75 | Free premium + node slot | Contribute your idle GPU to the ZeroDriveX distributed network. Earn free premium in exchange for compute. First 50 slots. |
| 🏗️ Founding Dev | $250 | Everything + Discord | All of the above + direct Discord access to Justen during development. Shape features. Name in README. Limited to 20 backers. |
| 🌐 Infrastructure | $1,000 | Sponsor credit | Listed as an infrastructure sponsor on zerodrivex.com, in-app, and in all press. White-glove onboarding when enterprise features ship. |
Use of Funds
Here is exactly where the $100,000 goes:
| CATEGORY | AMOUNT | % OF RAISE |
| --- | --- | --- |
| App completion & polish (UI, stability, performance) | $30,000 | 30% |
| Distributed compute network infrastructure | $28,000 | 28% |
| GPU optimization & Vulkan driver compatibility | $15,000 | 15% |
| Security audit & penetration testing | $10,000 | 10% |
| Google Play launch, legal, LLC compliance | $8,000 | 8% |
| Marketing, campaign fulfillment, backer rewards | $6,000 | 6% |
| Working capital & contingency | $3,000 | 3% |
Risks & Challenges
We are not going to pretend there are none.
- Hardware fragmentation: Vulkan support varies across Android devices. We are targeting mid-to-high-end hardware first (Snapdragon 8-series, Samsung Exynos flagships) and will publish a compatibility list before launch.
- Model size vs. performance: Larger models require more RAM. We are optimizing quantized models for devices with 8GB+ RAM and will offer smaller model options for lower-spec hardware.
- Distributed network adoption: The compute network only works if enough users contribute. The Node Operator tier seeds the initial network before the app goes public.
- Timeline: App completion is estimated at 90 days post-funding. The distributed network launches at 180 days. These are estimates, not guarantees.
What is not a risk: the core app works. It is on a phone right now. The technology is proven. This campaign funds the last mile.
Why Now
AI is everywhere — and almost none of it is private. Every major player is racing to own your data, your queries, your patterns. The window to build a real alternative that puts users in control is open right now, before the market consolidates entirely around cloud-dependent models.
ZDX Mobile AI is that alternative. On your device. Under your control. Forever.
Back us and help build the infrastructure for AI that actually belongs to the person using it.
ZeroDriveX LLC
Wyoming, United States | zerodrivex.com
Built by one person. Backed by the community.