Not being allowed into the App Store played a pivotal role in the challenges my startup, Freyja, faced, even if it wasn’t the sole reason for its struggles. We had a great product, a clear safety-first mission, and a solid plan. But being shut out of one of the most powerful distribution channels in tech left us at a massive disadvantage.
This article dives into why we were excluded, the workarounds we attempted, the blatant hypocrisy baked into Apple’s policies, and the bigger question of who gets to control the future of the internet.
Why Were We Banned?
Officially, Apple cites its commitment to a “family-friendly” environment, legal compliance across jurisdictions, and user safety as reasons for banning adult content from the App Store. Their policies are meant to prevent exposure to explicit material, reduce liability, and create a consistent experience.
On paper, that sounds fair. But in practice, it’s inconsistent and, frankly, hypocritical.
What’s Flawed With That Policy?
1. Overreach and Censorship
Apple’s definition of “adult content” is sweeping. Apps focused on sexual health, relationships, or even education often get lumped in with explicit pornographic content, even if they’re not harmful or inappropriate. The result? Vital tools, many of them focused on safety, empowerment, and consent, get blocked.
2. Lack of Transparency
Developers are often left guessing what Apple considers acceptable. Rejections are vague. Appeals go unanswered. This kind of ambiguity breeds distrust and limits innovation in important, underserved areas.
3. Exclusion of Ethical Adult Platforms
Platforms like Freyja, built with consent verification, ID checks, and safety-first design, are treated the same as anonymous, unregulated adult sites. There’s no meaningful distinction between ethical, regulated content and harmful content.
4. Market Monopoly
Apple’s near-complete control of iOS distribution creates an unfair playing field. If you’re not allowed in the App Store, you're essentially invisible to most iPhone users, even if Apple itself says “use the web” as a fallback.
5. “Use the Web” Is a Trap
Despite claiming the web is the alternative for banned content, Apple intentionally makes web apps harder to use. It’s incredibly difficult for the average user to install or even find a web app on an iPhone. We had to create TikToks just to teach users how to save our web app to their home screen. And even that confused people, largely because Apple designed it to be confusing.
Web Apps Are Cool, So Why Hide Them?
Here’s the irony: the web is everything Apple says it wants to support.
It’s open.
It’s accessible.
It’s innovation-friendly.
It’s uncensored.
It lets people build freely.
Web apps are the future. They work across devices, aren’t locked behind walled gardens, and anyone with an idea can build one.
So why is Apple working so hard to bury them?
Simple: they can’t make money off web apps.
No 30% cut. No subscriptions through Apple Pay. No control. No gatekeeping.
So instead, Apple has built a system where:
They block you from the App Store for violating vague “safety” rules.
They tell you to “use the web” instead.
Then they make it nearly impossible for users to actually use the web the way they would an app.
This isn’t just restrictive, it’s anti-competitive.
The Hypocrisy Is Wild
Apple claims to care about safety. But let’s be real: if safety were the priority, we wouldn’t see what’s currently in the App Store.
X (formerly Twitter) is allowed in the App Store, yet it hosts porn that is often unverified, unmoderated, and non-consensual. No age gates. No ID verification. By contrast, adult sites have to verify all of their content, yet they are banned from the App Store.
Dating apps are all over the App Store. Many allow photo sharing and anonymous messaging, often leading to the exchange of nude content, sometimes involving underage users, and posing real-world safety risks. Unlike content on regulated adult sites, none of it is ID-verified or backed by consent paperwork.
In-app browsers, like the ones in TikTok, Instagram, and X, sit inside apps rated as low as 12+, and they can open any adult site with a single tap. This means kids can reach porn with less friction via a social media app than they ever could through a regulated adult app that Apple blocks.
And the biggest irony?
Browsers themselves are flagged 18+ in Apple’s system. That’s right: Apple knows you can access anything via the web, which supposedly makes it too “unsafe” for the App Store, yet it doesn’t apply the same standard to the apps that host those same browsers. That raises the question of whether these policies are about protecting kids or protecting profits.
How Did We Gain Traction Without the App Store?
We built Freyja as a web app, not by choice, but because we had no other option.
We had to educate every single user on how to “install” it to their home screen. We created explainer videos. We answered endless questions like “why isn’t this in the App Store?” And every time, I couldn’t help but cringe at how many people we lost because the experience was just too clunky.
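For readers wondering what that looks like on the engineering side: because iOS Safari exposes no install prompt, a web app has to detect for itself that it isn’t installed and then render its own “Add to Home Screen” instructions. Here’s a minimal sketch, assuming a plain browser environment; the banner text and helper names are illustrative, not Freyja’s actual code.

```typescript
// Minimal sketch: iOS Safari offers no install-prompt API, so the app itself
// has to notice it isn't running from the home screen and explain the manual
// "Share -> Add to Home Screen" steps to the user.

function isRunningStandalone(): boolean {
  // The display-mode media query is the standard signal; navigator.standalone
  // is the older, iOS-only flag.
  return (
    window.matchMedia("(display-mode: standalone)").matches ||
    (window.navigator as unknown as { standalone?: boolean }).standalone === true
  );
}

function isIos(): boolean {
  return /iPhone|iPad|iPod/.test(window.navigator.userAgent);
}

function showInstallHint(): void {
  // Bare-bones banner; a real app would style, localise, and dismiss this.
  const banner = document.createElement("div");
  banner.setAttribute("role", "status");
  banner.textContent =
    "To install: tap the Share icon, then choose “Add to Home Screen”.";
  document.body.appendChild(banner);
}

if (isIos() && !isRunningStandalone()) {
  showInstallHint();
}
```

A native app never needs any of this; the store handles discovery and installation for it.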
It’s hard not to think Apple intentionally made it that way.
Web apps could compete with native apps if users had the same seamless experience. But Apple knows that, which is why they’ve dragged their feet on supporting features like push notifications or proper app-like performance on iOS (while simultaneously supporting them on Safari for Mac).
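For context, this is roughly what the standard Web Push flow looks like in a browser that fully supports it. The service worker path and VAPID key below are placeholders, and on iOS the flow only works at all for web apps the user has already added to the home screen.

```typescript
// Sketch of the standard Web Push subscription flow. The "/sw.js" path and the
// VAPID key are placeholders, not real values.

const VAPID_PUBLIC_KEY = "BASE64URL_ENCODED_VAPID_PUBLIC_KEY";

async function enablePush(): Promise<PushSubscription | null> {
  // Bail out where the APIs simply aren't exposed (e.g. a non-installed
  // web app on iOS).
  if (!("serviceWorker" in navigator) || !("PushManager" in window)) {
    return null;
  }

  // A service worker is required so pushes can be received in the background.
  const registration = await navigator.serviceWorker.register("/sw.js");

  const permission = await Notification.requestPermission();
  if (permission !== "granted") {
    return null;
  }

  // Subscribe with the app's VAPID public key; the backend uses the matching
  // private key to send messages through the browser vendor's push service.
  return registration.pushManager.subscribe({
    userVisibleOnly: true,
    applicationServerKey: VAPID_PUBLIC_KEY,
  });
}
```

None of this is exotic; the limiting factor is simply where the platform chooses to expose these APIs.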
How This Could Be Fixed
Create an Adults-Only Section
Apple could allow adult apps that meet strict safety and verification standards, gated behind ID verification or KYC.
Make Home Screen Installation Easy
Streamline the process. Treat web apps with the same respect as App Store apps. Let users find them, install them, and use them easily.
Hold All Apps to the Same Standards
If the argument is safety, then platforms like X and dating apps must be held to the same standard as regulated adult platforms. No exceptions.
Final Thoughts
Freyja set out to make the adult industry safer. We focused on consent, safety, and transparency. But instead of being supported, we were shut out by systems that claim to care about safety, while enabling far more dangerous behaviours elsewhere.
The App Store ban wasn’t the only reason we failed, but it sure didn’t help. Apple says “use the web,” but then ensures that doing so is as painful and limited as possible.
If we really want a safer internet, the solution isn’t to ban everything with the word “adult” in it. It’s to build systems that reward responsibility and transparency, and to stop pretending that censorship is the same as safety.