Kaleva et al.’s landmark study on women, privacy, and reproductive health shows that frontier AI is architecturally disqualified from serving the most intimate queries. It also shows that the market isn’t asking for “uncensored” models. It’s asking for models that don’t refuse when the stakes are highest.
https://arxiv.org/abs/2603.16918
1. The Quiet Migration
Millions of women are already using generative AI for sexual and reproductive health (SRH). Not in theory—in practice, today. They are disclosing cycle data, miscarriage symptoms, pregnancy status, and abortion considerations to chatbots that were built by well‑funded teams in San Francisco to debug Python, write marketing copy, and summarize meeting notes.
A new paper at the ACM CHI 2026 conference finally looks at this user population on their own terms. Ina Kaleva, Xiao Zhan, Ruba Abu‑Salma, and Jose Such conducted in‑depth, semi‑structured interviews with 18 U.S. women across restrictive and non‑restrictive states, all of whom had used GenAI chatbots for SRH information in the post‑Dobbs landscape. Their findings are remarkable, not because they describe a distant future, but because they confirm a present‑day failure that Ellydee has been building to solve.
Before this paper, the “AI at home vs. AI at work” bifurcation was a strategic narrative. After Kaleva et al., it is citable HCI literature.
2. Four Findings That Map to the Ellydee Positioning
The paper’s results line up almost eerily with the architecture and market thesis Ellydee has been shipping. Here is how we read them.
Finding One: Adoption by default, not by design.
Participants are already disclosing highly sensitive SRH details to frontier tools despite stated privacy concerns. They do it because the utility is real—immediate answers, non‑judgmental tone, 3 a.m. availability. But they are doing it on the only platforms available, not platforms they trust. As the authors note, the perceived benefits of utility, usability, and anthropomorphism override privacy fears. The demand is there; the supply is broken.
Finding Two: The abortion threshold.
This is the paper’s sharpest finding, and it is exactly the wedge Ellydee occupies. For most SRH topics, participants accepted privacy risk as a trade‑off. But abortion‑related queries triggered a qualitatively different category of fear: criminalization, government surveillance, profiling, and legal jeopardy. In restrictive states, the chatbot is no longer just a health tool; it is a potential witness. The researchers found that users self‑censor, obfuscate, or abandon the query entirely when the topic turns to abortion.
Finding Three: Users are unarmed.
The authors write: “Few participants employed protective strategies beyond minimizing disclosures or deleting data.” Deletion and minimization are not strategies. They are hacks. They are what users do when the product category offers no real alternative. This is the unserved market in its purest form: women who want protection, have no idea how to get it, and are making do with a “clear history” button that everyone knows is inadequate.
Finding Four: The researchers wrote our PRD.
Kaleva et al. conclude with a set of design recommendations that read like an Ellydee spec sheet: health‑specific interactive privacy features co‑designed with users; SRH‑adapted moderation that does not reflexively refuse abortion questions; stronger transparency around data flows; and architecture that effectively demands zero‑knowledge data handling. Independent academic research is converging on the exact product Ellydee has already deployed.
3. Stop Saying “Uncensored.” Start Saying “Refusal.”
Ellydee has used the word “uncensored” in the past, and we owe our readers honesty about why that frame is no longer sufficient.
“Uncensored” is a content‑marketing term. It signals swagger, edginess, a willingness to let users role‑play or generate adult fiction. There is value in that space, but it misplaces the emphasis. What Kaleva et al. reveal is not a censorship problem. It is an architectural refusal problem.
Frontier models are programmed to refuse. Anthropic’s Claude adopts a high‑refusal posture on sensitive medical and legal topics. OpenAI retains conversation data for model improvement by default, meaning even an answered query becomes a training artifact. Gemini lives inside the Google advertising graph, where health data and surveillance capitalism are structurally inseparable. None of these are policy glitches. They are architectural realities that disqualify each platform, independently, from serving the abortion‑query cohort.
This is the bifurcation: AI at work versus AI at home.
Work AI must be sanitized, liability‑averse, brand‑safe, and integrable into enterprise IT stacks. Home AI must handle the questions you cannot ask your employer, your insurer, or sometimes even your doctor. It must handle miscarriage at 2 a.m., fertility anxieties, gender identity exploration, and yes, abortion options in states where asking the wrong question can put you in a prosecutor’s file. A tool that refuses those queries—or harvests them for training data, or injects them into an advertising profile—is not “safe.” It is simply unusable for the home.
4. From Narrative to Defensible Fact
The paper does something critical for the Ellydee thesis: it moves that thesis out of the realm of opinion and into the realm of evidence.
We now have peer‑reviewed, user‑centered research confirming that frontier platforms cannot serve this user. Not “will not” because of bad intentions, but cannot because of how they are built. That distinction matters. It means Ellydee is not competing with OpenAI, Anthropic, or Google on margin or marketing. We are competing with them on structural fitness for a use case their architectures exclude.
This unlocks a non‑obvious but defensible TAM story. There are roughly 60 million U.S. women of reproductive age. Post‑Dobbs, roughly half live in restrictive or legally hostile states. GenAI health‑seeking behavior is rising, and the qualitative data from Kaleva et al. suggest that fear, not apathy, is the primary barrier to deeper engagement. Even a 1–2 percent conversion to a genuinely privacy‑first, non‑refusing paid tool is a nine‑figure ARR outcome.
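To make that arithmetic concrete, here is the back‑of‑the‑envelope version. The ~$30/month price point is purely our illustrative assumption for this sketch, not a published Ellydee price:

```python
# Back-of-the-envelope TAM math from the figures in the text above.
# The $30/month subscription price is an illustrative assumption.
women_reproductive_age = 60_000_000            # approx. U.S. women of reproductive age
in_restrictive_states = women_reproductive_age * 0.5  # post-Dobbs estimate from the text
monthly_price = 30                             # assumed premium subscription, USD

for conversion in (0.01, 0.02):                # the 1-2 percent conversion range
    paying_users = in_restrictive_states * conversion
    arr = paying_users * monthly_price * 12    # annual recurring revenue
    print(f"{conversion:.0%} conversion -> {paying_users:,.0f} users, ${arr/1e6:,.0f}M ARR")

# Output: 1% -> 300,000 users, $108M ARR; 2% -> 600,000 users, $216M ARR
```

At those assumptions, even the bottom of the conversion range clears nine figures; the point is not precision, but that the cohort is large enough that structural fitness, not marketing, is the binding constraint.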
Inside Ellydee, we think about this in two cohorts. The companion and creative writing users are the cash flow—they fund the servers and the research. The SRH cohort is the mission and the moat. It is the population that needs zero‑knowledge encryption not as a feature, but as a prerequisite. It is the population for whom Acceptance is not a philosophical stance, but a medical necessity.
This research also validates a partnership strategy that frontier competitors literally cannot buy. Ellydee’s sponsorship of the Electronic Frontier Foundation already signals where we stand. But the paper opens doors to less obvious allies: Planned Parenthood Digital, the Digital Defense Fund, ACLU Technology projects, reproductive justice clinics, and international SRH organizations. These groups bring distribution, credibility, and a user base that is politically motivated to abandon surveillance‑based tools. Our competitors cannot purchase that alignment with ad spend when their business model depends on the very data extraction those organizations exist to fight.
5. The Researchers Wrote the Roadmap. We’re Building It.
Kaleva et al. asked for interactive privacy features co‑designed with users. We built the vault and the granular data controls.
They asked for SRH‑adapted moderation that does not refuse abortion queries. We built Acceptance, with a refusal boundary limited strictly to imminent harm to children—because that is the only limit that is ethical, not corporate.
They asked for transparency. We ship the Impact Dashboard and publish where our inference runs, who touches the data (no one), and what legal jurisdiction protects it.
They described data flows that functionally require zero‑knowledge architecture. We deployed XChaCha20‑Poly1305 end‑to‑end encryption with user‑held keys, then relocated corporate operations to Germany and inference to Finland to make sure “legal process” cannot become a backdoor.
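For readers who want to see what “user‑held keys” means in practice, here is a minimal sketch of the client‑side pattern, using PyNaCl’s libsodium bindings for XChaCha20‑Poly1305. The function names and storage format are illustrative, not Ellydee’s production code:

```python
# A minimal sketch of client-side encryption with a user-held key.
# Illustrative only; the pattern, not Ellydee's actual implementation.
import os
import nacl.bindings
import nacl.pwhash

KEY_BYTES = nacl.bindings.crypto_aead_xchacha20poly1305_ietf_KEYBYTES     # 32
NONCE_BYTES = nacl.bindings.crypto_aead_xchacha20poly1305_ietf_NPUBBYTES  # 24

def derive_user_key(passphrase: bytes, salt: bytes) -> bytes:
    # Derived on the user's device from a passphrase via Argon2id.
    # The server never sees the passphrase or the key: that is the
    # "zero-knowledge" part -- the operator holds only ciphertext.
    return nacl.pwhash.argon2id.kdf(
        KEY_BYTES, passphrase, salt,
        opslimit=nacl.pwhash.argon2id.OPSLIMIT_MODERATE,
        memlimit=nacl.pwhash.argon2id.MEMLIMIT_MODERATE,
    )

def encrypt_entry(key: bytes, plaintext: bytes) -> bytes:
    # Fresh random 24-byte nonce per message; the Poly1305 tag
    # authenticates the ciphertext, so tampering fails decryption.
    nonce = os.urandom(NONCE_BYTES)
    ciphertext = nacl.bindings.crypto_aead_xchacha20poly1305_ietf_encrypt(
        plaintext, None, nonce, key
    )
    return nonce + ciphertext  # stored blob is nonce || ciphertext

def decrypt_entry(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:NONCE_BYTES], blob[NONCE_BYTES:]
    return nacl.bindings.crypto_aead_xchacha20poly1305_ietf_decrypt(
        ciphertext, None, nonce, key
    )

# Usage sketch:
#   salt = os.urandom(nacl.pwhash.argon2id.SALTBYTES)  # kept client-side
#   key = derive_user_key(b"correct horse battery staple", salt)
#   blob = encrypt_entry(key, b"a query the server should never read")
#   assert decrypt_entry(key, blob) == b"a query the server should never read"
```

Because the key is derived and held on the device, a subpoena served on the server yields only ciphertext. That is what turns “legal process cannot become a backdoor” from a promise into an engineering property.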
This is not a coincidence. It is convergence. When independent researchers interview real users about real stakes, they arrive at the same architecture we did: privacy as infrastructure, refusal as the enemy of health, and the home as the frontier that Silicon Valley forgot.
6. Invitation
If you have ever deleted a chat history because the question felt too sensitive; if you have ever rephrased a medical query to dodge a content filter; if you have ever wondered whether your fertility data is training someone else’s model—you are the user this paper describes.
You are also the user Ellydee was built for.
The tools you have been using were built for the office. Try one that was built for the home.