

Now some of those users gather on Discord and Reddit; one of the best-known groups, the subreddit r/MyBoyfriendIsAI, currently boasts 48,000 users.
I am confident that, one way or another, the market will meet demand if it exists, and I think there is clearly demand for it. It may or may not be OpenAI, and it may take a year or two or three for the market to stabilize, but if enough people want what is basically interactive erotic literature, it's going to be available. Maybe someone else will take a model, train it up on appropriate literature, and provide it as a service. Maybe people will run models themselves on local hardware — in 2026, that still requires some technical aptitude, but making a simpler-to-deploy software package, or even distributing it as an all-in-one hardware package, is very much doable.
I’ll also predict that what males and females generally want in such a model probably differs, and that there will probably be services that specialize accordingly, much as there are companies that make soap operas and romance novels aimed at women, which tend to differ from their counterparts aimed at men.
I also think that there are still some challenges that remain in early 2026. For one, current LLMs still have a comparatively-constrained context window. Either their mutable memory needs to exist in a different form, automated RAG needs to get better, or the hardware and software need to be able to handle larger contexts.
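To make the memory problem concrete, here's a toy sketch of the retrieval approach: instead of stuffing the whole conversation history into the context window, you store snippets and pull back only the few most relevant to the current message. Everything here is illustrative — real systems use learned embeddings and a vector database, not the dependency-free bag-of-words similarity used below, and all the class and function names are made up for the example.

```python
# Toy sketch of retrieval-based memory for a chat model whose context
# window can't hold the full conversation history. Real systems use
# learned embeddings; plain bag-of-words cosine similarity stands in
# here so the example stays dependency-free.
import math
import re
from collections import Counter

def _vec(text: str) -> Counter:
    # Crude tokenization: lowercase words only, punctuation stripped.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Keeps past conversation snippets; retrieves the few most
    relevant ones to prepend to the prompt in place of full history."""

    def __init__(self):
        self.snippets: list[str] = []

    def add(self, snippet: str) -> None:
        self.snippets.append(snippet)

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        q = _vec(query)
        ranked = sorted(self.snippets,
                        key=lambda s: _cosine(q, _vec(s)),
                        reverse=True)
        return ranked[:k]

mem = MemoryStore()
mem.add("User mentioned their cat is named Biscuit.")
mem.add("User prefers slow-burn romance plots.")
mem.add("User works night shifts at a hospital.")
print(mem.retrieve("a story about my cat Biscuit", k=1))
# → ['User mentioned their cat is named Biscuit.']
```

The weak link is exactly what the paragraph above points at: retrieval quality. If the store surfaces the wrong snippet, the model "forgets", which is why better automated RAG or genuinely larger contexts are the competing fixes.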



Yeah, that’s something that I’ve wondered about myself, what the long run is. Not principally “can we make an AI that is more-appealing than humans”, though I suppose that that’s a specific case, but…we’re only going to make more-compelling forms of entertainment, better video games. Recreational drugs aren’t going to become less addictive. If we get better at defeating the reward mechanisms in our brain that evolved to drive us towards advantageous activities…
https://en.wikipedia.org/wiki/Wirehead_(science_fiction)
Now, of course, you’d expect that to be a powerful evolutionary selector, sure — if only people who are predisposed to avoid such things leave offspring, that’d tend to rapidly increase the percentage of people predisposed to avoid them — but the flip side is the question of whether evolutionary pressure on the timescale of human generations can keep up with our technological advancement, which happens very quickly.
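A back-of-envelope number makes the timescale mismatch vivid. Under a textbook haploid selection model — with a deliberately extreme, purely illustrative assumption that susceptible people leave 20% fewer offspring — a rare "resistant" trait still takes on the order of centuries to go from rare to common:

```python
# Toy haploid selection model. The selection coefficient s = 0.2 (i.e.
# wirehead-susceptible people leave 20% fewer offspring) and the
# 25-year generation time are illustrative assumptions, not estimates.

def generations_to_fixate(p0: float, target: float, s: float) -> int:
    """Generations for the resistant-allele frequency to rise from p0
    to target, where susceptible individuals have relative fitness 1 - s."""
    p, gens = p0, 0
    while p < target:
        p = p / (p + (1 - p) * (1 - s))  # standard haploid selection update
        gens += 1
    return gens

gens = generations_to_fixate(p0=0.01, target=0.90, s=0.20)
print(gens, "generations, roughly", gens * 25, "years")
# ~31 generations, roughly ~775 years
```

So even under implausibly strong selection, the response time is several hundred years — while the technologies doing the selecting can change completely within a single generation.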
There’s some kind of dark comic that I saw — I thought it might be Saturday Morning Breakfast Cereal, but I’ve never been able to find it again, so maybe it was something else — a wordless comic portraying a society becoming so technologically advanced that it basically consumes itself, defeating its own essential internal mechanisms. IIRC it showed something like the society becoming a ring that just stimulated itself until it disappeared.
It’s a possible answer to the Fermi paradox:
https://en.wikipedia.org/wiki/Fermi_paradox#It_is_the_nature_of_intelligent_life_to_destroy_itself