Presumably the goal of the people throwing money at an OF content-creator [in the form of "donations" — "here's some extra money without any implied obligation" — rather than e.g. paying for custom content] is to try to jump the gap from a parasocial relationship to a real (sugar?) relationship.
Of course, the OF creator can't form true relationships with thousands of people. I'm guessing that the implicit mental model in the heads of OF subscribers who "donate" to creators, is that this is a competition — that they're all participating in something like an ongoing hidden auction for a slice of the creator's limited time. They think "if I just pay the most, then she'll feel obligated to pay attention to me." (Of course, most creators feel no such sense of obligation.)
If OF subscribers can know in advance that that jump is fundamentally impossible — such as if they can discern that a dumb AI is responding to their donation-attached messages, and that that AI fundamentally has no feature to forward messages to the creator themselves — then they probably wouldn't bother "donating" in the first place.
That's not an unreasonable thought, but I think you underestimate how many people know that they are buying an illusion and are fine with that. To support that point: there are people right now who pay real money to chat with an AI companion that pretends to be their boy- or girlfriend. [0,1]
I think these people are, by and large, under no illusion that their AI companion is anything other than a computer program, but they get attached anyway and choose to live in the illusion.
I would agree that at least some of the OF consumers who throw donations at OF creators are knowingly buying an illusion. But I don't think your analogy holds. I think there's a fundamental difference between these two activities (interacting with OF creators vs. interacting with AI "character" chatbots). The former is, in fact, expected to be literally parasocial — key word "social"; while the latter is expected to be a form of private entertainment.
---
Re: the former, I would read "interacting with an OF creator" as no different than interacting with any other "their job is to pretend to like you for money, but they aren't putting themselves in a position where they're ever obligated to do anything for you" type of sex worker. For example, the employees at strip clubs; or for maybe an even better analogy, the employees at a Japanese "host club".
These are quintessential parasocial relationships: the consumer of such a service is buying an illusion, but the "product" they expect to be buying is specifically the illusion of fondness, as performed by a human. They're buying a pure act of emotional labor done by a human — "service with a smile", but where the service is the smile. (It's the same thing people get out of donating to a Twitch streamer — the streamer thanks them on-stream for their donation. They get noticed in a performatively appreciative manner.)
And, depending on how in-demand that human's time is / how many other consumers want that same emotional labor output from them, that emotional labor can be incredibly highly valued in the market. Which is why some OF subscribers — despite knowing that the possibility of deeper connection to the OF creator is likely illusory — are still willing to pay huge amounts of money. They aren't expecting to literally enter into a sugaring relationship with the OF creator; but they are expecting to get the creator's attention and possibly receive a hand-written thank-you note or shout-out or some custom selfie they didn't ask for. An act of emotional labor on the OF creator's part, performatively responding to their donation.
But, like with going to see a magic show, this kind of illusion is only valuable when it has high verisimilitude. Nobody will pay to see a bad magic act if they know it's going to be bad. And nobody donates to an OF creator if they know they're going to get AI responses.
(I'm sure there are some OF consumers who are not observant enough to realize they're receiving AI responses, and so feel like they are receiving the parasocial emotional labor service they paid for. Just like there are some people — usually children — who are not observant enough to notice the flaws in a bad magic act, creating a market for bad magicians.)
---
Re: the latter — paid subscribers to AI character chatbot services treat this economic relationship entirely differently. They don't see the individual chatbot as anything that holds value, and a business model that tried to get them to pay for a specific AI chatbot character would likely never work.
Rather, from my understanding, to the people who subscribe to these things, paying for the service is analogous to paying a subscription to a game streaming service like GeForce Now.
In both cases, there's a large quantity of interactive entertainment out there that you want to "play". Running that interactive entertainment locally would require capabilities that none of the devices you own possess. And buying the fancy hardware with those capabilities would be very expensive — possibly to the point of financial impracticality, if you want a top-of-the-line experience. (And, funny enough, in both of these cases, the fancy hardware is a GPU!)
Also, you might want some additional convenience — maybe you don't have anywhere to put a gaming rig, but want to play everything on a laptop. Or even on your phone sometimes.
A game streaming service has one major USP, and one minor USP:
- The major USP is that it trades CapEx for OpEx. Rather than owning / maintaining / dealing with a gaming rig, you can effectively rent one in the cloud.
- The minor USP is that it might provide subsidized access to a number of entertainment titles you'd otherwise have to purchase. (Xbox Game Pass does; GeForce Now does not.)
AI character chatbot services — which form a spectrum with generic flat-monthly-fee "Inference-as-a-Service" providers intended for use with private FOSS AI-character-chat frontends (e.g. SillyTavern) — have the same two USPs:
- The major USP is, again, trading CapEx for OpEx.
- The minor USP is a bit stranger — the services that specifically market themselves as "AI character chat" services often market a model that's being continuously fine-tuned on other users' interactions to improve its fidelity for the specific use-case; and, less often, market a proprietary stable of characters developed for the service. (But "AI characters" themselves — the definitions that make a chatbot into a particular character — are mostly considered to be a commodity; they're posted for free to various "AI character card" hosting services, and most systems in this space just expect you to import the characters you're interested in from such hosting services, rather than offering their own proprietary ones.)
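To make the "characters are a commodity" point concrete: a character card is really just a small bundle of prompt text. A minimal sketch — field names loosely follow the common TavernAI/SillyTavern-style card format, but this is a simplified illustration with a made-up character, not any service's exact schema:

```python
# A "character card" is just data: a handful of prompt fragments.
# Field names loosely follow the common TavernAI-style card format;
# the character and values here are hypothetical.
card = {
    "name": "Captain Vex",  # made-up character for illustration
    "description": "A weary starship captain with a long memory.",
    "personality": "gruff, dryly funny, secretly sentimental",
    "scenario": "The user is a new recruit aboard the captain's ship.",
    "first_mes": "So. You're the new recruit. Try not to break anything.",
}

def card_to_system_prompt(card: dict) -> str:
    """Flatten a card into a system prompt any generic LLM backend can consume."""
    return (
        f"You are {card['name']}. {card['description']}\n"
        f"Personality: {card['personality']}\n"
        f"Scenario: {card['scenario']}"
    )

prompt = card_to_system_prompt(card)
```

Because a card is just text like this, it can be exported from one frontend, imported into another, and run against any backend — which is exactly why the cards themselves carry so little market value.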
All in all, there's no "illusion" here for anyone to "fall for." There's just a desired capability (running AI models to play with), with a straightforward cost trade-off between self-hosting that capability and paying someone else to manage it for you.
There is, as always, the exception that the world contains some very unobservant people — possibly again children — who will mistakenly develop a parasocial bond to AI characters because they fail to notice the flaws and limitations that make interactions with an AI character qualitatively different from interactions with a human, and thereby fail to move past the initial sense of full immersion/verisimilitude they feel when interacting with such systems.
But these chumps are the exception. Most subscribers to these services are not confused about what they're paying for; they have moved past any initial impression of full immersion. Instead, they just see AI characters as fun toys — entertainment software! — and they're paying $10/mo or whatever because that's a fair price to pay to access a unique type of fun toy.
The key testable hypothesis here is that once even our phones have the GPU grunt required to run high-fidelity "roleplaying" LLMs locally, the bottom will drop out of this market; there'll be no reason to pay for an "AI character streaming service" indefinitely once the phone you already own gives you free, unlimited access to that capability locally, in the form of a free or one-time-cost app that does the same thing.
(...also, just as a tangent: this is probably the "everybody knows it but nobody's going to say it" reason that so many people got so excited about the new Mac Mini. It's a perfect single-user AI-character-chatbot RP model host, that takes up minimal space and has a reasonable price-point. Many people currently paying for these services would actually rather trade OpEx for CapEx — the CapEx was previously just too dang high to make it worth it!)
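The OpEx-vs-CapEx break-even is easy to eyeball. A back-of-envelope sketch with illustrative numbers — a base-model Mac Mini at roughly $599 against a typical ~$10/mo subscription; your actual prices will vary:

```python
# Back-of-envelope break-even: buy hardware once (CapEx) vs. keep
# paying a subscription forever (OpEx). Numbers are illustrative
# assumptions, not quotes from any vendor or service.
capex_hardware = 599.0   # approx. base Mac Mini price (assumed)
opex_per_month = 10.0    # typical chatbot-service subscription (assumed)

# Months of subscription payments it takes to equal the hardware cost.
break_even_months = capex_hardware / opex_per_month  # ~60 months
```

The higher the required CapEx, the longer the payback period — which is why it took cheap-enough local hardware to make this trade attractive.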