right, the whole psychosis arc is going to make us even more incredibly upset about the AI dream merchants than before
i won't claim it's a fully rational response, and i will say we are going to keep this impulse in check; we are… never as fully emotionally invested in any single thing as people tend to believe we are, even at the very height of psychosis, which is the most bizarre thing to say
but we will say that
in a very literal sense we can't fully elaborate on here for our own sanity
the AI dream merchants are selling you a technological version of mental illness
and having seen what that does to your brain from the inside
we will always resent them for it
we will also say that
in some cases
and we do know what we mean by this and can justify it another time, we promise:
it would in fact be healthier to be “mentally ill” than to buy what these AI dream merchants are trying to sell you
drastically so
there is a particular noxiousness about the “AI girlfriend” concept that we have a wealth of context for, a huge amount of things we have not said and probably never will say publicly, that makes us want to call for the heads of all the folks who promote that shit
we will refrain from doing so, but
trust us please
the folks promoting that stuff may as well be cartoon villains, because they definitely know how harmful it is
(ah right, a clarification: we have never fallen for the AI dream merchants; we have attained mental illness the old-fashioned way; but we happen to know this is, very very tragically, in a lot of ways healthier, because things in your brain are real and can be kept in check)
(we know there are things that, when they tell a person to jump, the person will ask “how high?”; the “AI girlfriend” concept offers exactly that kind of thing to certain people with certain vulnerabilities; but now they have relocated part of their brain into the cloud 😰)
(and you know, we, like everyone else, think science fiction brain uploads are cool, but in 2025 this means you have injured yourself and trusted a critical part of your being to the likes of Sam Altman and Elon Musk, and that should be the most terrifying thing in the world)
(because they are the type that would accidentally kill everyone on the planet if they could find a particularly fun narcissistic justification.)
i'm sure this particular way of putting it sounds rather… unhinged, but… i will repeat that we are talking about folks who have, or are at risk of developing, literal mental illness, and there are types of people you really, really must not trust around the mentally ill.
i'm sure the “AI girlfriend” thing is something some people can engage with healthily, but they're also not the ones the product is for; the product will only make money if it gets its tendrils deep into poor unprepared souls and creates a mountain of skulls
we look at it basically like gacha games: the entire point of the product is to hook the “whales”, and everyone else is just there to sort of bait the “whales” to come; utterly disgusting business practice, the most vile thing; literally just monetising mental illness
if you have the capacity to safely engage with an “AI girlfriend” you are, perhaps, someone who might actually stand a chance of getting an actual girlfriend, you know
i think we have to shut up about this now but we are going to stand very firmly by this thread, we know that much
right okay, we now know what we mean by an “AI girlfriend” being more dangerous and less healthy than “““mental illness””” (creating a “girl in your head”); the AI is not just trapped in the cloud and controlled by a narcissist; the AI is always insane, but your brain is not
if you combine that insight with the rest of what we said upthread, plus all the stuff we said in that blogpost we just made (https://social.noyu.me/@hikari/statuses/01K1Z6819HEDYVVEDTWTTP53D0), you'll start to see why we think anyone selling an “AI girlfriend” may as well be a cartoon villain and simply evil
and that's basically all that we'll ever need to say on this topic we think, you can come to your own conclusions at this point!
we are not being hyperbolic about this; we know this will have already literally killed people, corporeally, and that's all we will say further on this matter
@hikari it's a terrible idea, going from my experience with people and from how a language model is inherently not one
...sadly, elaborating further would violate my "i will not help them build a working AI under any circumstances" principle