Thread with 21 posts

i won't claim it's a fully rational response, and i will say we are going to keep this impulse in check; we are… never as fully emotionally invested in any single thing as people tend to believe we are, even at the very height of psychosis, which is the most bizarre thing to say

but we will say that

in a very literal sense we can't fully elaborate on here for our own sanity

the AI dream merchants are selling you a technological version of mental illness

and having seen what that does to your brain from the inside

we will always resent them for it

there is a particular noxiousness about the “AI girlfriend” concept that we have a wealth of context for, a huge amount of things we have not said and probably never will say publicly, that makes us want to call for the heads of all the folks who promote that shit

(ah right, a clarification: we have never fallen for the AI dream merchants; we have attained mental illness the old-fashioned way; but we happen to know this is, very very tragically, in a lot of ways healthier, because things in your brain are real and can be kept in check)

(we know there are things that, if they tell a person to jump, the person will ask “how high?”; the “AI girlfriend” concept offers certain people with certain vulnerabilities exactly the things that will lead to this response; but now they have relocated part of their brain into the cloud 😰)

(and you know, we, like everyone else, think science fiction brain uploads are cool, but in 2025 this means you have injured yourself and entrusted a critical part of your being to the likes of Sam Altman and Elon Musk, and that should be the most terrifying thing in the world)

i'm sure this particular way of putting it sounds rather… unhinged, but… i will repeat that we are talking about folks who have, or are at risk of developing, literal mental illness, and there are types of people you really, really must not trust around the mentally ill.

i'm sure the “AI girlfriend” thing is something some people can engage with healthily, but they're not the ones the product is for; the product will only make money if it gets its tendrils deep into poor unprepared souls and creates a mountain of skulls

we look at it basically like gacha games: the entire point of the product is to hook the “whales”, and everyone else is just there to bait the “whales” into coming; an utterly disgusting business practice, the most vile thing; literally just monetising mental illness

right okay, we now know what we mean by an “AI girlfriend” being more dangerous and less healthy than “““mental illness””” (creating a “girl in your head”): not only is the AI trapped in the cloud and controlled by a narcissist; the AI is always insane, but your brain is not
