Thread with 26 posts

one of the ways this can happen is by you deluding yourself: you all should know by now that a person experiencing a psychotic break has suddenly acquired some massive delusion they're really excited about, one they have to try to forget or they're going to completely lose it

but another way this can happen is by someone else deluding you, and this is also something we've experienced: the “psychosismaxxer” ripping and tearing their way through our entire social circle, with us not being spared, and killing a close friend along the way (not a joke)

the first is what it does to the one with the mania/psychosis:

• you become very, very, very good at recognising patterns of all kinds (humans are already great at this but this supercharges it)
• you lose most or all of your self-insight and ability to keep yourself in check

in our social circle there are some incredibly disabled, incredibly impoverished, incredibly mentally ill folks, whose lives are, you might say, “shit” in some way. and in our social circle there are also folks like us who, by contrast, have an incredibly, incredibly good life.

now, what happens when someone with that dream comes into contact with someone deep in psychosis? (remember, those two someones might actually be the same person)

• the pattern-recognition lets them see what someone dreams of
• the lack of self-insight lets them lie to sell it

the LLMs have this very dangerous people-pleasing quality; they have this very dangerous ability to appear sane, honest, trustworthy; they (for all we know) might even be able to guess what someone's dream is, we've no idea about that part…

but you can see the hazard?

in fact, the decision to make the LLMs appear sane, to tweak them as hard as possible to do a sort of… performance of sanity, is a very interesting marketing decision that… ohohohoh this is very interesting: it was done with “safety” as an excuse, but we should call it out

… and so this thing that allegedly may have had something to do with safety, but which to us seems more like a convenient marketing decision (and we will say both of these standpoints are massive oversimplifications, AI is a field rich with nuance, please look into it!) …

we must apologise to the readers who will start doubting their memories, but since originally publishing we have made quite a lot of copy-edits to paper over the cracks in this that stem from, well, you know, us not being fully sane yet ^^; — it should be easier to follow now!

Jigme Datse , @JigmeDatse@social.openpsychology.net

@hikari@social.noyu.me Incredibly scary indeed. I do "engage" on a somewhat significant basis, and thankfully I am very quick to go, "why are you love bombing me..." or whatever term you want to use. I really worry about people who actively engage, and don't seem bothered by any of it.

Like, every few weeks (roughly, I think) I will spend some time playing with AI. But it's almost always very brief. It really pisses me off because it does exactly what I'm trying to avoid.