Thread with 5 posts
ok so if you get support requests from users who have put weird settings in their ini files that you've never seen before, it's because chatgpt is really good at inventing vaguely plausible ini settings that don't exist
(hasn't happened to me, but it has to someone i know, and i tested it myself)
this is a perfect example of the problems with llm hallucinations, right? ini files and other config formats tend to be sloppily defined, often deliberately so. it's common for them to be completely undocumented and to change over time. there are often secret options! bullshit blends right in
this isn't an ai-hate post; if there were llms that would only tell you true things, or at least only attested things, they'd be super interesting when dealing with arcane formats like these
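to make the "attested" idea concrete, here's a rough sketch of the kind of check a program (or a support person) could run: compare the ini against the set of options the code actually reads and flag everything else. the section and option names below are invented purely for illustration, not taken from any real application.

```python
import configparser

# hypothetical set of options the program actually reads;
# these names are made up for illustration only
KNOWN_OPTIONS = {
    "render": {"vsync", "max_fps", "texture_quality"},
    "network": {"port", "timeout_seconds"},
}

def report_unknown_options(path):
    """Warn about ini settings that nothing in the program ever reads."""
    parser = configparser.ConfigParser()
    parser.read(path)
    for section in parser.sections():
        known = KNOWN_OPTIONS.get(section.lower())
        if known is None:
            print(f"unknown section: [{section}]")
            continue
        for option in parser.options(section):
            if option not in known:
                print(f"unknown option in [{section}]: {option}")

if __name__ == "__main__":
    report_unknown_options("settings.ini")
```

a check like this won't tell you whether a setting does what the user thinks it does, but it at least catches options that don't exist at all, which is exactly the kind of thing a hallucinated config tends to contain.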
realising that llms have not made computational linguistics research obsolete and that it's still an interesting field i should be excited about attempting to participate in. hmm.
@hikari oh, the things I would've stuffed in my config.sys and autoexec.bat had the internet been more accessible back in the day