
> It doesn't have any intentions

Yeah, and in a way it's even worse than that, since there's another layer of cognitive illusion: "It" doesn't exist.

The LLM algorithm is an ego-less document-generator, often applied to growing a document that resembles dialogue between two fictional characters.

So when your human-user character is "asking" the AI assistant character to explain its intentions, that's the same as asking a Count Dracula character to describe what it "really feels like" to become a cloud of bats.

You'll see something interesting, but it'll be whatever fits the trained story-patterns, not anything a mind actually introspects or perceives.
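The "growing a document" framing above can be sketched in a few lines. This is a minimal illustration with made-up names (`render_transcript`, `fake_next_tokens` are hypothetical, and the canned continuation stands in for actual token sampling), not any real chat API:

```python
# Sketch of the point above: a "chat" with an LLM is really one flat
# text document that the model keeps extending. There is no persistent
# "assistant" object anywhere, only the transcript itself.

def render_transcript(turns):
    """Flatten (role, text) pairs into the single document the model sees."""
    return "".join(f"{role}: {text}\n" for role, text in turns)

def fake_next_tokens(document):
    """Stand-in for the model. A real LLM would sample likely next tokens
    given the document so far; here we return a canned continuation."""
    return "Assistant: As an AI assistant, my intention is to help.\n"

turns = [("User", "What are your real intentions?")]
doc = render_transcript(turns)
doc += fake_next_tokens(doc)
# The "assistant" that just answered is not an entity being queried;
# it is more text appended to the same document.
```

The "answer" about intentions is generated by the same next-text machinery as everything else in the transcript, which is the sense in which asking the assistant character to introspect is like asking the Dracula character about becoming bats.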


