
> Would some hypothetical future AI just "know" that tomorrow it's going to be 79 with 7 mph winds, without understanding exactly how that knowledge was arrived at?

I think a consciousness with access to a stream of information tends to filter out the noise to find the signal. In those terms, being able to "experience" real-time climate data and "instinctively know" which variable is headed in which direction would come naturally.

So, personally, I think the answer is yes. :)
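To make the "filtering out the noise" idea concrete, here's a toy sketch (all names and numbers are hypothetical, not from any real climate feed): an exponential moving average applied to a noisy stream of readings recovers the underlying trend direction, even though no single raw reading reveals it.

```python
# Toy sketch (hypothetical data): smoothing a noisy real-time stream
# with an exponential moving average so the underlying trend --
# "which variable is headed in which direction" -- becomes visible.
import random

def ema(stream, alpha=0.1):
    """Yield an exponentially weighted moving average of the stream."""
    avg = None
    for x in stream:
        avg = x if avg is None else alpha * x + (1 - alpha) * avg
        yield avg

random.seed(0)
# Noisy readings around a slowly rising baseline: 60 + 0.05*t + noise.
readings = [60 + 0.05 * t + random.gauss(0, 2) for t in range(200)]
smoothed = list(ema(readings))

# The smoothed series exposes the upward drift the raw noise hides.
print(smoothed[-1] > smoothed[0])  # True
```

The point isn't the particular filter; it's that a system consuming the smoothed output can "know" the trend without ever seeing the raw, noisy inputs.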

To elaborate a little more: when you think of a typical LLM, the answer is definitely no. But if an AGI is composed of something akin to "many component LLMs", then one part might well have no idea how the information it receives was actually determined.

Our brains have MANY substructures between neuron -> "I", and I think we're going to start seeing/studying a lot of similarities between how our brains are structured at a higher level and where we get real value from multiple LLM systems working in concert.
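A minimal sketch of that "components in concert" idea (every component here is hypothetical): a low-level part sees the raw data, a high-level part receives only a distilled summary, so the top layer "knows" the answer with no access to how it was derived.

```python
# Toy sketch (hypothetical components): a pipeline where the final
# part receives only a distilled conclusion, not the raw data or the
# procedure that produced it -- analogous to one "component LLM"
# consuming another's output.

def sensor_component(raw_readings):
    """Low-level part: sees every raw data point."""
    return sum(raw_readings) / len(raw_readings)

def forecast_component(summary):
    """High-level part: receives only the summary, never the raw
    readings or the averaging step that produced it."""
    return f"Expect around {summary:.0f} degrees tomorrow."

raw = [78, 81, 79, 80, 77]
# The "I" layer never sees `raw`; it just "knows" what it was handed.
print(forecast_component(sensor_component(raw)))
```

The separation is the point: the top-level component's "knowledge" is real and useful, yet it contains no trace of the upstream computation.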


