Hacker News

Chaos isn't intelligence. Chaos is unmanageable growth in your solution space, the opposite of what you want.


What's confusing to me is the dual use of the word entropy in both physical science and in communication theory. The local minima are somehow stable in a world of increasing entropy. How do these local minima ever form when there's such a strong arrow of entropy?

Certainly intelligence is a reduction of entropy, but it's also certainly not stable. Just like in cellular automata (https://record.umich.edu/articles/simple-rules-can-produce-c...), loops that are stable can't evolve, while loops that are unstable have too much entropy.

So, we're likely searching for a system that's metastable within a small range of input entropy (physical) and output entropy (information).
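The stable-vs-unstable loop point can be seen in a tiny Conway's Game of Life sketch: a "block" still life is a fixed point of the update rule (stable, so it never changes), while a "blinker" cycles. The grid representation and the patterns are my own example choices, not from the linked article.

```python
from collections import Counter

def step(cells):
    """One Game of Life step; cells is a set of (x, y) live coordinates."""
    # Count how many live neighbors each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in cells)}

block = {(0, 0), (0, 1), (1, 0), (1, 1)}   # 2x2 still life: stable, can't evolve
blinker = {(0, -1), (0, 0), (0, 1)}        # oscillator: changes every step

assert step(block) == block                # fixed point of the dynamics
assert step(blinker) != blinker            # not a fixed point; it flips orientation
```

A still life is "safe" but frozen; patterns that keep changing carry information forward, which is the tension the comment is pointing at.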


There are theories and evidence that your brain operates hovering on the edge of the phase transition to chaos.

https://en.m.wikipedia.org/wiki/Critical_brain_hypothesis


If you have any system that tries to gravitate to a local minimum, it is almost impossible not to reproduce Newton's fractal with it. Classical feed-forward network learning looks a lot like Newton's method to me. Please take a look at https://en.m.wikipedia.org/wiki/Newton%27s_method
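For concreteness, here's the iteration the comment is referring to: Newton's method repeatedly moves x by -f(x)/f'(x) until it lands in a root's basin of attraction (coloring starting points by which root they converge to is what produces Newton's fractal). The test function f(x) = x^2 - 2 is my own example, not from the thread.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x <- x - f(x)/f'(x) until the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Find a root of f(x) = x^2 - 2, i.e. sqrt(2), starting from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Which root you converge to depends sensitively on x0; the boundary between those basins is the fractal, which is why "gravitate to a local minimum" systems so easily exhibit it.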



