Hacker News

Thank you. It seems to me that every time a new state-of-the-art result in AI is announced, a deep composition of convolutional or recurrent "neural functional programs" is involved.

I don't see deep compositions of treenets or word embedding layers, which tend instead to be used stand-alone as simpler models or as preprocessing layers for deep networks. I'd have to think about attentional models.

This is not a criticism. Rather, it's my way of suggesting that we need more experimentation with more interesting compositions using a broader range of "neural functional programs" -- which I believe is also one of your points.
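To make the "layers as composable functions" framing concrete, here is a minimal, hypothetical sketch (not from the essay; all names are illustrative): a convolutional block and a recurrent block are each just array-to-array functions, and a deep network is their composition.

```python
import numpy as np

def conv1d(x, w):
    """Valid 1-D convolution followed by a ReLU nonlinearity."""
    k = len(w)
    out = np.array([x[i:i + k] @ w for i in range(len(x) - k + 1)])
    return np.maximum(out, 0.0)

def rnn(xs, w_h, w_x):
    """A minimal recurrent layer: fold a tanh step over the sequence."""
    h = 0.0
    for x in xs:
        h = np.tanh(w_h * h + w_x * x)
    return h

def compose(*fs):
    """Right-to-left function composition, like (f . g) in Haskell."""
    def composed(x):
        for f in reversed(fs):
            x = f(x)
        return x
    return composed

# A toy "conv -> recurrent" pipeline built purely by composition.
w = np.array([0.5, -0.25, 0.5])
model = compose(lambda xs: rnn(xs, w_h=0.9, w_x=0.1),
                lambda x: conv1d(x, w))
y = model(np.arange(8, dtype=float))
```

The point of the sketch is only that once each block has a function type, "more interesting compositions" become ordinary program construction: swapping, nesting, or mapping blocks over structures.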

And again, I think your essay is fantastic.

--

Edits: changed my wording to express what I actually meant to write.


