Hacker Times

> It's worth noting that GPT-4 internally uses a Mixture of Experts (MoE) model with 8 'experts' internally

Has this actually been confirmed, either officially by OpenAI or otherwise? As far as I know, George Hotz claimed this once, and since then everyone just assumed it was the truth without actually waiting for any sort of verification.
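Whether or not the GPT-4 rumor is true, the mechanism being claimed is sparse Mixture-of-Experts routing, which is well documented in the literature. Below is a minimal toy sketch in NumPy of top-k gating over 8 "experts" (here just random linear maps) — purely illustrative, not OpenAI's implementation, and all names are made up:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(x, experts, gate_w, k=2):
    """Route input x to the top-k experts by gate score and
    combine their outputs, weighted by the renormalized gate."""
    scores = softmax(gate_w @ x)             # one score per expert
    top = np.argsort(scores)[-k:]            # indices of the top-k experts
    weights = scores[top] / scores[top].sum()
    return sum(w * experts[i](x) for i, w in zip(top, weights))

# toy setup: 8 "experts", each a small random linear map
rng = np.random.default_rng(0)
d = 4
experts = [lambda x, W=rng.normal(size=(d, d)): W @ x for _ in range(8)]
gate_w = rng.normal(size=(8, d))

y = moe_forward(rng.normal(size=d), experts, gate_w, k=2)
print(y.shape)  # (4,)
```

The point of the sparsity is that only k of the 8 experts run per token, so compute stays roughly constant while parameter count grows with the number of experts.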



TL;DR: This might work, but it will be like watching Groundhog Day. It will take many iterations, make plenty of mistakes along the way, and won’t remember a thing.

My naive understanding of layers in a model is that each layer loosely acts as an expert in one step of the entire process.

For example, in an object recognition model, one layer takes on the task of separating objects from the background, another excels at knowing the colour of different things, and another might learn the difference between a blue sky and the colour sky blue.
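That layered-pipeline intuition can be sketched as plain function composition, each stage a toy "specialist" that hands its result to the next. This is only an illustration of the idea; real network layers are learned, not hand-written, and every function here is invented for the example:

```python
# Each "layer" is a toy specialist; the model is their composition.
def separate_foreground(image):
    # stage 1: crude mask separating object pixels from background
    return {"pixels": image, "mask": [p > 0.5 for p in image]}

def extract_color(features):
    # stage 2: summarize the intensity of the masked-in object
    obj = [p for p, m in zip(features["pixels"], features["mask"]) if m]
    features["mean_intensity"] = sum(obj) / len(obj) if obj else 0.0
    return features

def classify(features):
    # stage 3: a decision based only on the previous stage's summary
    return "bright object" if features["mean_intensity"] > 0.7 else "dim object"

def model(image, layers=(separate_foreground, extract_color, classify)):
    out = image
    for layer in layers:
        out = layer(out)
    return out

print(model([0.1, 0.9, 0.8, 0.2]))  # bright object
```

Each stage only sees what the previous one produced, which is roughly the sense in which a layer can "specialize" in one step of the whole task.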

So essentially, we’re trying to mimic the same working model at a higher level of abstraction. Similar to how our body is made of atoms: many atoms make a molecule; many molecules make amino acids, proteins and organic tissue that perform more complex operations. Skip all the way up and you have a human being. Human beings in numbers can put a man on the moon, create antimatter and nuclear explosions.

In theory, this can work just as well: one agent to provide a solution, another to critique it, another to verify it, another to mimic the end user. The big obvious missing piece is memory and the ability to learn while doing (at least in existing LLMs).
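The propose/critique/verify division of labour can be sketched as a simple loop. Everything here is hypothetical scaffolding — `call_llm` is a stand-in for whatever chat-completion API you actually use, and the demo below plugs in a canned fake model so the sketch runs:

```python
def solve_with_agents(task, call_llm, max_rounds=5):
    """Propose a solution, loop critique/revise until the critic
    says OK (or rounds run out), then ask a verifier for a verdict."""
    solution = call_llm("You are a solver. Produce a solution.", task)
    for _ in range(max_rounds):
        critique = call_llm("You are a critic. List flaws, or say OK.",
                            f"Task: {task}\nSolution: {solution}")
        if critique.strip() == "OK":
            break
        solution = call_llm("You are a solver. Revise per the feedback.",
                            f"Task: {task}\nFeedback: {critique}")
    verdict = call_llm("You are a verifier. Reply PASS or FAIL.",
                       f"Task: {task}\nSolution: {solution}")
    return solution, verdict

# demo with a canned fake model instead of a real API call
def fake_llm(role_prompt, task):
    if role_prompt.startswith("You are a critic"):
        return "OK"
    if role_prompt.startswith("You are a verifier"):
        return "PASS"
    return "draft solution"

solution, verdict = solve_with_agents("sort a list", fake_llm)
print(solution, verdict)  # draft solution PASS
```

Note that nothing persists between calls to `solve_with_agents` — which is exactly the Groundhog Day problem: every run starts from the same frozen state.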

It’s like having a software development team that’s frozen in time and knowledge. These LLM agents will always require some micromanagement and hand-holding, and they will waste a lot of resources on failed attempts, which they will repeat every time you ask them to perform the task.



