Hacker Times | new | past | comments | ask | show | jobs | submit | jmfldn's comments | login

I'm somewhere in between. I'm excited about building more things faster and extending my capabilities. But I also love thinking about the underlying language, runtime, algorithms, the wider system. I want LLMs to enhance this for me, I want my understanding to go up as I write less code. It's also key to my job as a lead that I maintain understanding of the system for debugging, security etc.

So if I can do both with these tools, then great. I want to cognitively offload in a way that allows me to focus on the important bits. And I'm writing instructions to the LLM to help me do that, e.g. 'help teach me this bit'. A builder and tutor at once.


Great approach.

I usually use ChatGPT, as a chat (as opposed to an agent).

It explains everything quite well (if sometimes a bit verbose).

Today, I am going to do an experiment: I’ll be asking it to rewrite the headerdocs in one of my files (I tend to have about a 50% comment/code ratio), so it generates effective DocC documentation. I suspect the result will be good.


This is a profound category error. What Altman reduces to a 20-year 'training' cycle fueled by 'energy' is what we, in the actual world, call life. It is a stunningly hollow perspective that uses the language of industrial output to describe the human experience. While he is likely being provocative to keep his product at the center of the cultural conversation, it probably exposes something about him.


Exactly why we need to rid ourselves (by taxes) of billionaires. Those people have way too much power, and are often stupid dumbasses who just got rich randomly (right moment at the right place, or because their parents were rich in the first place), but are mostly spewing stupid lunacies


This is a super disingenuous take. He was very obviously making a specific point, not trying to express a perspective on the value of humanity.


I understand he’s making a technical point about efficiency, but language isn't neutral and I think it betrays something deeper. It's such a glib and shallow point too that I think it should be called out since he has a track record of saying some incredibly shallow things about AI, people, politics, and everything really.


The meaning of a message is what has been understood.


Can you please make your substantive points without being snarky or condescending? Your comment would be fine without that last bit.

https://qht.co/newsguidelines.html


The meaning of a message is what is intended + communicated, assuming those intentions were communicated clearly.

Willfully interpreting otherwise (especially uncharitably so) is the very definition of being disingenuous, which is pretending to not know what was really meant.


I disagree: if a message is open to such disingenuous interpretations, then its meaning has not been formulated clearly enough. I use the rule: (1) say what you will communicate, (2) communicate, (3) say what you have communicated; also the six W's...


No one communicates that way. It's not practical. Almost all expressions can be uncharitably interpreted by a listener who doesn't like you, and thus has a motive to quote your sentences and disingenuously pretend you're saying something much more dastardly than you clearly intended.


"In 2025 finally almost everybody stopped saying so."

I haven't.


Some people are slower to understand things.


That is why they need artificial intelligence.


Well exactly ;)


If it moves and connects with you then it's real music.

It's fine to have a preference for live musicianship, but the 'real music' argument has been leveled against every new musical technology (remember the furore around Dylan going electric?). It dismisses contemporary creativity based on a traditionalist bias that elevates one form of execution above all others. There's also a huge amount of skill in producing good electronic music. It's always hard to make good music no matter the means.


Seeing all the experienced engineers on this thread who feel discouraged is really depressing.

Not because I disagree with you, I don't! It's because I fear a gradual brain drain of people who actually love their craft and know how to build things properly. I fear we'll end up with worse software that's simply 'good enough', built atop a pile of AI slop.

If it's cheaper but with acceptably worse results, I fear this is good enough for a lot of companies.


A fun read, but wildly implausible. Perhaps there are other frontier technologies out there that get us even a fraction of this. But if we're talking this time horizon, I assume we mean LLMs or some other related thing? Are you joking?


LLMs are the visible tip, but underneath we have multimodal models, agent frameworks, robotics integration, and rapidly falling compute costs. Frontier tech rarely looks plausible at first: flight, the internet, even smartphones did not.

The point is not that LLMs themselves take us to 2125, but that they are the spark in a chain of exponential advances that will.


Sure, maybe you're right. I'm just so underwhelmed by what I see in my day job that it's hard to map error-prone and limited deep learning tools to what is being described here.

I don't see a strong argument here, more just a hope that something will spark this sci-fi trajectory you describe. I'm sure big enough changes are afoot, but I think that the AI we have now will turn out to be much more of a 'normal' technology than most people expect.


I love his attitude, an example to us all. Woz knows what's important.


Best compromise is a markdown file. You can read it with Obsidian if you want a better GUI, but you can also just treat it like a simple text file if you prefer. No lock-in to an app.

I agree that complex todo apps are a bit of a waste of time.
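A minimal sketch of what such a file might look like (the filename, headings, and tasks here are purely illustrative). The `- [ ]` checkbox syntax is plain markdown that Obsidian and many other editors render as interactive checkboxes, but it stays perfectly readable as raw text:

```markdown
# todo.md — plain text anywhere, checkboxes in Obsidian

## Today
- [ ] Review PR feedback
- [x] Email the landlord

## Someday
- [ ] Learn some Scala
```

Ticking a task is just changing `[ ]` to `[x]`, so the file keeps working even if you abandon every app that reads it.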


Markdown + the Vim-wiki plugin is a really powerful combo that is still just markdown underneath.

And yes, you can combine that with something like Obsidian at any time.


The issue is probably less training data for Scala. Imagine how much Python is out there.


Depends what we mean by better. If you prefer rock music to Bach then great. Enjoy! I love popular music and classical for different reasons.

But if we're talking skill, intellectual depth, craft, then there are objective criteria. Take Bach: his music is like a masterpiece of engineering, with unparalleled compositional complexity and craftsmanship, his mastery of counterpoint being but one example. His work represents a pinnacle of musical architecture, establishing foundational principles that profoundly influenced centuries of Western music.

That just doesn't compare to most pop music does it?


Counterpoint is cool, but a lot of the time it carries the emotional weight of listening to someone solve sudoku.

Objectively, Bach lacks the skill and emotional depth to write a song about that lonely feeling you get when you drink too much and get kicked out of the party (a foundational principle of Country Western music).


> Bach lacks the skill and emotional depth to write a song about that lonely feeling

He wrote for a wide range of such feelings, some of which one can regard as "lonely", as they develop, achieve a triumph, a catharsis, and finally a recapitulation and a comforting, secure resolution -- communication and interpretation of human experience and emotion, i.e., art.

https://www.youtube.com/watch?v=NEUYq5t-cCM

Bach wrote it for solo violin, but it's been arranged for solo piano, full orchestra, etc.

