Hacker News | past | comments | ask | show | jobs | submit | trueno's comments

these big names at the top (thiel, musk, etc) ive really just started to tune them out. they're all bored, have too much money, and are obsessed with futurism & getting people to follow them into their lame visions of the future at all costs. they're p much entirely decoupled from the economic plights you and i face, they just play a different game altogether and any potential gamble for savings is never framed as a "this can make things better for everyone" but more or less just funneled through ... uh, i dont know shareholder opportunity or something.

i don't doubt there's plenty of upside in agriculture/farming to be had with technology, i just no longer find it meaningful when people from this social class are trying to talk about them. something is really off putting now when silicon valley types try to be authority figures on completely different industries, it's super presumptuous. think they've lost the plot quite a bit here, i dont think anyone should be interested in their ideas of the future at all. they've done enough damage. all these dudes ever needed was to go to therapy, all we need now is for them to leave us alone. the incessant need to be the guy with the big ideas these guys are constantly displaying is just so exhausting at this point. wish they'd just go buy a beach and drink liquor out of coconuts and disappear, no one needs to move fast and break things and shake cows


This isn't a Thiel / Musk idea though .. not in the slightest.

This is a decade-old New Zealand idea, spawned ground-up from dairy-farmer-adjacent New Zealanders, that recently sought funding and is now being backed by Thiel et al.

I'm saddened to see them now in the pocket of SV venture caps .. good for them for fast, easy, large amounts of development and FU money .. but ultimately likely a loss for the AU/NZ dairy industries, who will wake up to being controlled via subscription services piped through offshore clouds and having their own cows held to ransom and potentially forced to walk off cliffs by bored hackers or indignant Iranians.


yea there's so many ways through this now. golang and wails is great, rust and tauri is great. both seem to not feel like the slug that is electron because they just use whatever native web view your os has.

for the dedicated more native stuff dioxus is kinda cool if you don't want a web stack in the mix.

i'm enjoying golang and wails though paired with whatever front end i want, all apps i've made perform excellent on windows. bottom line = yeah i can't really think of a scenario where i personally would ever write an app for windows specifically.

i, like you, used to get hung up on native vs web framework. i'd encourage you to give it a go, possible you concede that mayhaps the native thing isn't as important as you thought.


that is actually sick.

how common is it for go devs to experience leaking goroutines? id like to think go is a lot less shoot-yourself-in-the-foot here since they provide a framework for concurrency/parallelism rather than you working with the tiny pieces of it and building out the architecture yourself, but ive only needed to use goroutines once and it was a pretty problem-free experience.


It depends how much the code uses manually spawned goroutines, and how complex the lifecycle of these goroutines is… in big codebases such as kubernetes, docker, etc, it has been a problem. There have been research papers and blogs about this, but most Go developers are not aware of this issue it seems.

> how common is it for go devs to experience leaking goroutines

About as often as leaking memory in C++


F. again i have minimal experience actually ever needing it in go, but guessing this is just generally the exercise of managing the lifecycle of a goroutine well in your code? proper handling so things dont get orphaned in buffer, fire and forget woopsies, etc.

early on i do feel like go kinda advertised batteries included concurrency but i kinda wished they advertised the foot-shooting-mechanisms and gaps in the abstraction a little more. overall i prefer to have enough control to choose how to manage the lifecycle. mem leaks bum me out and kill my steam, at least from my experience with c/cpp.
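for anyone curious what that fire-and-forget woopsie actually looks like: here's a minimal sketch (function names and the timings are made up for illustration, not from any real codebase). `leaky` parks a goroutine forever on an unbuffered channel send nobody receives; `fixed` is the usual escape hatch, a buffered send plus a context so the goroutine always has a way out.

```go
package main

import (
	"context"
	"fmt"
	"runtime"
	"time"
)

func expensive() int { return 42 }

// leaky fires a goroutine whose unbuffered channel send blocks forever
// because the caller returns without ever receiving -- an orphan.
func leaky() {
	ch := make(chan int)
	go func() {
		ch <- expensive() // nobody receives; parked until process exit
	}()
	// caller "forgets" to read ch and returns
}

// fixed gives the goroutine two exits: a buffered send that can never
// block, and a context so it bails if the caller gives up first.
func fixed(ctx context.Context) {
	ch := make(chan int, 1)
	go func() {
		select {
		case ch <- expensive():
		case <-ctx.Done():
		}
	}()
}

func main() {
	before := runtime.NumGoroutine()
	for i := 0; i < 10; i++ {
		leaky()
	}
	time.Sleep(100 * time.Millisecond) // let the goroutines park
	fmt.Println("leaked:", runtime.NumGoroutine()-before)

	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
	defer cancel()
	fixed(ctx) // exits on its own either way
	time.Sleep(100 * time.Millisecond)
}
```

`runtime.NumGoroutine` makes the leak visible without any tooling, which is also roughly how goleak-style test helpers catch this in practice.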


i played with this a bit the other night and ironically i think everyone should give it a shot as an alternative mode they might sometimes switch into. not to save tokens, but to... see things in a different light.

its kind of great for the "eli5", not because it's any more right or wrong, but sometimes putting it in caveman presents something to me in a way that's almost like... really clear and simple. it feels like it cuts through bullshit just a smidge. seeing something framed by a caveman on a couple of occasions peeled back a layer i didnt see before.

it, for whatever reason, is useful somehow to me, the human. maybe seeing it laid out to you in caveman bulletpoints gives you this weird brevity that processes a little differently. if you layer in caveman talk about caves, tribes, etc it has sort of a primal survivalist way of framing things, which can oddly enough help me process an understanding.

plus it makes me laugh. which keeps me in a good mood.


Interesting point! Based on what you said, in a way caveman does save your human brain tokens. Grammar rules evolve in a particular environment to reduce ambiguities, and I think we are all familiar enough with caveman for it to make sense to all of us as a common register. For example, word order carries semantics in modern English, so "The dog bit the grandma" and "Dog bit grandma" mean the same thing even without the articles. In languages where cases carry semantics (like German), word order alone does not resolve ambiguity. Articles exist in English due to its Germanic roots.

Now I want to try programming in pigeon English

A pidgin is just a simplified form of language that hasn't evolved into its own new language yet. There are many English pidgins.

> Apple has a monopoly over the "M-chip" personal computer market

lmao what? the "M-chip" is literally their chip that they designed, built relationships with TSMC over and bankrolled into production to put in their products. literally hardware by apple for apple. this was a decade plus long thing in the making, this is the risk/gamble apple took and invested heavily into. that is apples innovation.

any other manuf is free to go do this themselves for their own devices, they just didn't and for the most part still don't. that just like isn't a monopoly at all, i'm amused you even got to that point in the first place. seems to carry some broad misunderstandings of what the M-series chips are, or an assumption that cpus are supposed to be shared to any interested parties just because that was intels business model. intel was historically slacking & their one-size-fits-most approach wasn't meeting the engineering requirements apple was after generation after generation, so apple took the cpu destiny into their own hands and made their own.

if you feel like non-apple laptop chips aren't living up to that kind of perf/ppu.... well yeah you'd be right. but that's not really apples fault. that's not a monopoly thing, like at all. either laptop manufs need to go make their own chip (unlikely) or intel/qualcomm/etc need to catch up.


ive gone down this rabbit hole and i dunno, sometimes claude chases a smoking gun that just isn't a smoking gun at all. if you ask him to help find a vulnerability he's not gonna come back empty handed even if there's nothing there, he might frame a nice-to-have as a critical problem. in my exp you have to build tests that prove vulnerabilities in some way. otherwise he's just gonna rabbithole while failing to look at everything.

ive had some remarkable successes with claude and quite a few "well that was a total waste of time" efforts with claude. for the most part i think trying to do uncharted/ambitious work with claude is a huge coinflip. he's great for guardrailed and well understood outcomes though, but im a little burnt out and unexcited at hearing about the gigantic-claude exercises.


i emailed jensen huang at the very tail end of the maxwell era and p much begged for maxwell support on macos. i didnt expect a reply, especially since i guessed his email based on some "how to find ceo emails" google search result.

he actually did reply weeks later and said "i didnt realize people wanted this, my team has added them. go check now". pretty sure that was the last time nvidia drivers came to macos.

there's a lot of assumptions made with this topic, particularly the assumption that apple is blocking them. at least in my experience the opposite was true, nvidia just flat out wasn't making them. however i don't doubt the truth lies somewhere in between: nvidia and apple have a pretty much nonexistent relationship now. i dont know whats required here but i also don't doubt apple makes this experience suck butt for any interested parties.


Imagine 2026 Jensen "OpenClaw is the greatest software ever" Huang responding to emails from mere mortals

you raise a good point that nvidia support on apple silicon via egpu is probably in much higher demand due to openclaw

nvidia employees: please fwd!


we still have an informix db for an old early-2000s application we have to support. shit runs on centos5 lmao. it's actually not too bad, around v12 there's cdc capabilities (requires you to build your own agent service to consume the data) that made the exercise of real-time replicating the app db into our edw a cakewalk. which ironically has greatly extended the lifespan of the application since no one has to query informix directly anymore.

ibms docs and help sites suck butt tho.


mate nobody wants unwarranted tips. have you guys lost your mind

always noticed way too much reverence for satya because of what he did to valuations. i personally cant stand azure/365/all of it. i reserve no reverence for satya, he's playing a game in a class/world none of us can even relate to so i don't even see the point in talking about his achievements. looking back, it unequivocally sucks that microsoft acquired github.

