> Rewriting it in C can prove that the problem is not in your programming language
Rewriting it in C is a lot of work. It's doable, but impractical, and such a rewrite may introduce new bugs.
Proving that the problem wasn't in my language wasn't necessary: there was no room for it to introduce that kind of bug. Language bugs usually manifest themselves in a different way (like broken binary code leading to crashes, or invalid code being accepted). That's why I created my question in the first place: I was sure it wasn't a language bug.
> Outdated OS can be the problem as well.
An outdated OS can't be the problem here. Basic socket functionality was implemented eons ago, and all known bugs should be fixed even in fairly outdated versions of the kernel.
> What kind of advice did you expect?
Eventually I found out myself that my test program was creating too many connections for the TCP server, and its internal connection queue was overflowing. But it took some time to find, far longer than if my SO question had been answered. The problem wasn't that hard to spot, considering that I described exactly what I was doing. Providing code examples wasn't even necessary; the textual description was enough.
It could very well have been that your language was the issue, and writing a small proof of concept for the specific problematic use case in a battle-tested language other people know is a standard troubleshooting step. Especially if it was a rare limit-related error; that sounds like a trivial thing to implement in C with a simple for loop and perhaps some dummy data.
Same with the OS. Just because socket functionality is decades old doesn't mean you can't hit a recently introduced bug that has already been fixed in a newer version. That's also a standard troubleshooting step.
It doesn't become bad advice just because you're too lazy to do it.
Getting Java to run is a base requirement for running most software written in Java.
However, there is a dedicated Dockerfile for creating a native image (Java's term for a standalone binary) that shouldn't require a JVM. I haven't tested running the binary myself, so it's possible there are dependencies I'm not aware of, but I'm pretty sure you can just grab the binary out of the container image and run it locally if you want to.
It'll produce a Linux image, of course; if you're on macOS or Windows, you'd have to create a native image for those platforms manually.
Isn't a Docker image basically a universal binary at this point? It's a way to ship a reproducible environment with little more configuration than setting an ENV var or two. All my local stuff runs under a Docker Compose stack, so I have a container for the DB, a container for Redis, LocalStack, etc.
I'm not saying it's ideal, just that it's what we've shifted to for repeatable programs. Your "universal" Linux binary certainly won't run directly on your Mac either...
Except for compatibility, and the biggest gap there is browser support, which is in the process of being closed. Chrome has shipped JXL support behind a flag, and Firefox is in the process of implementing support.
In Chrome you can enable JXL from here:
chrome://flags/#enable-jxl-image-format
It is much better than JPEG. However, because still images are not really a problem in terms of bandwidth and storage, we just use bigger JPEGs when we need more quality; the extra complexity and breaking of standards isn't worth it.
This is different for video: since video uses a whole lot more bandwidth and storage, we are more willing to accept newer standards.
That's where WebP comes from. The idea is that images are like single-frame videos, so we could use a video codec (WebM/VP8) for still images and it would be more efficient than JPEG.
That's also why JPEG XL is taking so long to be adopted. Efficient video codecs are important, browsers want to support WebM, and they get WebP almost for free. JPEG XL is an entirely new format just for still images; it is complex, and unlike with video, there is no strong need for a better still-image format.
IMO most of JPEG XL's value is in its feature set. Having one format that can do transparency, proper HDR, and is efficient for everything from icons to pixel art to photographs is a really strong value proposition. JPEG, WebP, and AVIF each have some of these, but none have all. (AVIF is the closest, but as a video codec it still has pretty significant overhead for small images like icons.)
I use Opus 4.6 with pi.dev (one agent). I give detailed instructions on what to do; I essentially use it to implement things the way I would manually. Small commits, every commit tested and verified. I don't use plan mode, I just let the agent code, since reviewing the code is faster than reading a plan. This approach only works if you make small changes. I build a mental model of the code the same way as when writing it manually.
Some people on my team code with AI without reading the code. That's mostly a mess: no mental model, lower quality. They are really proud of it, though, and think they're really smart. Not sure how this will turn out.
It feels to me like junior devs don't even understand what they need to learn. They just use agentic coding to get things done, without any deeper knowledge.
The worst part is, they think they know exactly what they need to learn, and also think they can make good decisions.
Nothing against AI; this is just to inform people about the quality, maintainability, and future of this library. No human has a mental model of the code, so don't waste your time building one: the original author didn't either.
I ban the use of AI-generated code in my projects. At least one of my projects uses libxml, and someone could propose switching to an alternative. A label would make it easier to avoid this library.
"Involves" and "made only by" are two different things.
I use agentic coding in my daily work. I build a mental model of the code I write, and I also test the code, exactly the same way as when it's written completely manually.
I understand it can be difficult to label, and there's an inconveniently large grey area here, but there is a difference between plain vibe-coded software and software built with AI under the control, direction and validation of a developer.
The distinction is probably not very important for small applications, as nobody cares whether a minor script or a one-shot data processor has been vibe-coded, but for large applications it surely matters in the long term.