Hacker News | new | past | comments | ask | show | jobs | submit | bognition's comments

The decision to block all downloads is pretty disruptive, especially for people on pinned known-good versions. It's breaking a bunch of my systems that are all launched with `uv run`.

> It's breaking a bunch of my systems that are all launched with `uv run`

From a security standpoint, would you rather pull in a compromised library and run a credential stealer? This seems like exactly the intended, and best, behavior.


You should be using build artifacts, not relying on `uv run` to install packages on the fly. Besides the massive security risk, it also means that you're dependent on a bunch of external infrastructure every time you launch. PyPI going down should not bring down your systems.

This is the right answer. Unfortunately, this is very rarely practiced.

More strangely (to me), this is often addressed by adding loads of fallible, partial caching (in e.g. CI/CD or deployment infrastructure) for package managers, rather than building and publishing ephemeral per-user or per-feature packages for dev/testing to an internal registry. Since the latter is usually less complex and more reliable, it's odd that it's so rarely practiced.


There are so many advantages to deployable artifacts, including auditability and fast rollback. You can also block many risky endpoints from your compute's outbound networks, which means that even if you are compromised, it doesn't do the attacker any good if their C&C is not allowlisted.

Are you sure you are pinned to a “known good” version?

No one initially knows how much is compromised


That's PyPI's behavior when they quarantine a package.

Take this as an argument to rethink your engineering decision to base your workflows entirely on the availability of an external dependency.

That's a good thing (disruptive "firebreak" to shut down any potential sources of breach while info's still being gathered). The solve for this is artifacts/container images/whatnot, as other commenters pointed out.

That said, I'm sorry this is being downvoted: it's unhappily observing facts, not arguing for a different security response. I know that's toeing the rules line, but I think it's important to observe.


Known good versions? And which are those, exactly?

Oh, this is super cool. Coming from Java, I've long missed JDBI and Rosetta, which make writing SQL queries in Java a dream. I've toyed with a similar-style interface for Python, and looking at this gives me hope I can achieve it.
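For what it's worth, the JDBI-ish style translates fairly naturally to Python. Here's a minimal stdlib-only sketch of the idea (the `query` decorator and all names are hypothetical, not an existing library): SQL lives in a decorator, and rows are mapped onto the function's annotated dataclass return type.

```python
import sqlite3
from dataclasses import dataclass
from typing import Callable

@dataclass
class User:
    id: int
    name: str

def query(sql: str) -> Callable:
    """Bind a SQL string to a function; each result row is mapped onto
    the function's annotated dataclass return type."""
    def wrap(fn: Callable) -> Callable:
        row_type = fn.__annotations__["return"]
        def run(conn: sqlite3.Connection, **params):
            rows = conn.execute(sql, params).fetchall()
            return [row_type(*row) for row in rows]
        return run
    return wrap

@query("SELECT id, name FROM users WHERE name = :name")
def find_users(conn, *, name: str) -> User: ...

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ada"), (2, "bob")])
print(find_users(conn, name="ada"))  # [User(id=1, name='ada')]
```

The function body stays empty; the signature and return annotation are the whole mapping contract, which is roughly what JDBI's SqlObject style does in Java.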

Interesting, where are you running into trouble buying meat from local farmers? I often visit rural farms that have farm stores, and nearly all of them have refrigerators and freezers with meat to buy.


My take on the OP is that it's the commitment to an idea that matters, not how quickly it's created. I love seeing people insta-clone things, but is this a side project that's going to see updates for a few weeks, or something that's going to be actively maintained for years to come?


It is the site's job to make documentation available to the users, no?

It’s so odd for a tech-focused crowd to be so opposed to newer technology.

Users are getting used to natural-language search; not having it will be perceived as friction.

Users are increasingly turning to agentic coding tools, and those tools do best when documentation is available via an MCP server. Not having one will make it harder for people to use your product.


I'm not opposed to the idea of natural language search, my point is the tools should be on the user side. Right now, I can ask questions about plain text pages that haven't been updated in 30 years directly in Firefox with no effort from the site operator. If an agent needs to have direct access to documentation, it's trivial for it to download pages autonomously (or even set up its own MCP server). There's literally no reason to demand that millions of sites independently add features that browsers and agents already have better and more uniform versions of.


I can understand why this may seem simple, but when it comes to the brain almost nothing is simple.

Choline is a key component of acetylcholine, the primary neurotransmitter used in your hippocampus. It's an excitatory neurotransmitter, meaning it turns neurons on. The hippocampus is a massive parallel feedback circuit that, when overstimulated, can and will begin to seize. In fact, many people who suffer from seizures have overactive hippocampal circuitry. Simply "flooding" the brain with more choline could have very, very bad effects.

Likewise, taking choline might not work, since the brain actively controls and regulates the contents of the cerebrospinal fluid. Unlike in the rest of your body, the capillaries in the brain are not leaky; they are enshrouded in the blood-brain barrier, and there are active transport proteins for anything that isn't lipid-soluble.

Choline is actively transported into the brain and the brain has additional internal mechanisms to regulate the levels of choline.

Lastly, neurotransmitters aren't just floating around in the soup of your brain. They are released by specific neurons which are integrated into specific circuits. Parkinson's disease is a perfect example here: there is a tiny region of the brain, rich in dopamine neurons, involved in regulating voluntary movement. In Parkinson's these neurons die off while the rest of the brain remains relatively intact. Simply putting dopamine into the brain doesn't fix the issue; you need to increase the dopamine released by these specific neurons.

The treatment here is L-dopa, a precursor to dopamine, which does exactly this. But once those neurons are gone, they're gone, and there is little we can do to stop the disease.

So if this works for L-dopa, why won't it work for choline? My guess is the tight regulation the brain keeps around choline levels, since that regulation is needed to prevent the hippocampus from seizing up.


Hard agree with this article. Split up your application by domains, create public APIs between modules, understand your dependency tree and keep it clean.

The devil in the details is how you pull something like this off. At the end of the day it boils down to how you enforce that your team does the right thing. You can have a single person who enforces standards with an iron fist, but this doesn't scale. You can teach everyone how it should work, but you're going to experience drift over time as people come and go. Or you can enforce it with technology and automation.

In the case of the first choice, it's going to restrict how big your team can get and will end up eating all of that one person's time.

In the case of the second choice, a combination of the tragedy of the commons and regression to the mean will degrade the system into spaghetti code.

For the third scenario, language choice matters a lot. In Java, with a multi-module Maven build, you can set up Maven to forbid imports of specific module types, letting you mark modules as private or public. In Python you can't do any of this.
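That said, you can approximate it in Python with a lint step in CI. A toy sketch (the module names and allowlist policy here are hypothetical; real projects often reach for a dedicated tool such as import-linter instead) that walks a source tree and flags forbidden cross-module imports:

```python
import ast
from pathlib import Path

# Hypothetical policy: orders may import billing, but not the reverse.
ALLOWED = {("orders", "billing")}

def violations(root: Path) -> list[str]:
    """Return every cross-module import not covered by ALLOWED."""
    modules = {p.name for p in root.iterdir() if p.is_dir()}
    found = []
    for py in root.rglob("*.py"):
        module = py.relative_to(root).parts[0]
        for node in ast.walk(ast.parse(py.read_text())):
            if isinstance(node, ast.Import):
                targets = [a.name for a in node.names]
            elif isinstance(node, ast.ImportFrom) and node.module:
                targets = [node.module]
            else:
                continue
            for target in targets:
                top = target.split(".")[0]
                if top in modules and top != module \
                        and (module, top) not in ALLOWED:
                    found.append(f"{py}: {module} -> {top} not allowed")
    return found
```

Run it in CI and fail the build on any output; the rule set lives in code review like everything else, so it scales past the single iron-fisted enforcer.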


As with all of these things, it needs to come from the very, very top, otherwise it doesn’t work.

In the end, AWS only happened because of Jeff Bezos’ infamous “all intra-team communication now goes over HTTP, no exceptions, or you’re fired”-email.

The decision to prefer modules whenever they do the job, deferring to microservices only when they don't, seems like the kind of mantra that needs to come from the CTO and be made part of the company culture's DNA.


I think the real win with microservices is that being air-gapped between services really forces modularity and independence. In monoliths, things almost always slowly degrade into tangled dependencies: a clean interface that once required just a few specific parameters got lazy at some point, and now the entire customer or order (or whatever) object is passed in, and oops, the two are coupled.

This is of course still possible with a microservices architecture, but the barrier to changing a REST contract/API is usually much higher, and people think a lot more about what is being passed across the interface, since that data is going over a wire.

Theoretically there is no difference, but it's just far easier to slip when it's one codebase and all it takes is one reviewer a little too "LGTM"-happy to let it through.
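The slip looks something like this in practice (a hypothetical sketch, not from any real codebase): the narrow function declares exactly what it needs, while the lazy one takes the whole domain object and silently couples shipping to the order module's schema.

```python
from dataclasses import dataclass

@dataclass
class Order:
    id: int
    total_cents: int
    customer_email: str
    # ...plus dozens more fields shipping shouldn't care about

def shipping_cost_narrow(total_cents: int) -> int:
    # Explicit, stable contract: depends on exactly one value.
    return 0 if total_cents > 5000 else 499

def shipping_cost_lazy(order: Order) -> int:
    # Reaches into Order; any schema change can now break shipping,
    # and the dependency is invisible at the call signature.
    return 0 if order.total_cents > 5000 else 499
```

Behind a REST boundary the lazy version would mean serializing the whole order into the request body, which is usually awkward enough that someone pushes back in review.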


Not sure if you’re being coy or pushing the trope, but you’re right: guns don’t actually kill people. The bullets and blood loss tend to do that.

That said, denying people access to guns does result in fewer gun-related fatalities.


But note that this is deeply unethical and amounts to denying people the right to self-defense, and by extension, denying people the right to freedom.

The only thing that happens when guns are denied is the rise of corrupt and totalitarian politicians, and that the common man is either oppressed by them or by criminal gangs. Enter guns, and the common man can defend himself against both, which is a blessing.

Thankfully it is now possible to print your own guns and build one easily, so this will become less of an issue in the future, when everyone, if they want, can carry concealed guns.


> the rise of corrupt and totalitarian politicians

Doesn't seem to track in the US


Could you provide some examples of these things actually happening? Many of the most stable, most free, least corrupt countries have strict gun laws.


I live in a country with an aggressive gun lobby, and a ludicrously corrupt government that wants to create a fascist ethnostate. The gun nuts are largely on the side of the fascist ethnostate. I don’t think guns are the perfect defenders of freedom you think they are.


The fall didn’t kill him, it was the landing.


The landing didn't kill him, it was the impact.


The impact didn’t kill him, it was the organ failure and blood loss.


Blood loss didn’t kill him, ischemia did?


It's an interesting idea, and the engineer in me agrees with you. Then the product skeptic says that anyone who wants to ride a bike as a form of exercise and get around using a bike would probably just ride a normal bike.


1. I'd like to ride somewhere and get a workout, but I might go farther than I normally can and rely on ebike assist on the way back home.

2. There are rides I do where I'm going for the exercise. There are potential rides I want to do but can't, because I can't afford to be covered in sweat when I arrive. I don't necessarily want to have TWO bikes, and maybe I don't mind getting a workout on the way home, where I can promptly take a shower.


I think it's an incredibly common use case in places where biking is the default transit mode. I like biking for exercise, but I don't necessarily want to be a sweaty mess after my commute for work, or when lugging my kids around, or when getting groceries.


Yup, the Venn diagram of people who buy an electric bike (i.e. don't want to work hard to push pedals) and people who want an exercise bike (where the only thing going on is pedal pushing, no moving) has exactly one person in the intersection: OP.


I own both! I use my ebike for shopping and visiting my parents who live on a steep hill near the base of a mountain range. And I have an exercise bike in my office to stretch my legs in the winter.


Why not?

In my Dockerfiles I use `uv sync` to install deps instead of `pip install -r requirements.txt`.

And then set my command to `uv run my_command.py` instead of calling Python directly.
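A minimal sketch of that setup (base image tag, script name, and layer ordering are illustrative, not prescriptive):

```dockerfile
FROM python:3.12-slim

# uv ships as a standalone binary; copying it from the official image
# is one common way to get it.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv

WORKDIR /app

# Copy only the dependency metadata first so this layer caches well.
COPY pyproject.toml uv.lock ./
# --frozen fails the build if the lockfile is out of date, so the image
# only ever contains the pinned, known-good dependency set.
RUN uv sync --frozen

COPY . .
CMD ["uv", "run", "my_command.py"]
```

Because the deps are resolved at build time from the lockfile, the running container never reaches out to PyPI, which also sidesteps the download-blocking issue discussed upthread.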

