It's hilarious that not a single one of these has pricing listed anywhere public.
I don't think they expect anyone to actually buy these.
Most companies buying these for developers would ideally have multiple people share one machine, and that sort of arrangement works much more naturally with a managed cloud machine than with the tower format presented here.
Confirming my hypothesis, this category of device is more or less absent from the used market. The only DGX workstation on eBay has a GPU from 2017, several generations ago.
Nvidia doesn’t list prices because they don’t sell the machines themselves. If you click through each of those links, the prices are listed on the distributor’s website. For example, the Dell Pro Max with GB10 is $4,194.34, and you can even click “Add to Cart.”
Because that's a different price point: that one is getting near $100K, and availability is very limited. I don't think they're even selling it openly, just to a bunch of partners...
The MSI workstation is the one with some pricing floating around. Some distributors are quoting USD 96K with a wait time of 4 to 6 weeks [0]; others say 90K and also out of stock [1].
'Important' people in organizations get them. They either ask for them, or the team that manages the shared GPU resources gets tired of their shit and they just give them one.
The typical inference workloads have moved quite a bit in the last six months or so.
Your point would have been largely correct in the first half of 2025.
Now, you're going to have a much better experience with a couple of Nvidia GPUs.
This is for two reasons: reasoning models require a fairly high number of tokens per second to do anything useful, and we are now seeing small quantized and distilled reasoning models working almost as well as the ones needing terabytes of memory.
Yeah it's certainly unimaginable that the civilization that invented gunpowder, cannons, guns, rockets a thousand years ago can make it for cheap now :)
"Hypersonic missile" makes it sound like alien technology. It isn't: it's solid boosters that don't follow the usual ballistic trajectory, plus a computer from the 1970s.
The raw materials cost less than half of a standard car.
That's pretty much the entire point of what people are calling hypersonic missiles. All ballistic missiles fly at hypersonic speeds. The advance is being able to do so at low altitude with maneuverability.
You are correct, but I should point out that Russia has described its Kinzhal missiles as hypersonic, when they are really more of a traditional ballistic missile fired horizontally. So very fast (Mach 10), but not as maneuverable as what the U.S. has been calling hypersonic.
Since the original story here does not provide many details, we can't know which side of that fence this falls on (assuming it is real).
Was there any evidence that the Kinzhals fired, for example, toward Kyiv during the current conflict were fired on a depressed trajectory? I remember reading one account that looked like a plain old interception of a ballistic missile. (which is impressive enough to someone who remembers when "Patriot missile" was not exactly synonymous with excellence)
> That's pretty much the entire point of what people are calling hypersonic missiles.
Most missiles endowed with the "hypersonic" moniker are simply theater ballistic missiles used for standard ballistic missile things, which is part of why I asked the question.
> The advance is being able to do so at low altitude with maneuverability.
Hate to burst your bubble but arms dealers and governments are as capable as anyone else of marketing spin.
> Mach 5, high maneuverability, inside the atmosphere.
Out of these, Mach 5 and inside the atmosphere have been doable for several decades. Pretty much all countries that make missiles can make missiles with these two characteristics.
My point, which you seem to either misunderstand or deliberately misrepresent, is the other one - "maneuverability" - being the distinguishing factor for what we call hypersonic missiles. That makes these difficult to defend against.
Think of it like calling humans hyper-limbed animals, but limbs being not what really distinguishes humans from, say, chimpanzees.
C++ standards follow a tick-tock schedule for complex features.
For the `tick`, the core language gets an un-opinionated iteration of the feature that is meant for compiler developers and library writers to play with. (This is why we sometimes see production compilers lagging behind in features).
For the `tock`, we try to get the standard library improved with these features to a realistic extent, and also fix wrinkles in the primary idea.
This avoids the standard library having to rely on compiler magic (languages like Swift are notorious for this), so in practice all libraries can leverage the language to the same extent.
This pattern has been broken in a few instances (std::initializer_list), and those have been widely considered to have been missteps.
Regarding your mention of compiler magic and Swift, I don’t know much about the language, but I have read a handful of discussions/blogs about the compiler and the techniques used for its implementation. One of the purported benefits/points of pride for Swift that stood out to me and I still remember was something to the effect of Swift being fundamentally against features/abstractions/‘things’ being built in. In particular they claimed the example of Swift not having any literal types (ints, sized ints, bools, etc) “built in” to the compiler but were defined in the language.
I don’t doubt your point (I know enough about Swift’s generic resolution crapshow during semantic analysis to be justified in assuming the worst), but can you think of any areas worth looking into that expand on the compiler magic issues?
I have a near reflexive revulsion for the kinds of non-composability and destruction of principled, theoretically sound language design that tends to come from compiler magic and shortcuts, so always looking for more reading to enrage myself.
I'd hope this community of all places would understand that "just integrate X with Y" is never as simple as "just". It's still something a team needs to do, and the gain is minimal unless Epic is also going to try and make their own console-esque device. That's the incentive for Steam.
Going by the Steam hardware survey, 3/4 of Linux users were not using Steam Decks when they got polled. So I’m not sure a console-esque device is actually required. A large part of the reason Linux usage is growing is probably that it mostly just works these days.
Yes, it's not the optimal business decision for a software company to invest in hardware. The clear move is to either grease Microsoft's palms, or let them outright acquire Steam (or Valve as a whole). Valve not doing that is either partly ideological, or partly very long-term thinking about the best financial path later, instead of now.
But at the same time: while the end was "be independent from Microsoft", their means at first were very Microsoft-esque. Partner up with hardware vendors, make some PCs with Steam built in, and brand them as such. Didn't work. Their goal had to be to roll their own hardware, because that's what was needed to get the ball rolling (along with a form factor that accompanied a desktop instead of competing against one).
The problem for an also-ran app store is that you need every user you can find.
Linux support may not be a huge deal in the overall market (although it's growing due to the SteamOS devices), but it's one more element of Steam's moat.
Agreed. The level of aggressive gatekeeping is just crazy; take the Linux ARM mailing list, for example. I found the Central and Eastern Europeans particularly aggressive there, and I'm saying this as one myself. They sure do like to feel special there, with very few soft skills.
This will likely be alleviated when AI-first projects take over as the important OSS projects.
For these projects, everything "tribal" has to be explicitly codified.
On a more general note: this is likely going to have a rather big impact on software in general - the "engineer the company cannot afford to lose" is likely losing their moat entirely.
In the small, it's still a meritocracy. A patch like this is obviously correct, and I expect it to get in on the first try (maybe with a formatting fix by the maintainer).
For large works, the burden shifts, since you are increasing the maintenance load. Now we have the question of who will do the future work, and that requires judgement of the importance of the work and/or the author, and hence is a fundamentally political question.
> It does not seem very "viral" or income-generating
Yeah because that should be the end goal of everything right?
And from the response:
> 1. re: the first part, many people want something plug and play. and even if they were plug and play, the problem is that the user experience (on windows at least) with online drives generally sucks, and you don't have disconnected access.
Bingo. I am a really quite experienced Linux user (I've been using it since it came on two floppy disks and didn't really work) and I too want things that are just plug-and-play. Time spent dicking about making things work is time not spent doing something fun, although I get that for some folk their goal in using Linux is to "Be Using Linux". For most of us I suspect that extends out to "Be Using Linux to solve problems we actually have, not just be using Linux for the sake of it".