random3's comments | Hacker News

Thanks! This summarizes it:

> Overall, the work lacks a self-consistent and transparent accounting of resources, making its central claims difficult to substantiate and leaving a strong sense of sensationalism and hype, rather than honest scientific exposition.

"Clowns to the Left of Me, Jokers to the Right"


You are being disingenuous with your selective quoting.

Here is what the authors actually say w.r.t. the criticisms (all the comments are worth reading):

Our primary emphasis is ECC-256. Elliptic curve cryptography is widely deployed in modern systems, e.g., internet security and cryptocurrency.

For ECC-256, the space-efficient architecture uses 9,739 qubits with < 3-year runtime, the balanced architecture uses 11,961 qubits with < 1-year runtime, and the time-efficient architecture uses ~19,000 qubits with ~52-day runtime (or ~26,000 qubits with ~10-day runtime using higher parallelism). Space and time overheads are reported together within each architecture, not mixed across regimes.

The claim that our scheme requires 117 years selectively cites RSA-2048 under the most space-constrained architecture, which is one corner of a trade-off space we present clearly in Figure 3 of the work. We include RSA-2048 for completeness, and state explicitly that its runtimes are one to two orders of magnitude longer.

We believe our clearly labeled trade-offs constitute exactly the transparent resource accounting the commenter calls for.

Best regards,

Maddie, Qian, Robert, Dolev


These papers are as relevant to the engineering/product stage as every other "new battery" or "new cancer" treatment is at the moment.

E.g.

4 days ago (same paper) https://www.reddit.com/r/science/comments/1s2bjqp/a_new_hafn...

3 years ago https://www.reddit.com/r/singularity/comments/14d1dt5/why_ar...

5 years ago https://www.quora.com/Would-it-be-worthy-to-use-memristors-t...


> These papers are as relevant to the engineering/product stage as every other "new battery" or "new cancer" treatment is at the moment.

The growth of battery and a few other technologies has been frustrating in the past decade. But I wouldn't put cancer research alongside those. It's not every day that we encounter improvements in cancer treatments. The important fact to note is that cancer survival rates have improved significantly in the past few decades. Though I'm worried that the current political climate will scuttle that progress.


Battery advances in the last couple of decades have also been incredibly impressive.

I'm curious. Are they in the market in large enough volumes yet? I've been waiting for ages for something better than Li-Ion and LiPo tech to become widely available. We need much higher energy densities, and preferably without the fire hazard.

Lithium-Ion density has basically doubled over the last decade. And the cost has dropped by 2/3.

See figure 2 here: https://rmi.org/the-rise-of-batteries-in-six-charts-and-not-... and the data here: https://elements.visualcapitalist.com/charted-lithium-ion-ba...

(Both of these are a couple of years old. I'm sure there's newer data out there that looks even better.)

Newer battery chemistries are slowly arriving, but they mostly aren't replacing Li-ion because Li-ion is getting better all the time. Except in specific circumstances, like the Sodium-Ion ones that work far better at low temperatures and entered mass production two months ago:

https://carnewschina.com/2026/01/22/catl-unveils-worlds-firs...


Yes, cancer, batteries, and computing have seen impressive progress.

Yet, in general, the correlation between results "on paper" and results in practice only shows up over a long period of time, if at all.

It doesn't mean that new results aren't good, just that they may not translate into something practical very soon.


Making money is what companies are for. The rest are non-profits.

The “do it first, apologize later” approach will be the general principle with anything. It's going to be hard and futile to prove, even if they don't do it through the ToS first. Amazon has one of the largest corporate training sets out there :)

lol - the long tail of international standard dissent in the US

Ring buffers never get old. Here’s a useful mention of some of the most extensive technical work, by the LMAX team over 15 years ago: https://martinfowler.com/articles/lmax.html
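For a sense of the core idea (the Disruptor itself is a heavily engineered Java library; the sketch below is a rough, hypothetical Python illustration, not LMAX's code):

    # Minimal single-producer / single-consumer ring buffer.
    # Capacity is rounded up to a power of two so wrapping an index is a
    # cheap bitmask rather than a modulo -- a trick the Disruptor also uses.
    class RingBuffer:
        def __init__(self, capacity):
            size = 1
            while size < capacity:
                size <<= 1
            self.mask = size - 1
            self.slots = [None] * size
            self.head = 0   # next slot to read
            self.tail = 0   # next slot to write

        def push(self, item):
            if self.tail - self.head > self.mask:
                raise IndexError("ring buffer full")
            self.slots[self.tail & self.mask] = item
            self.tail += 1

        def pop(self):
            if self.head == self.tail:
                raise IndexError("ring buffer empty")
            item = self.slots[self.head & self.mask]
            self.head += 1
            return item

    rb = RingBuffer(4)
    rb.push("a"); rb.push("b")
    assert rb.pop() == "a"

The real Disruptor layers pre-allocated entries, per-consumer sequence counters, and lock-free coordination on top of this basic shape, which is where the interesting engineering lives.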

Apple just launched a $600 amazing laptop and the top models have massive performance. What are we talking about here?

I don't think personal computers will go away, but I think the era of "put it together yourself" commodity PC parts is likely coming to an end. I think we're going to see manufacturers back out of that space as demand decreases. Part selection will become more sparse. That will drive further contraction as the market dries up. Buying boxed motherboards, CPUs, video cards, etc, will still exist, but the prices will never recover back to the "golden age".

The large PC builders (Dell, HP, Lenovo) will continue down the road of cost reduction and proprietary parts. For the vast majority of people pre-packaged machines from the "big 3" are good enough. (Obviously, Apple will continue to Apple, too.)

I think bespoke commodity PCs will go the route, pricing-wise, of machines like the Raptor Talos.

Edit: For a lot of people the fully customized bespoke PC experience is preferred. I used to be that person.

I also get why that doesn't seem like a big deal. I've been a "Dell laptop as a daily driver" user for >20 years now. My two home servers are just Dell server machines, too. I got tired of screwing around with hardware and the specs Dell provided were close enough to what I wanted.


There are upsides here as well! I think of things like the NUC or Mac Mini - ATX is from 1995, and I'm hopeful computers will become nicer things as we trend away from the bucket-o-parts model.

I'm very excited about the Steam Machine for the reasons you mention - I want to buy a system, not a loose collection of parts that kind-of-sort-of implement some standard to the point that they probably work together.


What are the upsides? You only listed a few things that you like, but not why they should take over all parts of the PC market. The only factor I can think of is size, but those small all-in-one computers are already widely available now without the need to hollow out the custom PC market.

There's nothing wrong with ATX or having interchangeable components. An established standard means that small companies can start manufacturing components more easily and provide more competition. If you turn PCs into prepackaged proprietary monoliths, expect even fewer players on the market than we have now, in addition to a complete lack of repairability and upgradability. When you can't pick and choose the parts, you let the manufacturer dictate what you're allowed to buy in what bundles, what spare parts they may sell to you (if any) and what prices you will pay for any of these things. Even if you're not building custom PCs yourself, the availability of all these individual components is putting an intrinsic check on what all-in-one manufacturers can reasonably charge you.


The above post is making a case that the market will implode. I think there's a chance that's really gonna happen. I'm trying to find a silver lining. If the parts market survives that'd be awesome, but there's a real chance this is the beginning of the end.

That I agree with. I'm just also making the point that the silver lining had always existed, since similar fully-integrated products go back decades. The end seems inevitable to me now, and there's no good to be found there. We already had everything. Now is when that starts to be taken away.

I'm thinking of this like car radios. Most cars used to have this standard called DIN to put the radio in. Most cars today don't have DIN mounts anymore. We've gotten way nicer, bigger touch screens in our infotainment now since cars are not locked into one form factor. On the other hand, it sucks in some ways because vendor lock in. I hope we at least get a tradeoff like that - that there will be something in return for it.

There are systems like the NUC but if I want a super-high-end 5090 and top-end CPU, all of the options to cool them feel like... well, something kluged together from whatever parts I can find, not something that's designed as a total system. Maybe we'll get some interesting designs out of this.


I'm afraid the acceptance (and, more troubling, the seeming desire on the part of technical people whom I see as misguided) of mobile computers in the smartphone form factor being locked down and hostile to their owners has moved the Overton window toward personal computers being equally owner-hostile. The bucket-of-parts PC ecosystem is less susceptible to an effort to lock down the platform and create walled gardens. If that market goes away, it gets easier to turn all of our personal computers into simply computer-shaped devices (like Chromebooks and iPads).

I'm really fearful that PCs are going down the road of locked bootloaders, running the user-facing OSs inside bare-metal hypervisors that "protect" the hardware from the owner, etc.

I'll accept that I'm likely under the influence of a bit of paranoia, too.

I'm strongly of the opinion several unaffiliated factions (oligarchs, cultural authoritarians, "intellectual property" maximalists, software-as-a-service providers, and intelligence agencies, to name a few) see unregulated general purpose computers in the hands of the public as dangerous.

I don't think there's an overt conspiracy to remove computing from the hands of the public. The process is happening because of a confluence of unrelated goals.

I don't see anybody even remotely comparable in lobbying power standing up for owner's rights, either.


8GB isn't an "amazing" laptop, it's a budget laptop. It's also thermally constrained quite a bit, so not even as "amazing" as it could be.

The point about Apple is that everyone from Zoom, Slack, etc. will be forced to optimize for that 8GB. (Same as getting rid of the awful Flash player.)

Many people need only a basic device for Netflix, YouTube, Google Docs, email, or searching for and buying flight tickets. That will be amazing.

Many have a job-supplied laptop/desktop for great performance (made rubbish by AV scanners, but that's a different issue).


>(Same as getting rid of the awful Flash player.)

I was looking up an old video game homepage the other day for some visual design guidance. It was archived on the Wayback Machine, but with Flash gone, so was the site. Ruffle can't account for every edge case.

Flash was good. It was the bedrock of a massive chunk of the Old Net. The only awful thing is the people who pushed and cheered for its demise just so that Apple could justify their walled garden for the few years before webdev caught up. Burning the British Museum to run a steam engine.


Flash was a dumpster fire on MacOS. Apple probably would have supported it on the iPhone if Adobe had stopped it from crashing apps and made it performant on Apple's primary platform at the time (the Mac).

I remember pulling up crash logs for people, showing them that Flash was in every one of the Safari crashes they wanted me to fix. I told them it was out of my hands.


All the other browsers managed it fine. That sounds like a Safari problem. Which would be totally in line with Apple's modus operandi.

No, they didn't. It was straightforwardly unsafe and broken; the heaps of effort that went into supporting it were largely just to paper over that fact. It's no accident that the other browser vendors went along with dropping support so quickly after Apple did.

The reason given for blocking Flash on iOS at the time was that it was too CPU-intensive on mobile, which impacts battery life. Not that it was "unsafe and broken".

The main reason other browsers stopped supporting Flash was that websites stopped being built with Flash because iOS didn't support it, and a lot of people thought that mattered even though iOS had (and still has) a small market share worldwide.


> He cited the rapid energy consumption, computer crashes, poor performance on mobile devices, abysmal security, lack of touch support, and desire to avoid "a third party layer of software coming between the platform and the developer".

Sure, he's laying out a case for the app store they'd later introduce, but it wasn't simply CPU and battery. There's a reason I cited crash logs as the primary thing I remembered about how it affected me. It gave me an immediate reason to share with people about why I couldn't fix Safari crashes when Flash was involved, which made that aspect of my job easier to explain.

https://en.wikipedia.org/wiki/Thoughts_on_Flash


Did you use Netscape?

Safari was a crap browser. I mean, it still is, but it used to be, too.

It was awful, especially if you used an open-source OS like Linux. It slowed the computer, fans at full speed. Wtf are you smoking?

It's cute that you think any of that optimization will happen just because Apple crapped out a budget laptop.

It is saddening that you are unaware that technology companies follow Apple's devices more carefully than other OSes.

They don't. There are myriad laptops out there that are nothing like Apple's products, not even trying to be like them at all.

You can get PCs in formats Apple has never ventured to make and never will.

And let's not forget that Apple copies everyone else.

Folding phone? How long did it take until Apple "invented" the folding phone market? Oh, that's right, they don't have one (yet, but they are followers after all).

VR Goggles? Also late to that race, and they did a stupid thing with it.

Even the iPhone was really, really late to the game (and no, I don't care about their ~30% worldwide mobile market share).

Apple fanboys love to think that Apple invented all the things and everyone else copied them, but those are fanboy thoughts.

Sorry, Apple simply is not the technology leader you think they are.


But I don't want a $600 amazing laptop, I want a powerful desktop x86 machine with loads of RAM and disk space. As cheap as it was a couple of years ago.

Not sure about the memory, but Xeon Scalable/Max ES/QS chips and their boards are still not horribly expensive.

Prior to the crunch, you could have anything from 48-64 cores and a good chunk of RAM (128GB+). If you were inordinately lucky, 56 cores and 64GB of onboard HBM2e was doable for 900-1500 USD.

They’re not Threadrippers or EPYCs, but sort of in between - a server chip that can also make a stout workstation too.


> As cheap as it was a couple of years ago.

I also want housing as cheap as it was a couple of years ago.


You can have both. You just have to undo the forced bail-in of Millennial and Gen-Z/Alpha/Beta productivity to cover the debts and lifestyles of Silent Gen/Boomer/Gen-X asset holders. The insanity of contemporary markets doesn't reflect anything natural about the world's economic priorities, but instead the privileging of the priorities of that cohort. They've cornered control until enough people call bullshit. So, call bullshit.

x86 going away wouldn't be surprising. Ignoring David Patterson was a mistake to begin with.

Looking at AMD's x64 server offerings, I don't see why that would go away.

But I can imagine that it would become less prevalent on personal machines, maybe even rare eventually.


Reading some of the doomer comments in this thread feels like taking a glimpse into a different world.

We're out here with amazing performance in $600 laptops that last all day on battery and half of this comment section is acting like personal computing is over.


Two different populations — those interested in computing, and those interested in computers.

Personal computing and IBM PC clones are not the same thing. The fall of PC clones can happen while other personal computing devices continue to be produced. The $600 laptop is not a PC.

Apple laptops are PCs (Personal Computers). They are not IBM PCs. But IBM hasn't made PCs in years, and there hasn't been any IBM PC hardware to clone in years.

They don't run the software I want to run (Linux, Windows games) and/or with the performance I want.

Raspberry Pi is way cheaper than those things, and I'm sure you could hook one up with an all-day battery for $100-200. Doesn't mean it's "better".


They trade blows, performance-wise, with the M1 MacBook Pro sitting on my desk. And there's nothing stopping Asahi Linux from running on them except for driver support. They look like fantastic machines.

They’re not ideal for all use cases, of course. I’m happy to still have my big Linux workstation under my desk. But they seem to me like personal computers in all the ways that matter.


Asahi Linux is NOT an option and may never be one: the A18 Pro (and M4) introduced SPTM (Secure Page Table Monitor), which runs at a higher privilege level (GXF EL2) than the OS kernel. Unlike M1/M2/M3, where m1n1 can directly chainload Linux, on the A18 Pro/M4 the page table infrastructure is owned by SPTM and must be initialized by XNU before anything else can run. You cannot bypass it. (source: https://github.com/rusch95/asahi_neo)

So AI won't surpass humans, because Chris Lattner can do better than a model that didn't exist two years ago?

Can you elaborate?

are you familiar with tesla? i'm not super familiar, but am aware of their public things. they introduced fake marketing products called full self driving and autopilot that don't do those things. apparently this person karpathy was in charge of computer vision there. he led the team that is responsible for these systems that occupy our roads and can't navigate due to such outstanding occurrences as sunlight, precipitation, and fog.

i don't know a great deal about the guy. i know: he worked at tesla, led autopilot there. if we ignore the character defects required to work at tesla, he's responsible for designing systems that would certainly kill people because they decided lidar was too expensive.


Building a tech and falsely advertising it as something other than what it is (e.g. self-driving instead of driving assistance) can typically be done by different people. Lacking specific evidence, it's reckless to accuse this person.

right. i'm mostly ignorant of the subject and rushing to judgement based on bias. but he did lead the computer vision team for years at tesla that created autopilot. didn't resign in protest and to my knowledge hasn't apologized, but again i'm ignorant and not seeking new data.

IDK what Dijkstra believed in terms of how programmers should have looked, but he did seem to have a sense (and taste) of a direction for programming that was lost within practicing software engineering and their preferred PLs.

My own incomplete opinion is that the net effect is that we ended up writing orders of magnitude more code than necessary to solve the problems at hand. It's the equivalent of doing the computations manually instead of using a calculator. This has led to an industry that has served us well, but strictly speaking it was never necessary and much more could have been achieved with a fraction of the resources.


While there is certainly some amount of unnecessary junk code out there, your claim that it could be reduced by an order of magnitude isn't even close to correct. In general the only way to write less code is to use higher level abstractions. The problem, of course, is that those abstractions are always leaky and using them tends to make certain required features too slow or even impossible to build at all. There is no free lunch.
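To make "leaky" concrete, here are two tiny illustrations (a sketch in Python, chosen arbitrarily; the same story plays out in any language):

    # float presents itself as a real number, but the underlying base-2
    # representation leaks through in ordinary arithmetic:
    assert (0.1 + 0.2) != 0.3   # rounding error, not a bug

    # list presents itself as a generic sequence, but its contiguous-array
    # implementation leaks through as asymmetric costs:
    xs = list(range(1_000_000))
    xs.append(0)       # O(1): nothing moves
    xs.insert(0, 0)    # O(n): every element shifts one slot

Neither abstraction is badly designed; the implementation details simply refuse to stay hidden once performance or exact results matter.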


As programmers we like to use all this jargon like "leaky abstraction", but we never bothered to understand it beyond the PL paradigms we use. There's no formal definition, which simply makes these good terms to abuse and throw into conversations to make our points.

Why are the abstractions leaky? Are all abstractions leaky? Why? We simply accept the situation without spending any real effort.

"There's no free lunch" - this is representative of the level of argument in software circles entirely. But WTF does that mean? If the lunch is not free, how cheap or expensive can it get and why?

This is why, as engineers, we tend to brush off the Dijkstras as arrogant, while at the same time ignoring both our arrogance and ignorance.


A leaky abstraction is like obscenity: I know it when I see it. It's impossible to define the concept in a rigorous way, and yet it impacts everything that we do.

You're simply wrong to claim that we accept the situation without spending any real effort. In reality the more experienced developers who build abstraction layers tend to spend a lot of time trying to prevent leaks, but they can't have perfect foresight to predict what capabilities others will need. Software abstractions often last through multiple major generations of hardware technology with wildly different capabilities: you can't prevent those changes from leaking through to higher levels and it would be foolhardy to even try.


I understand your position and I think it's the norm. Yet I find it difficult to comprehend how it's not self-evidently absurd.

Do you feel like software transcends physics, mathematics, and logic? Because that's what the statement translates to.

The only reason it's impossible is that nobody tries, because trying to do so would interfere with the deliverables of the next sprint. The software industry has painted itself into a corner.


Physics is full of leaky abstractions. Solid? Leaky abstraction (melting). Ideal gas? Leaky abstraction (van der Waals). Molecule? Leaky abstraction (chemical reactions). Atom? Leaky abstraction (ionization, fusion, fission, alpha and beta decay). Proton? Leaky abstraction (sometimes you have to care about the quarks).


Check out Urs Schreiber if you want to get over it


"Software people are not alone in facing complexity. Physics deals with terribly complex objects even at the "fundamental" particle level. The physicist labors on, however, in a firm faith that there are unifying principles to be found, whether in quarks or in unified field theories. Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary.

No such faith comforts the software engineer. Much of the complexity he must master is arbitrary complexity, forced without rhyme or reason by the many human institutions and systems to which his interfaces must conform. These differ from interface to interface, and from time to time, not because of necessity but only because they were designed by different people, rather than by God."

- Fred Brooks, No Silver Bullet


No, I feel like software developers are unable to predict the future. Mathematics and logic aren't much help with that and physics barely enters into it.


>"There's no free lunch" - this is representative of the level of argument in software circles entirely. But WTF does that mean?

You cannot have a thing without doing the work to build it. You don't get the better abstraction without implementing it first. Your proof in theory is just that, until it's exercised and the divergence from the ideal to the real world is finally realized. I can teach a programmer all manner of linguistic trickery to allow them to exploit all sorts of mathematical abuse of notation. None of that makes a cotton-picking, salt-licking bit of difference if, at the end of the day, your symbolic proof isn't translatable to machine code that runs and maps successfully onto the operational space of an implementation of a computing device. If you give me a program written in the form of a Shakespearean sonnet (an example of a focus on radical novelty in encoding a program without regard to analogy), say, I still need a bloody compiler that'll turn that into something capable of running within the constraints of the machine, and the other primitives to make it work. That's TANSTAAFL. If you break from what exists, you still have to reroot and establish a parallel basis of operation that covers the primitive operations you're familiar with.

Dijkstra might be right. There's something liberating about staying in the realm of the formal and mathematical. His detractors were also right. He is so damn far above everyone else that everybody in the room has trouble understanding just what it is he's going on about. At the end of the day, teach what the greatest number of people there can firmly mentally grip, and pass that on. The geniuses like Dijkstra will quickly outgrow it and excel. They don't need the help. Everyone else, on the other hand, does.

I wouldn't be opposed to trying Dijkstra's approach myself: shattering my current understanding of the practice of programming and working more from a formal-methods POV. That comes after a career which has been fruitful, and was rooted in the old way which worked quite well for many others educated at the same time I was. I already know I can do it. His method just changes the emphasis. Though I will note, with alarm, that his reticence to test is disturbing. If he does assume everything is provable from the get-go, then I suppose you don't need tests; but that's hardly the way anything in the world actually bloody works. That's math in a vacuum, with spherical cows. Not writing code and then realizing "Shit, the processor in the machine I'm writing for doesn't support that primitive, or has a glitchy implementation thereof".

Software engineering isn't programming for people who can't; it's a set of practices and know-how, battle-hardened and tested through time, for navigating a niche field and actually guaranteeing some semblance of a chance of success in a field shaped by development so fast that the logic of 6 months ago seems antiquated. For that time with Moore's Law in full swing, yeah, radical novelty might have been justifiable; but it ultimately didn't push past the test of time. It can be as clever a hack as you can imagine, but if no one else can follow it... you haven't condensed it to a teachable form.



> IDK what Dijkstra believed in terms of how programmers should have looked,

https://qht.co/item?id=47373080

