Hacker News | paul's comments

One of my favorite quotes: “There are two ways of constructing a software design: One way is to make it so simple that there are obviously no deficiencies, and the other way is to make it so complicated that there are no obvious deficiencies.”

I think about this a lot because it’s true of any complex system or argument, not just software.


This is indeed a great quote (one of many gems from Sir Tony) but I think the context that follows it is also an essential insight:

> The first method is far more difficult. It demands the same skill, devotion, insight, and even inspiration as the discovery of the simple physical laws which underlie the complex phenomena of nature. It also requires a willingness to accept objectives which are limited by physical, logical, and technological constraints, and to accept a compromise when conflicting objectives cannot be met. No committee will ever do this until it is too late.

(All from his Turing Award lecture, "The Emperor's Old Clothes": https://www.labouseur.com/projects/codeReckon/papers/The-Emp...)


"At first I hoped that such a technically unsound project would collapse but I soon realized it was doomed to success. Almost anything in software can be implemented, sold, and even used given enough determination. There is nothing a mere scientist can say that will stand against the flood of a hundred million dollars. But there is one quality that cannot be purchased in this way-- and that is reliability. The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay."

This explains quite a lot, actually!


Very poignant, thank you. I can see my absolute core principle, KISS, reflected in this. I still struggle to find a single case in my career where it wouldn't have been the best approach, especially long term.


From the linked lecture, which I printed out to read as part of a new less-is-more screen-time regime (I now print out longer-form writing for reading), I found this very interesting tidbit, in the context of Tony having made a delivery miscalculation and his team failing to deliver one of their products; I think that's where a lot of people are today with LLMs:

"Each of my managers explained carefully his own theory of what had gone wrong and all the theories were different. At last, there breezed into my office the most senior manager of all, a general manager of our parent company, Andrew St. Johnston. I was surprised that he had even heard of me.

"You know what went wrong?" he shouted--he always shouted -- "You let your programmers do things which you yourself do not understand." I stared in astonishment. "


"No committee will ever do this until it is too late."

The software I like best was not written by "teams"

I prefer small programs written by individuals that generally violate memes like "software is never finished" and "all software has bugs"

(End user perspective, not a developer)


One of my biggest accomplishments was shipping a suite of 5 apps from four divisions where three of them resented each other’s existence and seemed bound and determined to build rules in the system that made sure the other two couldn’t function. Which made no goddamn sense because it was a pipeline and you can’t get anything out one end if it gets jammed in the middle.

I was brought in to finish building the interchange format. The previous guy was not up to snuff. The architect I worked for was (with love) a sarcastic bastard who eventually abdicated about 2 rings of the circus to me. He basically took some of the high level meetings and tapped in when one of us thought I might strangle someone.

Their initial impression was that I was a prize to be fought over like a child in a divorce. But the guy who gives you your data has you by the balls, if he is smart enough to realize it, so it went my way nine times out of ten. It was a lot of work threading that needle (I've never changed the semantics of a library so hard without changing the syntax), but it worked out for everyone. By the time we were done, the gap between the way things worked and the way each of them wanted it to work was on the order of twenty lines of code on their end, which I essentially spoonfed them, so they didn't have a lot of standing to complain. And our three teams always delivered within 15% of estimates, which was about half of anyone else's error bar, so we slowly accreted responsibilities.

I ended up as principal on that project, during a hiring/promotion freeze on that title (someone pulled strings for that, so I felt bad about leaving within a year, but I stayed until I was sure the house wouldn't burn down after I left, and I didn't have to do that). I must have said "compromise means nobody gets their way" about twenty times in or between meetings.


These are the projects that give us confidence.


It's the committee vs the dictator issue - a small driven individual (or group) can achieve a lot, but they can also turn into tyrants.

A committee forms when there's widespread disagreement on goals or priorities - representing stakeholders who can't agree. The cost is slower decisions and compromise solutions. The benefit is avoiding tyranny of a single vision that ignores real needs.


Also, this software is free. Generally the authors were not paid to write it


We are poorer for him having waited to drop that sentence at his Turing Award acceptance speech. I use it all the time.

Tony might be my favorite computer scientist.


It seems that with vibe coding our industry has finally, permanently embraced the latter approach. RIP Tony.


> permanently

don't bet on it


Can’t argue with the quote. However my current boss has been pushing this to the extreme without much respect for real-world complexities (or perhaps I’m too obtuse to think of a simple solution for all our problems), which regrettably gives me a bit of pause when hearing this quote.


Reminds me of another good one: Make everything as simple as possible, but not simpler. (-- probably not Einstein)


There was an article posted here a few weeks ago titled "Nobody Gets Promoted for Simplicity."

I've been thinking about it a lot, and now, in turn, the memory of Mr. Hoare.


aged very well


Reminds me of this Pascal quote: "I would have written a shorter letter, but I did not have the time."

https://www.npr.org/sections/13.7/2014/02/03/270680304/this-...


"Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away."

Antoine de Saint-Exupéry


"The greatest ideas are the simplest."

- William Golding


Reminds me of this quote... “A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work.”



Seconded.

Someone once described Systemantics as the book that system designers read under the covers at night with a torch.


or should do!


Reading it now on Kindle Scribe without the help of AI ;-).

When I am done, I might simply replace SOUL.md with it and move on.


https://openlibrary.org/books/OL4904457M/Systemantics

(Systemantics is available for borrowing, Systems Bible is not yet, a copy has been sent for digitizing)


One of the policies of The Rhinoceros Party in Canada was to increase the complexity of the taxation system so much that nobody could find the loopholes to exploit.


Had to look them up (WP), wasn't disappointed. We have the Monster Raving Loony Party in the UK.

One of the Rhinoceros Party's policies stands out - are you sure Trump wasn't born a Canuck and was stolen at birth by raccoons and smuggled down south?

"Annexing the United States, which would take its place as the third territory in Canada's backyard (after the Yukon and the Northwest Territories—Nunavut did not yet exist), in order to eliminate foreign control of Canada's natural resources"


I like them


Good thing we now have technology that allows us to crank out complex software at rates never-before seen.


Complex software full of very obvious deficiencies that nobody bothered to look for.


It can also be used to simplify existing code bases.


Agreed! Wonderful museum.

They also have some original codices along with translations that are interesting to look at https://photos.app.goo.gl/zeo3Hn2Q8v81cidX9


Interesting read. I wrote the original compiler back in 2002/2003, but a lot changed by the time it was open sourced (including the confusing name -- I just called it a javascript compiler).

One detail this story gets wrong though is the claim that, "The Gmail team found that runtime JavaScript performance was almost irrelevant compared to download times." Runtime performance was actually way more important than download size and we put a lot of effort into making the JS fast (keep in mind that IE6 was the _best_ browser at the time). One of the key functions of the js compiler was inlining and dead-code removal so that we could keep the code readable without introducing any extra overhead.
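To make the inlining and dead-code point concrete, here's a toy before/after (hand-written for illustration; this is not actual compiler output, just the shape of the transformation):

```javascript
// Readable source: a tiny accessor plus a dev-only helper.
function getId(el) { return el.id; }
function debugDump(el) { return JSON.stringify(el); } // never called in prod

function render(el) {
  return '<div id="' + getId(el) + '"></div>';
}

// What an inlining + dead-code pass would emit (hand-compiled for
// illustration): getId is inlined at the call site, debugDump disappears
// entirely, and behavior is unchanged.
function renderCompiled(el) {
  return '<div id="' + el.id + '"></div>';
}
```

The source stays readable and factored; the shipped code pays no function-call or dead-weight cost.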


Thanks for the correction, Paul (and for the great email client and JS Compiler!). I've added a note to the article.

The focus on inlining as a performance win makes a lot of sense. It's hard to get back into the pre-JIT IE6 mindset where every getter and setter came at a cost. By the time I used Closure Compiler years later this had gotten simplified to just "minification good". I remember search (where I worked) in particular was extremely concerned with shaving bytes off our JS bundles.


To be clear, minification was absolutely a key feature/motivation for the compiler. Runtime performance was more important than code size, but as usual the key to improving runtime performance is writing better code -- there's not much a compiler can do to fix slow code. For example, I wanted the inbox to render in less than 100ms, which required not only making the JS fast but also minimizing the number of DOM nodes by a variety of means (such as only having a single event handler for the entire inbox instead of one per active element).
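A sketch of that single-handler (event delegation) pattern -- the names and structure here are invented for illustration, not the actual Gmail code, and DOM nodes are modeled as plain objects to keep it self-contained:

```javascript
// Walk up from the clicked node to the nearest row, i.e. the nearest
// ancestor carrying a thread id. Nodes here are plain objects with
// .threadId and .parent; in real DOM code you'd follow node.parentNode.
function findRow(node, root) {
  while (node && node !== root && node.threadId == null) {
    node = node.parent;
  }
  return node && node !== root ? node : null;
}

// One handler on the inbox container serves every row, so the page has
// N row elements but only a single listener to register and tear down.
function makeInboxHandler(root, onOpen) {
  return function (event) {
    var row = findRow(event.target, root);
    if (row) onOpen(row.threadId);
  };
}
```

The win is both fewer DOM nodes with handlers attached and less setup/teardown work when the list re-renders.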

As others here have pointed out, JS was very much looked down upon by most people at Google, and there was a lot of resistance to our JS-heavy approach. One of their objections was that JS didn't have any tooling such as compilers, and therefore the language was "unscalable" and unmaintainable. Knocking down that objection was another of the motivations for writing the compiler, though honestly it was also just kind of fun.


I used Closure at Google after coming from a Java background. I always described it as "Closure puts the Java into JavaScript". The team I was working on also found several bugs where live code was removed by the dead-code elimination.

Now, Closure (at Google) meant a couple of different things (by 2010+). First, it was the compiler. But second, it was a set of components, many UI related. Fun fact: the Gmail team had written their own set of components (called Fava, IIRC) and those had a different component lifecycle, so they weren't interoperable. All of this was the most Google thing ever.

IMHO Closure was never heavily pushed by Google. In fact, at the time, publicly at least, Google was very much pushing GWT (Google Web Toolkit) instead. For those unfamiliar, that meant writing frontend code in Java that was transpiled to JavaScript. This was based on the very Google notion of both not understanding and essentially resenting JavaScript. It was never viewed as a "real" language. Then again, the C++ people didn't view Java as a real language either, so there was a hierarchy.

GWT obviously never went anywhere, and there were several other JavaScript initiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.

But this idea of running the same code everywhere was really destructive, distracting, and counter-productive. Most notably, Google runs on protobufs. Being a binary format, protobufs don't work natively in JavaScript. The Java protobuf API wasn't compatible with GWT for many years. JS had a couple of encodings it tried to use. One was pblite, which basically used the protobuf tag numbers as array indices. Some Google protobufs had thousands of optional fields, so the wire format became:

    [null,null,null,null,...many times over...,null,"foo"]
Not exactly efficient. Another used protobuf tag numbers as JSON object keys. I think this had other issues but I can't remember what.
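A sketch of that encoding (my reconstruction of the idea, not Google's actual serializer) shows why sparse, high-numbered fields blow up:

```javascript
// pblite-style sketch: field tag N lands at array index N-1, and every
// absent field in between is serialized as an explicit null.
function toPblite(fields, maxTag) {
  // fields: an object keyed by tag number, e.g. { 1001: 'foo' }
  var arr = [];
  for (var tag = 1; tag <= maxTag; tag++) {
    arr.push(tag in fields ? fields[tag] : null);
  }
  return JSON.stringify(arr);
}
```

A message whose only set field has tag 1001 serializes as a thousand nulls followed by one value.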

Likewise, Google never focused on having a good baseline set of components. Around this time some Twitter engineers came out with Bootstrap, which became the new reset.css plus a component library and everything else kind of died.

Even Angular's big idea of two-way data binding came at a huge cost (component transclusion anyone?).

Google just never got the Web. The biggest product is essentially a text box. The way I always described it is "How do you know you're engineering if you're not over-engineering?" Google does some absolutely amazing technical infrastructure (in particular) but the Web? It just never seemed to be taken seriously or it was forced into an uncomfortable box.


The whole time I read this article I kept thinking there was some write it in Java and compile to JS angle that wasn't being mentioned. GWT.


Yes definitely, I also worked there during that time, and agree with the idea that Google didn't get JS. This is DESPITE coming out with 2 of the greatest JS applications ever -- GMail and Google Maps -- which basically started "Web 2.0" in JS

I always found that odd, and I do think it was cultural. At a certain point, low-level systems programming became valued more, and IMO it was emphasized/rewarded too much over products. I also agree that GWT seemed to be more "staffed" and popular than Closure compiler. There were a bunch of internal sites using GWT.

There was much more JS talent at Yahoo, Facebook, etc. -- and even eventually Microsoft! Shocking... since early Google basically leap-frogged Microsoft's hotmail and I think maps with their JS-based products. Google Docs/Sheets/Slides was supposedly a very strategic JS-based product to compete with Microsoft.

I believe a lot of it had to do with the interview process, which was very uniform for all the time I was there. You could never really hire a JS specialist -- they had to be a generalist and answer systems programming questions. I think there's a sound logic in that (JS programmers need to understand systems performance), but I also think there's room for diversity on a team. People can learn different skills from each other; not everyone has to jump through the same hoops

---

This also reminds me that I thought Alex Russell wrote something about Google engineering not getting the WEB! Not just JavaScript. It's not this, but maybe something similar:

https://changelog.com/jsparty/263

I don't remember if it was an internal or external doc. I think it focused on Chrome side things.

But I remember thinking that too -- the C++ guys don't get the web. When I was in indexing, I remember the tech lead (Sitaram) encouraged the engineers to actually go buy a web hosting account, and set up a web site !! Presumably because that would get them more in touch with web tech, and how web sites are structured.

So yeah it seems really weird and ironic that the company that owns the biggest web apps and the most popular web browser has a lot of employees who don't value that tech

---

Similarly, I have a rant about Google engineering not getting Python. The early engineers set up some pretty great Python infrastructure, and then it kind of rotted. There were arguably sound reasons for that, but I think it basically came back to bite the company with machine learning.

I've heard a bunch of complaints that the Tensorflow APIs are basically what a non-Python programmer would invent, and so PyTorch is more popular ... that's sort of second-hand, but I believe it, from what I know about the engineering culture.

A lot of it has to do with Blaze/Bazel, which is a great C++ build system, while users of every other language all find it deoptimizes their workflow (Java, Go, Python, JS, ...)

So basically I think in the early days there were people who understood JS (like paul) and understood Python, and wrote great code with them, but the eng culture shifted away from those languages.


It was definitely cultural. The engineering hierarchy was:

1. C++ engineers thought the only real language was C++

2. Java engineers thought the only real languages were C++ or Java

3. Python engineers either thought the only real languages were Python, C++ or Java or some of them thought only Python

At that time (I don't know about now), Google had a designation of frontend software engineer ("FE SWE"), and you'd see interview feedback where a particular interviewer would be neutral on a candidate and soft-reject them by explicitly stating they were maybe good enough to be an FE SWE, even though the official stance was that FE SWEs had to pass the regular SWE standard plus some extra.

Basically, anything JS/CSS/HTML related was very much looked down upon by many.

Blaze went through some growing pains. At one time it was a full Python interpreter, then an interpreter of a subset of Python, and ultimately not really Python at all. There was a time when you had to do things with genrules, which was an awful user experience and broke caching, two reasons why they ultimately got rid of it.

But Blaze made sense because you had to be able to compile things outside of Java, Python and (later) Go where Java and Go in particular had better build systems for purely Java and Go (respectively) code bases. It got better once there were tools for auto-generating Blaze config (ie the java_library build units).

Where Blaze was horrible was actually with protobufs. auto-generated code wasn't stored in the repo (unlike Facebook). There were protobuf versions (although, even by 2010, most things were protobuf version 2) but there were also API versions. And they weren't compatible.

So Java had API version 1 (mutable) and 2 (immutable) and if you needed to use some team's protobuf but they'd never updated to API v2, you'd either have to make everything v1 or do some horrible hacks or create your own build units for v2.

But I digress. Python essentially got supplanted by Go outside of DS/ML. The code review tool was originally written in Python (ie Mondrian) before being rewritten in (IIRC) GWT (ie Critique). For a very long time Critique lacked features Mondrian had.

Personally, I was always sympathetic to avoiding large Python code bases just for the lack of strict typing. You ended up having to write unit tests for spelling mistakes. I can't speak to Tensorflow vs PyTorch.

I suspect this institutional disdain for Python was probably a factor in GvR leaving to go join Dropbox.


You have good points overall, but I'd say AngularJS did have mass adoption at a time - it was a great way to build web applications compared to the alternatives. React just came by and did even better.

And Dart may not have much of a standing on its own, but Flutter is one of the most popular frameworks for creating cross-platform applications right now.


> Some Google protobufs had thousands of optional fields so the wire format became: [null,null,null,null,...many times over...,null,"foo"]

In pblite, they are serialized as `[,,,,...many times over...,,"foo"]`. Just comma, no "null".


    > JSON.parse('[null,null,null]')
    > (3) [null, null, null]
    > JSON.parse('[,,,]')
    Uncaught SyntaxError...
Maybe eventually someone realized this was horribly inefficient and added an extra step but a series of commas in an array with nothing in between them isn't valid JSON.


Dart is alive and well in Flutter: https://flutter.dev/


It is, but it is no longer marketed as a general replacement for JavaScript.


True, although GP was comparing Dart to UI frameworks (GWT, Angular and React):

> GWT obviously never went anywhere and there were several other Javascript intiatives that never reached mass adoption (eg Angular and, later, Dart). Basically, React came out and everything else just died.


It’s both a novel virus and a novel vaccine. Without clinical trials we really don’t know if the boosters are helping or hurting.

Also, we should be doing clinical trials on flu shots as well. New drugs, even if they are only slightly different still require clinical trials. Why is hacking the immune system not subject to the same scrutiny?


Show your math. Unless those nukes are very small, your claim seems way off.


According to this source [1] flights departing GVA generated 1.3 million tons of CO2 in 2018. According to this source [2], jet fuel generates 3.16 kg CO2 per kg of fuel. Multiplied by approximate fuel density and we get 2.57 kg CO2 per liter. We can therefore estimate that GVA consumes approximately 500 million (1.3 billion kg CO2/ 2.57 kg CO2 per liter) liters of fuel per year. Jet fuel contains about 35 MJ of energy per liter according to [3] so that's about 17.5 PJ. If the process to convert electricity to jet fuel is 50% efficient, that's 35 PJ. That is equivalent to a 1.1GW reactor running at 100% capacity. At the global average capacity factor of 80%, that's about 1.4GW required. Half a dozen is probably a pretty large overestimate. Alternatively, this would require around 6GW of solar, although 6GW of solar is probably quite a bit cheaper than 1.1 GW of nuclear power.
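The whole chain in one place (all inputs are the estimates above, not authoritative figures):

```javascript
// Inputs, as estimated above.
const co2PerYearKg   = 1.3e9;   // CO2 from GVA departures, 2018 [1]
const co2PerLiterKg  = 2.57;    // 3.16 kg CO2/kg fuel [2] * ~0.81 kg/L density
const mjPerLiter     = 35;      // jet fuel energy density [3]
const efficiency     = 0.5;     // assumed electricity-to-fuel conversion

const litersPerYear  = co2PerYearKg / co2PerLiterKg;      // ~5.1e8 L
const fuelEnergyPJ   = litersPerYear * mjPerLiter / 1e9;  // ~17.7 PJ
const electricityPJ  = fuelEnergyPJ / efficiency;         // ~35 PJ
const secondsPerYear = 365.25 * 24 * 3600;
const avgPowerGW     = electricityPJ * 1e15 / secondsPerYear / 1e9; // ~1.1 GW
```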



agreed, 1.1 GW seems reasonable for GVA. thanks!


My back-of-the-envelope calculation is that 1GW for 1 year is about 30PJ. Jet fuel has 42MJ/kg with a density of 0.8 kg/L, for a total of around 900 million liters of fuel at 100% efficiency. An A321 holds around 30000L, which comes out to about 30000 flights' worth of fuel from one reactor. GVA had around 200000 flights in 2018, meaning about 6 1GW-reactor equivalents of fuel used (obviously not all flights would be fully loaded, but I don't know what a normal load is).


Another quick calculation, in addition to the one from tfussell:

Google tells me [0] that total fuel consumption by commercial airlines in 2019 was 95 billion gallons. One gallon of fuel has around 33 kWh of energy.

Power capacity needed to produce that amount of energy in a year is around 358 GW.

Total world electricity production in 2020 was around 3000 GW-year [1].

[0] https://www.google.com/search?q=total+yearly+aircraft+fuel+u...

[1] https://www.google.com/search?q=total+world+electricity+prod...
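The same sanity check in code (round numbers from above):

```javascript
const gallonsPerYear = 95e9;  // 2019 commercial airline fuel burn [0]
const kwhPerGallon   = 33;    // approximate energy content of a gallon of jet fuel
const hoursPerYear   = 8760;

const totalTWh = gallonsPerYear * kwhPerGallon / 1e9;  // 3135 TWh per year
const avgGW    = totalTWh * 1e3 / hoursPerYear;        // ~358 GW continuous
```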


Comparing global fuel consumption with global electricity production is a good approach. It’s clearly substantial, but doable, especially since fuel production can utilize “unreliable” renewables (make gas when the sun shines).


I agree it was overestimated; note to self: always redo people's maths.

1) number of barrels of jet fuel per day in switzerland:

34000 barrel / day = 0.39 barrel / s

https://www.indexmundi.com/energy/?product=jet-fuel&graph=co...

2) energy in a barrel of jet fuel:

1700 kWh / barrel = 6120000000 J / barrel

3) total power for Switzerland:

2.4 GW

so more like 2-4 reactors for all of Switzerland.


She’s also one of the best founders I’ve ever worked with. Her talks at yc are my favorite — incredibly honest, personal, and insightful.


I’m more disturbed by the fact that they can also edit or remove books that I’ve already purchased. How long until Amazon is forced to “deplatform” something offensive? Those old books contain a lot of words and ideas that have no place in 2019.

It’s one of several reasons why I mostly only buy paper books.


The denial is strong. Reminds me of how cell phone makers responded to the iPhone:

“The development of mobile phones will be similar in PCs. Even with the Mac, Apple has attracted much attention at first, but they have still remained a niche manufacturer. That will be in mobile phones as well,” Nokia chief strategist Anssi Vanjoki told a German newspaper at the time.

Back in the day, smartphones were pretty much defined by devices like the Palm Treo, and Palm CEO Ed Colligan doubted some computer maker was going to just waltz in and eat his lunch.

“We've learned and struggled for a few years here figuring out how to make a decent phone,” Colligan said. “PC guys are not going to just figure this out. They're not going to just walk in.”


CMGI and the other “incubators” of that era were nothing like YC. The label “accelerator” was applied to YC years after it was started because of the need for a generic term to describe YC and all of the clones. Whether or not that label was previously used for some other kind of business is irrelevant.


Makes sense. I suspected that might be the case, which is why I mentioned the possibility of something more specific intended. (In case other readers are struck by the same thought, it might be worth a footnote).



Thankfully, mine are more of a mild annoyance than cripplingly painful.

I'll check out Curable though, thank you!

