I mean the fish rots from the head, but at the end of the day that rot translates into an engineering culture that doesn't value craftsmanship and quality. Every GitHub product I've used reeks of sloppiness and poor architecture.
That's not to say they don't have people who can build good things. They built the standard for code distribution, after all. But you can't help but recognize that so much of it is duct-taped together to ship, instead of crafted and architected with intent behind the major decisions that allow the small shit to just work. If you've ever worked on a similar project that evolved that way, you know the feeling.
My nephew was deep into Fortnite for years; now at 15 he and his friends have moved on to GTA V. Imagine what a treasure trove of gaming you can discover as a teenager today, looking back at a pool of 15-20 years of great games.
I started playing in my late 40s, but it got stale pretty quickly. Epic keeps changing things to try to keep it fresh, but they change the wrong things: usually making the game harder and more frustrating for casual players in order to cater to pros and streamers. When I started playing, I could win a few matches if I got lucky. Three chapters / 15+ seasons later, I get spanked within 5 minutes of joining a match by people who live and breathe the game. I stopped playing because it's just not very fun for someone who only logs on once a week to play for half an hour or so.
It bothers me that everyone is laser-focused on poor ATC staffing and working conditions (which is very valid, don't get me wrong). I think airport capacity should be capped based on available ATC staffing. We need less air travel.
The way I think about it is this: substandard ATC staffing is just as bad as missing jetways or damaged runways. When an airport can't land planes because of physical capacity constraints, flights get cancelled or delayed (it's literally happening today at LGA: flights are getting cancelled because they're down one runway). The carriers need to eat the costs of forcing too much demand on ATCs.
You are correct. Robustness requires a system that is working within its tolerance margin, and stressing that inevitably leads to failure. A fault-tolerant system in this case would require a large amount of redundant humans. Unfortunately, the capitalist mindset prevents accepting any amount of "waste" as tolerable, which makes a robust system impossible to sustain over time. Every system touched by a capitalist optimizer will eventually fail.
The idea that waste must be reduced is killing society, and this mindset must be addressed first before any other safety-critical system can be made reliable again.
Start from the perspective of the user seeing effectively:
> error: expected the character ';' at this exact location
The user wonders, "if the parser is smart enough to tell me this, why do I need to add it at all?"
The answer to that question, "it's annoying to write the code to handle this correctly," is thoroughly lazy and boring. "My parser generator requires the grammar to be LR(1)" is even lazier. Human language doesn't fit into restrictive definitions of syntax; why should language for machines?
> Because code is still read more than it is written it just doesn't seem correct to introduce ambiguity like this.
That's why meaningful whitespace is better than semicolons. It forces you to write the ambiguous cases as readable code.
I used to hate semicolons. Then I started working in parser recovery for rustc. I now love semicolons.
Removing redundancy from syntax should be a non-goal, an anti-goal even. The more redundancy there is, the higher the likelihood of making a mistake while writing, but the higher the ability for humans and machines to understand the developer's intent unambiguously.
Having "flagposts" in the code lets people skim code ("I'm only looking at every pub fn") and gives the parser a fighting chance of recovering ("found a parse error inside of a function def; consume everything until the first unmatched }, which corresponds to the end of the fn body, mark the whole body as having failed parsing, and let the rest of the compiler run"). Semicolons allow for that kind of recovery. And the same logic that you would use for automatic semicolon insertion can be used to tell the user where they forgot a semicolon. That way you get the ergonomics of writing code in a slightly less principled way while still being able to read principled code after you're done.
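To make that recovery strategy concrete, here's a toy sketch (hypothetical, not rustc's actual recovery code): after a parse error inside a fn body, skip tokens while tracking brace depth until the first unmatched `}`, then resume parsing there.

```rust
/// Toy recovery sketch: given a token stream and the position where a
/// parse error occurred inside a `fn` body, return the index just past
/// the first unmatched `}` (the brace closing the body), so parsing can
/// resume with the next item. Not rustc's real implementation.
fn skip_to_unmatched_close(tokens: &[&str], error_pos: usize) -> usize {
    let mut depth = 0usize;
    for i in error_pos..tokens.len() {
        match tokens[i] {
            "{" => depth += 1,
            "}" if depth == 0 => return i + 1, // the fn body's closing brace
            "}" => depth -= 1,
            _ => {} // garbage inside the broken body: skip it
        }
    }
    tokens.len() // unterminated body: consume to end of file
}

fn main() {
    // fn f ( ) { let x = @ { } }  fn g ( ) { }
    //                    ^ parse error at index 8
    let toks = ["fn", "f", "(", ")", "{", "let", "x", "=", "@",
                "{", "}", "}", "fn", "g", "(", ")", "{", "}"];
    let resume = skip_to_unmatched_close(&toks, 8);
    assert_eq!(toks[resume], "fn"); // recovery lands on the next item, fn g
    println!("resumed at token {}: {}", resume, toks[resume]);
}
```

The `depth == 0` guard is what makes the closing brace a useful flagpost: the guaranteed pairing gives the parser a safe point to restart from, no matter how mangled the tokens in between are.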
Why is ";" different from \n from the perspective of the parser when handling recovery within scopes? Similarly, what's different with "consume everything until the first unmatched }" except substituting a DEDENT token generated by the lexer?
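For what it's worth, the substitution really is mechanical. A toy offside-rule lexer (assumed names, not any real compiler's code) turns leading indentation into explicit Indent/Dedent tokens, and "consume until the first unmatched }" becomes "consume until the first unmatched Dedent":

```rust
// Toy sketch of an offside-rule lexer: leading indentation becomes
// explicit Indent/Dedent tokens, so a brace-based recovery strategy can
// treat Dedent exactly like an unmatched `}`. Names are illustrative.
#[derive(Debug, PartialEq)]
enum Tok<'a> {
    Indent,
    Dedent,
    Line(&'a str),
}

fn lex(src: &str) -> Vec<Tok<'_>> {
    let mut levels = vec![0]; // stack of open indentation levels
    let mut out = Vec::new();
    for line in src.lines().filter(|l| !l.trim().is_empty()) {
        let indent = line.len() - line.trim_start().len();
        while indent < *levels.last().unwrap() {
            levels.pop();
            out.push(Tok::Dedent); // analogue of `}`
        }
        if indent > *levels.last().unwrap() {
            levels.push(indent);
            out.push(Tok::Indent); // analogue of `{`
        }
        out.push(Tok::Line(line.trim()));
    }
    while levels.pop().unwrap() > 0 {
        out.push(Tok::Dedent); // close any blocks still open at EOF
    }
    out
}

fn main() {
    let toks = lex("if cond:\n    body\nnext_stmt");
    // "next_stmt" dedents back out: the lexer emits a Dedent, the
    // indentation analogue of the `}` that closes a braced block.
    assert_eq!(toks, vec![
        Tok::Line("if cond:"),
        Tok::Indent,
        Tok::Line("body"),
        Tok::Dedent,
        Tok::Line("next_stmt"),
    ]);
}
```

One practical difference worth noting: a `}` survives arbitrary reformatting and copy-paste, while indentation is exactly the thing most likely to be mangled in transit, so the Dedent the parser recovers on may not be the one the author intended.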