On the subject of weights and measures checks that a pint is a pint: I remember the father of a friend of mine at university who was responsible for weights and measures for Staffordshire. I think he was the undersheriff or something like that, and the official pint measure was part of the collection.
This would have been in the late 80s. I've no idea if it was still in use, but I've a feeling that the law hadn't necessarily moved on, so I guess the official measure could have been required if challenged in court.
The older Tektronix TDS540 series did this, though at much lower rates, as was common in those days. Internally there are differential feeds from the very beautiful hybrid ceramic input boards to 4 ADCs, with some clever switching so that a single input can be sampled by all 4 ADCs with a suitable time offset, creating 4x the sample rate when running all 4 on one input.
The calibration procedure on the scope fiddles with the time alignment to get the different ADCs correctly offset so that the combined signal is correct.
The hybrid ceramic input boards in their metal cases are a thing of beauty, fragile (don't ask how I know), but beautiful.
Yup, a lot of scopes actually did this internally and some still do. It's part of why some scopes lose half their bandwidth when you go from 2 channels to 4: they split the digitizers across the channels (some go the other direction and run multiple channels on one very fast ADC). It's just very, very difficult to keep this working external to the box, mainly because of line drift.
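The interleaving idea the comments describe can be sketched in a few lines. This is a toy simulation of my own (not from any scope's firmware; the `interleave` helper and the rates are made up for illustration): N slower digitizers sample the same signal with their sample clocks staggered by 1/N of a period, and merging the streams round-robin reconstructs one stream at N times the rate. The calibration problem mentioned above is exactly keeping those per-ADC time offsets right.

```python
import math

def interleave(channels):
    """Merge N equally staggered ADC streams into one stream at N x the rate.
    channels[k][i] is sample i from ADC k, taken at time (i*N + k) * dt."""
    n = len(channels)
    length = min(len(c) for c in channels)
    out = []
    for i in range(length):
        for k in range(n):          # round-robin: ADC 0, 1, ..., N-1, then repeat
            out.append(channels[k][i])
    return out

# Simulate 4 ADCs sampling the same tone, each offset by a quarter of the
# combined sample period (rates are illustrative, not the TDS540's):
fs = 1000.0                  # per-ADC sample rate, Hz
f_in = 100.0                 # input tone, Hz
n_adc = 4
dt = 1.0 / (n_adc * fs)      # effective combined sample period
channels = [
    [math.sin(2 * math.pi * f_in * (i * n_adc + k) * dt) for i in range(8)]
    for k in range(n_adc)
]
combined = interleave(channels)

# The merged stream matches what a single 4x-rate ADC would have captured:
expected = [math.sin(2 * math.pi * f_in * j * dt) for j in range(32)]
assert all(abs(a - b) < 1e-12 for a, b in zip(combined, expected))
```

If one ADC's offset is miscalibrated, its samples land at the wrong times in the merged stream, which shows up as spurious tones; hence the time-alignment step in the calibration procedure.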
Just out of interest, why aren't they cross compiling RISC-V? I thought that was common practice when targeting lower performing hardware. It seems odd to me that the build cycle on the target hardware is a metric that matters.
Interesting that it's mandated as native. I'm really not sure of the logic behind this (I've worked in the embedded world, where such stuff is not only normal but the only choice). I'll do some digging and see if I can find the thought process behind it.
I do. At my first job it looked so technical! Sure, they were a bit rubbish, but at least you didn't have to pick all the hair and gunk off the rollers and balls all the time.
The way I read it, the number prefixed to the > indicates which file descriptor to redirect, and the default when no descriptor is given is stdout (fd 1).
So, >foo is the same as 1>foo
If you want to get really into the weeds, I think 2>>&1 will create a file called 1, since appending to a file descriptor (or maybe truncating one is closer to what I mean) makes no sense. Why this is the case is probably an oversight from 50 years ago in sh, although I'd be surprised if it were codified anywhere, or relied upon in scripts.
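A quick sketch of the basic cases (filenames here are just examples):

```shell
# ">foo" and "1>foo" are the same thing: redirect fd 1 (stdout) to a file.
echo "hello" > out1.txt
echo "hello" 1> out2.txt

# fd 2 is stderr; "2>&1" duplicates fd 2 onto whatever fd 1 currently
# points at, so both streams end up in the same file:
{ echo "to-stdout"; echo "to-stderr" >&2; } > both.txt 2>&1
```

Note the order matters: `> both.txt 2>&1` sends both streams to the file, while `2>&1 > both.txt` points stderr at the old stdout (usually the terminal) before stdout is redirected.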
Most of the effort when writing a compiler is handling incorrect code, and reporting sensible error messages. Compiling known good code is a great start though.
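A toy illustration of that point (my own sketch, nothing to do with any particular compiler): even for a trivial grammar, most of the code below exists to handle malformed input and point at where it went wrong, not to handle the happy path.

```python
def parse(src):
    """Evaluate a tiny grammar: expr := NUMBER ('+' NUMBER)* .
    Returns the sum, or raises SyntaxError naming the offending token."""
    tokens = src.split()
    if not tokens:
        raise SyntaxError("empty input")

    def expect_number(i):
        # Error handling: report position and the actual token we saw.
        if i >= len(tokens) or not tokens[i].lstrip('-').isdigit():
            got = repr(tokens[i]) if i < len(tokens) else "end of input"
            raise SyntaxError(f"expected a number at token {i + 1}, got {got}")
        return int(tokens[i])

    total = expect_number(0)
    i = 1
    while i < len(tokens):
        if tokens[i] != '+':
            raise SyntaxError(f"expected '+' at token {i + 1}, got {tokens[i]!r}")
        total += expect_number(i + 1)
        i += 2
    return total

print(parse("1 + 2 + 3"))   # prints 6
try:
    parse("1 + * 2")
except SyntaxError as e:
    print(e)                # prints: expected a number at token 3, got '*'
```

The happy path is three or four lines; everything else is diagnostics, which scales up dramatically in a real compiler.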
Depends on a lot of factors. LEO has high drag, but good radiation shielding, so if you've got a low enough orbit you can use most embedded hardware but need to compensate with bigger thrusters and bigger fuel tanks if you want it to survive "any length of time" without burning up from atmospheric drag.
This has reminded me that in System 7, the code that drew the window was a system resource (resource forks contained all sorts of things: code, icons, text dictionaries, etc.). Anyhow, if you dropped an updated window resource with the correct resource ID into your System file, you could change this default behaviour. A friend of mine wrote a round window for a clock app, copied it into the System with ResEdit, and a reboot later, all windows were round.
It was a very flexible and hackable system: very fragile, with no security whatsoever, but lots of fun!
The problem with articles like this is that they read a little like justifying a decision that has already been made. I've a feeling that if it was written in C++/Rust/Go/whatever, it would also be possible to justify that decision with similar reasoning.
I think you're not wrong, but it's not necessarily a problem? I might argue that's actually part of the main point the article is making. The software is already written in C, well understood and optimized in that language, and quite stable, so they'd need very compelling reasons for a rewrite.
I think the "lingua franca" argument for C and the points at the end about what they'd need from Rust to switch do go beyond merely justifying a decision that's already been made, though.