
I made this little DSL that piggybacks on TypeScript syntax but is made for declaring interfaces between a TypeScript application and a backend. The code generator spits out Angular classes and TypeScript types for use by the web app, and it spits out a Node backend class that you fill with the handlers for that interface. It also spits out a Qt widget with the web application inside it, complete with a functional plugin for Qt Designer so you can drag your widget into existing UI layouts.

So you can use it to write UIs for the web and use them either as regular Qt widgets or as stand-alone webapps with a regular Node backend.
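A minimal sketch of what such a TypeScript-carried interface declaration and its generated backend skeleton might look like (all names here, CounterApi and makeStubBackend, are invented for illustration and are not from the actual project):

```typescript
// Hypothetical illustration: plain TypeScript syntax used purely as a
// declaration format for a frontend/backend interface.
interface CounterApi {
  increment(by: number): Promise<number>;
  current(): Promise<number>;
}

// Roughly what a generated Node-side skeleton could look like; the generator
// would emit the shape, and you fill in the handler bodies.
function makeStubBackend(): CounterApi {
  let value = 0;
  return {
    async increment(by: number) { value += by; return value; },
    async current() { return value; },
  };
}
```

On the web side, the same declaration would presumably drive the generated Angular service, so both ends stay in sync by construction.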

It's really the wrong way around if you think about it.. using an inferior technology (web) for the UI part.. But somehow people prefer typing CSS and downloading gigabytes of boilerplate instead of just using a WYSIWYG designer.. I don't get that part..


I've long suspected that (at least my own) tinnitus was a neurological phenomenon, seeing how it's always been with me at various "levels of presence", from imperceptible to so loud I can't hear anything else; I've always felt it as an "inner sound".. I've had multiple hearing tests, and nothing in particular showed up. It's also weird because it changes somewhat in frequency, both down to frequencies my 40-year-old ears can register and up beyond what I can actually hear when doing a test..

But especially the coming and going, and how it seems affected by my level of tiredness or the amount of sleep I got.. Of course, reading the article made me aware of it and now it's louder than before..

I've had strong symptoms of adhd my whole life, but never thought much of it (except as a lack of self discipline and general failure of a broken robot to impersonate a real human), but as demands on my performance rose to real-adult levels with a young child and duties beyond not dying, I decided to tell the doc how it had generally felt like to be myself, at which point I was referred to someone with a specialty in broken brains, and we quickly agreed that while I wasn't going to become normal, certain stimulants at least provided me with sufficient energy to carry out most of the functions expected by an adult member of society with actual responsibilities.

And so, over the past.. more than a year, I've gotten to experience a little bit of everything as my brain gets to oscillate between being slightly oversaturated to absolutely drained of certain neurotransmitters in a way that at the same time feels slightly unsustainable and the only alternative where I get to not be absolutely miserable all the time.

The point of that story being: these "phantom percepts" fit the bill somewhat well. I've always had a very conscious experience of common neurological phenomena which are naturally present but largely unnoticed by many (auras, visual snow, floaters, phosphenes, tinnitus, afterimages), so I'm probably a bit on the sensitive side, and the medication seems to have a quite interesting effect on these as well. Among them, I noticed the ABSENCE of noticing my clothes touching my skin.. I am no longer acutely aware of the cooling sensation of inhaling air through my nose, and I rarely hear the beat of my heart in my ears.. Maybe the weirdest effect is on saccades: in a conversation, looking from one person to the next seems to be as instant as before, but the blur of my eyes moving between points of focus is gone. It's kind of jarring, just poof, one picture, then another.. nothing in between.

I now seem to be able to influence my attention somewhat, that is, to do whatever that cognitive regulation is called, so that my focus shifts to a task I need to do but have no interest in doing (oh wait, that's why I got the medication). But it does make me wonder: if tinnitus is just one of the more obvious (and therefore common) neurological processes that "poke through", maybe perception of sound and attention (and maybe therefore also conscious experience of sound) have evolved to be more strongly linked (because if you notice the predator sneaking up on you, you get to not be eaten).

Maybe this stronger link is why tinnitus is so obvious, and maybe sleep is instrumental in regulating consciousness, so if consciousness is differently regulated, or less regulated, maybe it's easier for the phenomenon to "seep through".


out of all the people in this thread, you seem the most likely candidate to appreciate the following - tinnitus symptoms are often conflated with hearing sensitivity. if you can see auras, then id say your tinnitus symptoms arent indicative of tinnitus. search up brain wave frequencies and look at images. id wager you are hearing yourself, especially during changes of frequency. as well, just prior to sleep, you might hear a spike - thats the brain commencing the sleep mode algorithm (no wonder tinnitus wrecks sleep, affected individuals would struggle at synchronizing both hemispheres with the sleep algorithm when an involved sensor is malfunctioning)

separately ... its clear that you recognise the incompatibility between sensitive individuals and a society designed to place the populace into constant fight or flight. youre still showing signs of blaming yourself. literally nobody is going to understand you (especially not doctors) and the sooner you accept this, the sooner you will free up a lot of trapped energy. id stop taking the stimulants regularly man ... even without their effect, barely anybody is going to understand your words, and the number of people who will appreciate your words is reduced when they are conveyed via essays (honestly i cant find the strength to read them properly) ... in general your expression has reminded me of the message behind the lateralus chorus


I have Ménière’s disorder and had a few short episodes of vertigo before one finally got me discombobulated. I woke up one morning and couldn’t get my extremities to function. Couldn’t tell up from down. It took about 8 hours for it to completely go away but then I realized I had lost most of my hearing in my right ear and half in my left, and had constant tinnitus and dizziness. I went to an ENT and learned that there is nothing I could do as there is no cure for Ménière’s. I have gotten better at dealing with the tinnitus and don’t notice it unless the train whistle or the lion roars start. I keep hoping one day that I will read that someone has found a cure! Ah well, hope springs eternal!!


I suffered a highly unpleasant vertigo attack yesterday - happens every once in a while. Tinnitus was the warning, and I was definitely over-tired beforehand.

After an ear infection 30 years ago I lost most hearing in my left ear and my balance was affected. Not a massive problem most of the time but I regret not being able to read when travelling, even by plane or train. It’s audiobooks all the way…


im sorry to hear that, some of my family members have inner ear disorders and the nausea/vertigo sounds terrible. the original comment isnt targeted at those like yourself who unfortunately must deal with damaged peripherals ... but still there is a chance it could apply. may i ask whether you ever noticed the ringing in your ears to be correlated with a change of mental state? examples of this change would be arriving at a big realisation, or commencing relaxation, or performing meditation.


I'm mainly thinking that the "sound" of tinnitus may be inherent in the brain, and the problem is the percept itself: not a percept of the tinnitus, but the percept being generated while nothing was perceived, and so we become aware of this weird, almost impossibly fine high pitch.

It kind of fits with the patterns I and many other people describe, like the intensity varying with sleepiness and other mental states, and how it goes away if we hear _actual_ sounds of a broad enough spectrum..

It might be this little thing where it comes on by mistake, but it doesn't turn off again, and we latch onto it, and that's the feedback loop that reinforces it.. I'm not saying we can "think it away", but I'm noticing in myself that I didn't have any tinnitus _AT_ALL_ when I woke up, and now I'm almost consumed by this 20kHz tone (my hearing stops around 16kHz), and sitting here playing with that in my mind, I can certainly make it dim somewhat.

I wonder if there are some cognitive exercises that can be done, especially for people who either don't have it or have gotten it very recently. (The literature talks about some meditation and mindfulness, which I'm generally not a big believer in, but nevertheless, those do touch on the idea of messing around inside one's head in a top-down way.)

I'm not too hooked on the idea that adhd is simply a "different kind of brain", I don't buy that we were the excellent survivors or hunters, I'm pretty sure I'd be the caveman who was eaten by a bear because I was too distracted by the pattern of shadows from two branches moving just the right way xD

I don't really blame myself, but I don't need to defend my condition (my personal condition, I'm not speaking on behalf of others), I've always been bothered by it, not simply when the mirror of expectation and society is held up against me, but even when left to do as I please, I find that while there are areas in which I function, and function well, there are areas where I'm so limited that it seems unreasonable even within my own framework. :)


constant 20khz does sound more like a damaged peripheral, therefore my supposition was incorrect. i wish i could help more but i dont have enough experience. that being said, i do find super interesting the idea of playing a tone at the same frequency to manually force brains into filtering it out. thanks for sharing your perspective, all the best


Will.. will it be televised?


So, is it vietnam or vienam ? because the headline says vienam.


> Abstractions don’t remove complexity. They move it to the day you’re on call.

Gold.


Some (the worst) clocks do that.. It's convenient that the hour hand moves continuously, because it means that unless you need to be able to say "it's five seconds past two minutes past four _in the morning_", you simply look at the hour hand: if it's in the middle of two hours, well, it's half past the smaller; if it's one fourth past the smaller, it's, yes, quarter past; if it's one fourth from the larger, then it's quarter to. And honestly, if you need to read the time more precisely than that and choose to use an analogue clock for it, you've chosen the wrong type of clock; a digital clock with seconds and a 24-hour display is a superior tool for telling the time anyway.
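The hour-hand-only reading rule above can be sketched as a quick calculation (a toy example, not from any clock library; 0 degrees means the hand points straight at 12):

```typescript
// Recover an approximate time purely from the hour hand's angle.
// The hour hand sweeps 30 degrees per hour when it moves continuously.
function timeFromHourHand(angleDeg: number): [number, number] {
  const hours = (((angleDeg % 360) + 360) % 360) / 30;
  const hour = Math.floor(hours) === 0 ? 12 : Math.floor(hours); // a dial has no "0 o'clock"
  const minute = Math.floor((hours - Math.floor(hours)) * 60);   // floor avoids rounding to :60
  return [hour, minute];
}

// Halfway between 4 and 5 (135 degrees) reads as "half past four":
// timeFromHourHand(135) gives [4, 30]
```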


When I was a kid, before kindergarten, I remember my parents beginning to teach me how to read an analogue clock.. But this was the late 80s, maybe it was 1990, and this thing called Digital Clocks was a thing at the time.. And I absolutely refused to learn that old-fashioned shit when I was already staring right at the objectively better solution.. My reasoning was that the old clocks would either be replaced by digital clocks within a short time, or those that weren't replaced would be when they broke (5-year-old me didn't grasp the idea that people would continue buying the obviously inferior products until this very day). Honestly, I'm still a bit perplexed by the fact that one can buy an analogue clock today.. It's objectively inferior in every way.. Most of them don't even do 24 hours, which is the amount of hours we have in a day, leading some idiots to refer to 18:00 as "six-o-clock", and other idiots (like myself) to have to ask EVERY_TIME someone tells me a time that's less than, or equal to, 12.. fuck that shit.

Yeah, I learned how to read inferior clocks, but.. I don't see the point.

So no, it's not that those students can't read a clock, they just can't read an analogue one, because they probably need to about as often as they need to read an octal clock, or a binary LED clock, or a 24-hour dial clock, or Chinese..


Years ago I once wasted 2 hours by arriving at 07:00 instead of 19:00. The AM/PM stuff is ridiculous, especially when people don't specify it. It was the time we were going to leave for a trip, so 07:00 and 19:00 were both plausible times. So 24-hour time is obviously better. Most people don't even bother saying "AM" or "PM", if it even exists in their language or culture.

I think analog clocks are mostly for old people who don't like change, for people nostalgic for the past, or for people who think it makes them better, smarter, fancier or classier somehow - especially with expensive mechanical analog watches.


Excuse me while I prima facie dismiss a kindergartner’s opinion on what is an “objectively” superior/inferior system.

It’s your opinion and prerogative, don’t try to masquerade it as settled truth.


Wow, you must have been violated by an analog clock somewhere sometime. It left a visible trauma. Go get help.


I've thought a lot about law-as-code, but my conclusion is always that bad actors will be given an advantage by being able to brute-force the code until they find a way to get away with whatever obviously-immoral-harmful stuff they want (imagine giga-corps spending a few millions on hardware to brute-force tax law - ROI probably even better than tunneling through mountains to grab stonks first..).

In the end it reminds me of a quote by Edmund Burke: "Bad men obey the law only out of fear of punishment; good men obey it out of conscience - and thus good men are often restrained by it, while bad men find ways around it."


Right, but if laws were developed in regulatory sandboxes, you'd also have the opportunity to red-team them.

Might be a design idea for future lawmakers.


I'm wondering if it might be impossible to write a law that both prevents the spirit of what we want it to prevent, while also not preventing the spirit of what we don't want to prevent. :)


It probably is impossible, but you could cover a lot of cases with more deliberate design. For the rest, you can leave it up to the judges to decide.

Then again, that might be exactly how (some) lawmakers think, but I'm not aware of it.


I recently used it to boot a ~1996 Compaq Presario from CD-ROM to image the hard drive to a USB stick before wiping it for my retro-computer fun :)

It's kind of sad to hear "adult" people claim in all seriousness that it's reasonable for a kernel alone to use more memory than the minimum requirement for running Windows 95: an entire operating system with kernel, drivers, a graphical user interface, and even a few graphical user-space applications.


I got this insight from a previous thread: you can run Linux with a GUI on the same specs as Win95 fine if your display resolution is 640x480. The framebuffer size is the issue.


That, and the fact that everything is 64-bit now. The Linux kernel is certainly much bigger, though, and probably has many more drivers loaded.

It's not just one factor, but the size of a single bitmap of the screen is certainly an issue.
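Back-of-the-envelope numbers for the framebuffer point (assuming a typical Win95-era 640x480 8-bit mode versus a 4K 32-bit desktop; real systems also keep extra buffers for double-buffering and compositing):

```typescript
// Memory needed for one uncompressed bitmap of the whole screen.
function framebufferBytes(width: number, height: number, bitsPerPixel: number): number {
  return (width * height * bitsPerPixel) / 8;
}

const win95Era = framebufferBytes(640, 480, 8);    // 307200 bytes, ~0.3 MB
const modern4k = framebufferBytes(3840, 2160, 32); // 33177600 bytes, ~33 MB
// A single 4K frame is already several times Win95's entire 4 MB minimum.
```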


I mean, why is that a problem? Win95's engineering reflects the hardware of its time, the same way today's software engineering reflects the hardware of our time. There's no ideal here, there's no "this is correct", etc. It's all constantly changing.

This is like car guys today bemoaning the simpler carburetor age, or the car guys before them bemoaning the Model T age of simplicity. It's silly.

There will never be a scenario where you need all this lightweight stuff outside of extreme edge cases, and there's SO MUCH lightweight stuff it's not even a worry.

Also, it's funny you should mention Win95, because I suspect that reflects your age, but a lot of people here are from the DOS/first Mac/Win 2.0 age, and for that crowd Win95 was the horrible resource pig and complexity nightmare. Tech press and nerd culture back then were incredibly anti-95 for 'dumbing it all down' and 'being slow', but now it's seen as the gold standard of 'proper computing.' So it's all relative.

The way I see hardware and tech is that we are forced to ride a train. It makes stops, but it cannot stop; it will always go to the next stop. Wanting to stay at a certain stop doesn't make sense and is in fact counter-productive. I won't go into this, but Linux on the desktop could have been a bigger contender if the Linux crowd and companies had been willing to break a lot of things and 'start over' to be more competitive with Mac or Windows, which at the time did break a lot of things and did 'start over' to a certain degree.

The various implementations of the Linux desktop always came off clunky and tied to Unix-culture conventions which don't really fit the desktop model, which wasn't really appealing for a lot of people, and a lot of that was based on nostalgia and this sort of idealizing of old interfaces and concepts. I love KDE, but it's definitely not remotely as appealing as the Win11 or macOS GUI in ease of use.

In other words, when nostalgia isn't pushed back upon, we get worse products. I see so much unquestioned nostalgia in tech spaces; I think it's something that hurts open-source projects and even many commercial ones.


I agree with this take. Win95's 4MB minimum/8MB recommended memory requirement and a 20MHz processor are seen as the acceptable place to draw the line, but there were graphical desktops on the market before that on systems with 128K of RAM and 8MHz processors. Why aren't we considering Win95's requirements ridiculously bloated?


Yep, at the time the Amiga crowd was laughing at the bloat. But now it's suddenly the gold standard of efficiency? I think a lot of people like to be argumentative because they refuse to understand that they are engaging in mere nostalgia and not in anything factual or logical.


If you can compile the kernel, though, there is no reason that Win95 should be any smaller than your specifically compiled kernel - in fact, it should be much bigger.

However, this is of course easier said than done.


> There will never be a scenario where you need all this lightweight stuff

I think there are many.

Some examples:

* The fastest code is the code you don't run.

Smaller = faster, and we all want faster. Moore's law is over, Dennard scaling isn't affordable any more, smaller feature sizes are getting absurdly difficult and therefore expensive to fab. So if we want our computers to keep getting faster as we've got used to over the last 40-50 years then the only way to keep delivering that will be to start ruthlessly optimising, shrinking, finding more efficient ways to implement what we've got used to.

Smaller systems are better for performance.

* The smaller the code, the less there is to go wrong.

Smaller doesn't just mean faster, it should mean simpler and cleaner too. Less to go wrong. Easier to debug. Wrappers and VMs and bytecodes and runtimes are bad: they make life easier but they are less efficient and make issues harder to troubleshoot. Part of the Unix philosophy is to embed the KISS principle.

So that's performance and troubleshooting. We aren't done.

* The less you run, the smaller the attack surface.

Smaller code and less code means fewer APIs, fewer interfaces, and fewer points of failure. Look at djb's decades-long policy of offering rewards to people who find holes in qmail or djbdns. Look at OpenBSD. We all need better, more secure code. Smaller, simpler systems built from fewer layers mean more security, less attack surface, and less to audit.

Higher performance, and easier troubleshooting, and better security. There's 3 reasons.

Practical examples...

The Atom editor spawned an entire class of app: Electron apps, Javascript on Node, bundled with Chromium. Slack, Discord, VSCode: there are multiple apps used by tens to hundreds of millions of people now. Look at how vast they are. Balena Etcher is a, what, nearly 100 MB download to write an image to USB? Native apps like Rufus do it in a few megabytes. Smaller ones like USBimager do it in hundreds of kilobytes. A dd command in under 100 bytes.

Now some of the people behind Atom wrote Zed.

It's 10% of the size and 10x the speed, in part because it's a native Rust app.

The COSMIC desktop looks like GNOME, works like GNOME Shell, but it's smaller and faster and more customisable because it's native Rust code.

GNOME Shell is Javascript running on an embedded copy of Mozilla's Javascript runtime.

Just like dotcoms wanted to dis-intermediate business, remove middlemen and distributors for faster sales, we could use disintermediation in our software. Fewer runtimes, better smarter compiled languages so we can trap more errors and have faster and safer compiled native code.

Smaller, simpler, cleaner, fewer layers, fewer abstractions: these are all good things which are desirable.

Dennis Ritchie and Ken Thompson knew this. That's why Research Unix evolved into Plan 9, which puts way more stuff through the filesystem to remove whole types of API. Everything's in a container all the time, the filesystem abstracts the network and the GUI and more. Under 10% of the syscalls of Linux, the kernel is 5MB of source, and yet it has much of Kubernetes in there.

Then they went further, replaced C too, made a simpler safer language, embedded its runtime right into the kernel, and made binaries CPU-independent, and turned the entire network-aware OS into a runtime to compete with the JVM, so it could run as a browser plugin as well as a bare-metal OS. Now we have ubiquitous virtualisation so lean into it: separate domains. If your user-facing OS only runs in a VM then it doesn't need a filesystem or hardware drivers, because it won't see hardware, only virtualised facilities, so rip all that stuff out. Your container host doesn't need to have a console or manage disks.

This is what we should be doing. This is what we need to do. Hack away at the code complexity. Don't add functionality, remove it. Simplify it. Enforce standards by putting them in the kernel and removing dozens of overlapping implementations. Make codebases that are smaller and readable by humans.

Leave the vast bloated stuff to commercial companies and proprietary software where nobody gets to read it except LLM bots anyway.


I wonder if it would be possible to have gone directly to Zed, without going through Atom first (likewise, Plan 9 would never have been the first iteration of a Unix-like OS). "Rewrite it in Rust" makes a lot of sense if you have a working system that you want to rewrite, but maybe there's a reason that "rewrite it in Rust" is a meme and "write it in Rust" isn't. If you just want to move fast, put things up on the screen for people to interact with, and figure out how you want your system to work, dynamic languages with bytecode VMs and GC will get you there faster and will enable more people to contribute. Once the idea has matured, you can replace the inefficient implementation with one that is 10% of the size and 10x the speed. Adding lots of features and then pruning out the ones that turn out to be useless may also be easier than guessing the exact right feature set a priori.


This is true, but it is true quite generally. Even for UV-EPROMs, the retention time can be as low as 25 years if kept warm, even with the window sealed correctly. Magnetic drives are quite a lot better, at around 50 years.

CD-RWs vary more widely in their stability: I have ~20-year-old discs that are becoming unreadable because the actual foil is delaminating from the plastic disc. Meanwhile, I have ~40-year-old DS/DD floppies that are still fully readable even though their medium is in physical contact with the read/write heads (although here, again, storage conditions and especially the different brands/batches seem to make a difference).


