> Supporting only server-grade hardware and ignoring laptop/consumer-grade GPUs/APUs for ROCm was a terrible strategic mistake. Many developers experiment first and foremost on their personal laptops and scale up to expensive, professional-grade hardware later.
NVIDIA is making the same mistake today by deprioritizing the release of consumer-grade GPUs with high VRAM in favour of focusing on server markets.
They already have a huge moat, so it's not as crippling for them to do so, but I think it presents an interesting opportunity for AMD to pick up the slack.
I don't think it's a single axis even in the original poster's conception, since you could be both incorrect and also not pragmatic.
But if a fix needs to be described as pragmatic relative to the alternatives, that's probably because it couldn't be described as correct. Otherwise you wouldn't be talking about how pragmatic it is.
Damn, I was working on exactly the same idea not too long ago. It never really went anywhere because I couldn't find a speech rate detection algorithm that worked well enough to make it practically useful. I hope some more mathematically inclined people take a look at this idea and find a way to make it work.
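For anyone curious what a speech-rate detector even looks like: a common naive baseline (not what the parent tried, just an illustrative sketch with assumed frame sizes and thresholds) is to count peaks in the short-time RMS energy envelope, treating each energy burst as a syllable-like event:

```python
import numpy as np

def estimate_speech_rate(signal, sr, frame_ms=25, hop_ms=10, min_gap_ms=100):
    """Estimate syllable-like events per second by counting peaks in the
    short-time RMS energy envelope (a crude baseline, not production-grade)."""
    frame = int(sr * frame_ms / 1000)
    hop = int(sr * hop_ms / 1000)
    n_frames = 1 + (len(signal) - frame) // hop
    env = np.array([
        np.sqrt(np.mean(signal[i * hop : i * hop + frame] ** 2))
        for i in range(n_frames)
    ])
    # Smooth the envelope to suppress ripple before peak-picking.
    env = np.convolve(env, np.ones(5) / 5, mode="same")
    thresh = 0.5 * env.max()
    min_gap = int(min_gap_ms / hop_ms)  # refractory gap between events
    peaks, i = [], 1
    while i < len(env) - 1:
        if env[i] > thresh and env[i] >= env[i - 1] and env[i] >= env[i + 1]:
            peaks.append(i)
            i += min_gap
        else:
            i += 1
    return len(peaks) / (len(signal) / sr)

# Synthetic check: a 220 Hz tone amplitude-modulated at 4 "syllables"/sec.
sr = 16000
t = np.arange(0, 2.0, 1 / sr)
signal = np.sin(2 * np.pi * 220 * t) * 0.5 * (1 - np.cos(2 * np.pi * 4 * t))
rate = estimate_speech_rate(signal, sr)
```

On real speech this kind of envelope peak-counting falls apart quickly (coarticulation, varying loudness, background noise), which is presumably the "worked well enough" wall the parent hit.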
If their whole business is based around being an established standard and making users happy is not a relevant goal, then why do anything at all? They already are an established standard, so why would they bother taking any further actions whatsoever, making any changes or rolling out any new products? Clearly they are trying to achieve something, right? So what is it?
It is about making specific high-value users happy. If the rest of us are unhappy, we don't matter. They know that for most people Ubuntu or whatever isn't a realistic option, so they can take whatever money they can get from them. Sure, a few people like me will run *BSD or Linux, but we are a footnote not worth their time.
The only danger is that every once in a while one of those little footnotes grows large enough to be a problem, and you lose the market of those who do matter as well. There are many obvious examples of where that happened, but also plenty of cases where it didn't.
Wow, these preassembled ESP32 plus touchscreen boards are extremely cheap, and there are tons of them in all kinds of different form factors on Amazon. I didn't realize this kind of thing was so plentiful; this seems like a great way to bootstrap many kinds of electronics/IoT projects.
Here’s a list of just a few. They’re insanely popular not only because they’re just good to use, but also because they’re among the cheaper FCC-approved modules you can buy, which takes a lot of the pain out of bringing a product to market.
IIRC Apple has attempted to implement some defences against this, for example by requiring the passcode to be entered before an update can be installed, to prevent another San Bernardino scenario. A cursory search indicates that they also have some kind of transparency log system for updates, but it seems to apply only to their cloud systems and not iOS updates.
The table has two categorizations: "In transit & on server" and "End-to-end". The former, which covers iCloud backups in the default configuration, is explicitly NOT end-to-end, meaning there are moments in time during processing where the data is not encrypted.
However, iCloud backups actually are listed as "End-to-end" if you turn on the new Advanced Data Protection feature.