fwip's comments | Hacker News

And having a colony on Mars will be profitable because of...?

Was the British colonization and funding of Canada, New Zealand, and Australia profitable? None of the three colonies was profitable for decades after its founding.

Yet looking back, colonialism was probably the most profitable venture ever undertaken. All three of them ended up becoming key allies and instrumental trading partners.

Think on a longer time scale.


Building those colonies involved a lot of slavery and forced or indentured labour.

I'm pretty sure that Britain actually had pretty specific goals of profitability from the get-go.


Am I the bozo with this? I assure you I don’t think I am very smart.

It's not just speed - incremental parsing allows for better error recovery. In practice, this means that your editor can highlight the code as-you-type, even though what you're typing has broken the parse tree (especially the code after your edit point).
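A quick sketch of what that looks like with the tree-sitter Python bindings (package names and the exact Parser constructor vary by version, so treat this as illustrative rather than canonical):

    import tree_sitter_python as tspython
    from tree_sitter import Language, Parser

    parser = Parser(Language(tspython.language()))

    # Mid-edit source: the parameter list is broken.
    src = b"def f(:\n    return 1\n"
    tree = parser.parse(src)
    print(tree.root_node.has_error)  # True, but the rest of the file still parses

    # Tell the old tree where the edit happened (inserting "x" at byte 6),
    # then reparse; unchanged subtrees are reused instead of rebuilt.
    new_src = b"def f(x):\n    return 1\n"
    tree.edit(start_byte=6, old_end_byte=6, new_end_byte=7,
              start_point=(0, 6), old_end_point=(0, 6), new_end_point=(0, 7))
    tree = parser.parse(new_src, tree)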

Thank you for being up-front in disclaiming that this project is AI-written, both here and in the Github page. I really appreciate the transparency.

I think reasonable people can disagree on this.

From the point of view of an individual developer, it may be "fraction of tasks affected by downtime" - which would lie between the average and the aggregate, as many tasks use multiple (but not all) features.

But if you take the point of view of a customer, it might not matter as much 'which' part is broken. To use a bad analogy, if my car is in the shop 10% of the time, it's not much comfort if each individual component is only broken 0.1% of the time.
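Back-of-the-envelope, with made-up numbers (nothing here is GitHub's actual data):

    # Each of 100 independent features is up 99.9% of the time.
    per_feature = 0.999
    aggregate = per_feature ** 100      # probability that *everything* is up
    print(round(aggregate, 3))          # ~0.905: "something is broken" ~10% of the time
    # A task touching 5 features sits in between the two figures:
    print(round(per_feature ** 5, 4))   # ~0.995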


> But if you take the point of view of a customer, it might not matter as much 'which' part is broken. To use a bad analogy, if my car is in the shop 10% of the time, it's not much comfort if each individual component is only broken 0.1% of the time.

Not to go too far out of my way to defend GH's uptime, because it's obviously pretty patchy, but I think this is a bad analogy. Most customers won't have a hard dependency on every user-facing GH feature. Or to put it another way, only a tiny fraction of users actually experienced something like the 90% uptime reported by the site. Most people, in practice, are probably experiencing something like 97-98%.


Sorry, by 'customer' I meant to say something like a large corporate customer - you're buying the whole package, and across your org, you're likely to be a little affected by even minor outages of niche services.

But yeah, totally agree that at the individual level, the observed reliability is between 90% and 99%, and probably toward the upper end of that range.


Or if your kettle is not working, is the whole house considered not working?

I've been on a flight that was late leaving the gate because the coffeemaker wasn't working.

A better analogy is one burnt-out bulb in the right rear brake-light cluster. Technically the car is broken. But realistically you will be able to do all the things you want to do, unless the thing you want to do is check that every brake-light bulb works.

That's an awful analogy, precisely because "realistically you will be able to do all the things you want to do". If a random GitHub service goes down, there's a significant chance it breaks your workflow. It's not every time, but it's far from zero.

One bulb in the cluster going out is like a single server at GitHub going down, not a whole service.


Is there any audio you might play that doesn't fit in 400 Mbps?

The point isn’t really about audio bandwidth; it’s about the cable being strangely overbuilt for what it actually does.

It’s rigid and thick, like a Thunderbolt 3 cable, yet only supports USB 2.0 speeds and fast charging for a device that doesn’t need fast charging.

Compare that to Apple’s iPhone USB-C cable which is thin, flexible, and supports the same features.

That matters because someone might grab that cable assuming it’s a “better cable”: it came with a £629 product, it’s thick and feels serious, so surely it’s capable. But it isn’t. And there’s nothing marked on it to tell you otherwise.

The whole system ends up relying on presumption, which is exactly the problem the device in the article is solving.


> The point isn’t really about audio bandwidth; it’s about the cable being strangely overbuilt for what it actually does.

The purpose of the heavy construction is to make it durable, not to carry 5 Gbps data streams to your headphones.

Unlike most USB peripherals, such as printers and keyboards, which get plugged in and then don't move around, headphones sit on your head and their cables move constantly. They can get pinched in drawers or snagged on corners.

Hence the more durable construction.


Apple's woven USB-C cable gets dragged around with iPhones, iPads and laptops daily and manages durability at half the thickness. Durability doesn't require rigidity... in fact for a headphone cable, rigidity is the opposite of what you want. Stiff cables tug on the headphones and transmit mechanical noise.

You don’t wear your iPhone or iPad on your head with the cable plugged in all day like you do with headphones.

Apple’s USB iPhone cables wearing out prematurely is so common it’s a meme.


Not sure why you're being downvoted.

Maybe Apple's changed their cables recently, but the fragility is the reason I avoid Apple cables.

Especially in headphones. The number of times those broke during a bike ride or run was way too high for me to keep wasting money on them, knowing full well they weren't going to last more than a few months, just like every other pair of Apple headphones I've ever had.



It’s common to add weights to headphones to make them feel premium, which is bizarre, since genuinely premium headphones try very hard to reduce weight; extra weight makes them less comfortable.

I don’t know how to fix the market, especially when consumers keep rewarding these practices, and I think the effectiveness of TikTok-style influencer marketing will make it worse.


I don’t think that’s what’s happening here. B&W actually reduced the weight on the Px8 S2 compared to the original, and the headphones themselves are genuinely lightweight for what they are. The cable isn’t thick to “feel premium” (it feels kinda bad); it’s thick because it’s rated for 65W+ power delivery that the headphones don’t need.

The problem is the opposite of what you’re describing: it’s not a cynical design choice, it’s a lazy one. They probably just purchased a cable with capabilities irrelevant to the product, and the result is worse ergonomics and misleading physical cues about what the cable can actually do.


“I don’t think...” Ok, you’ve made a number of assumptions, and we don’t share the same priors, so I’m unable to follow you to your conclusion.

I think you are underestimating the importance of perceived premium combined with the pressures of cost accounting, but I do think that is pretty normal for ‘audiophiles’ which is their target market.


Which assumptions? The weight reduction on the S2 is documented and the cable’s 65W rating is what the tester confirmed.

If the argument is that B&W deliberately chose a thick cable to seem premium, it doesn’t square with them actively slimming down the headphones. B&W are primarily a speaker company; their USB-C product range is basically just a few headphones and earbuds.

More likely they just sourced a generic cable that happened to support high wattage and didn’t think about the mismatch.

Either way, we’re deep in the weeds on B&W’s cable procurement now. The root point is that USB-C is a mess. You can’t tell what a cable supports by looking at it, and even premium manufacturers are shipping cables that don’t do what you’d reasonably expect.

That’s exactly the problem the Treedix from the article solves.


My point on weight was that the practice is common in that market, which is probably a stronger statement than needed. I should have made the weaker argument and said the market exists, which only needs one example. The company Beats can serve as that example: it sells the majority of premium headphones, though I don’t actually know what percentage of them have weights placed inside. I am assuming a non-trivial percentage.

You are using circular reasoning: you assume the premise is true, and from there you derive your evidence.

I would contend that someone thought about it and decided to go with the cheaper option because they could get away with it. I would consider my assumption to have more grounding given my experience with manufacturing and cost accounting.


You’ve gone from “companies add weight to feel premium” to “they went with the cheaper option because they could get away with it.” Those are opposite explanations. But either way, the cable doesn’t do what its physical presence suggests, nothing on it tells you otherwise, and that’s the entire point of the device in the article.

My position is entirely consistent: it is cheaper to signal premium quality than to actually deliver it. The point I am making is that there is immense commercial pressure to do this in a highly competitive market when selling to consumers who don’t know better.

My example of weights is that steel weights are cheaper than the alternative of using heavier drivers; by adding weight, they are signaling premium without delivering it. Similarly with the USB cable: consumers assume such cables are thick because of thicker wires and better shielding, and it’s cheaper to make a thick cable without those features, once again signaling premium without actually providing it.


That's a more coherent version of your argument, but it's still speculative. You're attributing a deliberate strategy to what is more easily explained by indifference. B&W make about four products with USB-C cables. This isn't a company with a cable strategy, cynical or otherwise.

Fourth time’s the charm. You’ve provided no evidence for indifference. My point remains: given industry standards, indifference would be highly unusual and not at all a safe assumption.

The vast majority of high-volume consumer manufacturers use cost-accounting practices that would absolutely be tracking and attributing the USB cable’s cost, and the whole point of that accounting practice is to constantly be thinking about minimizing the cost of even the smallest inputs, all the way down to the individual screws used. Yes, they’re thinking about how to save hundredths of a cent on each screw.


The reason it is thick is that it supports 65W charging. Apple did the same with the USB-C cables that shipped with the pre-MagSafe MacBooks: a thicker cable that supported 100W charging but was only USB 2.0.

Can you help me understand why that would be a reason to compromise the comfort of the cable supplied with a device that charges at 5W?

Or, why Apple manages the same in half the footprint?

Or, why someone would expect that a cable that came with a pair of headphones actually charges things at over 65W?


Like most things in the audiophile world, it's more about aesthetics than anything else. A big cable looks like it means business.

I think that's being a bit uncharitable to B&W specifically; they're one of the few headphone companies where the engineering does back up the price. The cable is the odd one out.

I don't have an informed opinion of B&W either way, but are you sure it's not an instance of Gell-Mann amnesia?

The headphones have equivalent performance whether a USB 2 cable is connected, or a USB 3 cable is connected. The headphones themselves are not USB 3 devices; the addition of USB 3 cabling instead of USB 2 cabling would change absolutely nothing about how they work.

So, no: I wouldn't expect the cable for a pair of headphones (of any price) to support USB 3. That represents extra complexity (literally more wires inside) that is totally irrelevant for the product the cable was sold with. (The cables included with >$1k iPhones don't support USB 3, either.)

Meanwhile: fast charging. All correctly made USB C cables support at least 3 amps at 20 volts, or 60 watts. This isn't an added-cost feature; it's just what the bare-minimum, no-emarker-inside specification requires. A 25-cent USB C-to-C cable from Temu either supports 60W of USB PD, or it is broken and in defiance of USB-IF's specifications.

---

Now, of course: The cable could be thinner and more flexible and do these same things. That'd probably be preferred, even: Traditional analog headphones often used very deliberately thin cables with interesting construction (like using Litz wire to reduce the amount of internal plastic insulation) to improve the user's freedom of movement, and help prevent mechanical noise from the cables dragging across clothes and such from being telegraphed to the user's ears.

Using practical cabling was something that headphone makers strived to be good at doing. I'm a little bit annoyed to learn that a once-prestigious company like B&W is shipping cables with headphones that are the antithesis of what practical headphone cables should be.

---

But yeah, both USB C cables and the ports on devices could be better marked so we know WTF they do, to limit the amount of presumption required in the real world. So that a person can tell -- at a glance! -- what charging modes a device accepts or provides, or whether it supports video, or whether it is USB 2 or USB 3, or [...].

Prior to USB C, someone familiar with the tech could look at a device or a cable and generally succeed at visually discerning its function, but that's broadly gone with USB C. What we have instead is just an oblong hole that looks like all of the other oblong holes do.

After complaining about this occasionally since the appearance of USB C a decade or so ago, I've come to realize that most people just don't care about this -- at all. Not even a little bit. Even though these things get used by common people every day, the details are completely out of the scope of their thought processes.

It doesn't have to be this way, but it's not going to change: Unmarked ports are connected together with unmarked cables and thus unknown common capabilities are just how we roll.


The Litz wire point is pretty spot on: traditional headphone manufacturers understood that cable ergonomics mattered. Somewhere in the transition to USB-C, that institutional knowledge just evaporated.

Your last paragraph is depressingly accurate though. I think that's exactly why devices like the Treedix exist: the standards bodies and manufacturers clearly aren't going to fix the marking problem, so now we need test equipment to figure out what our own cables do.


> The Litz wire point is pretty spot on: traditional headphone manufacturers understood that cable ergonomics mattered. Somewhere in the transition to USB-C, that institutional knowledge just evaporated.

"I heard what you guys are planning and I talked to my financial guy. He said I have enough to put a manufactured home on some land in some desolate place like the Dakotas or central Wisconsin, as long as I keep a bit of supplemental income and live a little lower. So I'm going to do that, and take my chances on growing artisanal rutabaga to sell at farmers markets.

I've already packed up the Prius. I just stopped by to wish you kids luck with your new headphone project and tell you that I won't be back."


No. CD audio is 1.4 Mbit/s. Even increasing the sample rate and bit depth beyond that, which is audiophile nonsense, will never even approach USB 2 speeds.
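The arithmetic, for reference:

    # CD audio: 44,100 samples/s, 16 bits/sample, 2 channels
    print(44_100 * 16 * 2)   # 1,411,200 bits/s, i.e. ~1.4 Mbit/s
    # USB 2.0 signals at 480 Mbit/s -- over 300x more than CD audio needs.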

Real answer - it's because an LLM is better than you at the things you suck at.

For executives, that's writing code. For ICs, it's other stuff.


Yep. I have one of these on my desk: https://www.amazon.com/Generic-Cattop/dp/B09F8QQPJH/ (a heating pad shaped like a laptop), and our cat will spend 90% of my workday on it, and the remaining 10% is spent getting my attention (or getting lunch).

I seem to remember Windows XP using tabs in a lot of its settings pages - and possibly earlier versions as well.

It did, but those were static tabs. It was pretty easy to create tabs as a form of sub-organization. But the treatment of tabs as documents was new-ish with Chrome/Firefox. Other applications treated multiple concurrent document views as whole, resizable sub-windows inside an "MDI" panel.

Look at how older versions of Word, Excel, and Visual Studio worked. The tool trays stay consistent as you move between document windows. The entire application is minimizable and quittable together as one.

Photoshop still uses this metaphor. In the early and mid-2000s, Photoshop on Windows had a window for the application separate from the documents, but on Apple OS 9 and OS X, the only representation of the application itself was in the menu bar. Document windows and tooltray windows both floated in the same desktop space as every other window.

I haven't checked on the GNU Image Manipulation Program, but I seem to remember it retained the same "no application window, tooltrays and doc windows exist in the DE" metaphor for much longer than Photoshop.

There is also a difference in the way that Chrome renders tabs in the window title area. That's a part of the UI chrome that one would expect to be in the purview of the UI toolkit, but Google took it on themselves.


Virtual desktops on Unix predate Visual Studio. I'm pretty sure there was a concept of tabbed interfaces somewhere in Amiga or BeOS or some other OS.

https://en.wikipedia.org/wiki/Tab_(interface)

Don Hopkins himself can enlighten us about it (NeWS) better than I or literally anyone else in this thread could; just wait.


What does that have to do with my criticism that the two most popular operating systems failed to innovate or adapt in areas that showed obvious need?

"Tall women aren't women."

Every single person in that meet "pushed someone else" out of that meet. That's how competition works.

So your position is that men can freely play in women's sports?
