Anyway, I don't get why updates are so big these days. The games themselves too, but I can chalk that up to lazy programming... the updates, though? I really wonder how hard it can be to reliably patch a binary without replacing the whole thing, since apparently that difficulty outweighs the extra bandwidth cost for pretty much every company that does any sort of software patching.
One problem you face here is versioning. You can have a tiered system with a patch from version 1 to 2, 2 to 3, and so on up to n; then if somebody wants to go from version 1 to n, they incrementally apply every single patch in between. Or you can ship a replacement 'patch' that works no matter what version or setup you're on: everyone follows the exact same process.
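Roughly, the two strategies look like this (a toy Python sketch, not any real updater's API; apply_delta and the version numbers are stand-ins for whatever actually rewrites the files):

    # Toy sketch of the two strategies; every name here is hypothetical.
    from pathlib import Path

    def update_via_chain(installed: int, latest: int, apply_delta) -> None:
        # Tiered: walk every intermediate delta, 1 -> 2 -> ... -> n.
        # The server has to host (and test) a delta for every single step.
        for v in range(installed, latest):
            apply_delta(v, v + 1)

    def update_via_replacement(payload: bytes, target: Path) -> None:
        # Replacement: one payload, identical process for every installed
        # version. No version bookkeeping on either end.
        target.write_bytes(payload)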
And in some cases this is not only easier, but can also provide major performance benefits. For instance, imagine an update changes a local database in a way that triggers some expensive process like reindexing, then another update makes further changes, and so on. Going straight from 1 to n instead of through 1 to 2 to ... to n can be vastly more efficient from the user's perspective.
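Back-of-the-envelope version of that (the numbers are invented purely for illustration):

    # Toy cost model: assume every schema-touching patch forces a full
    # reindex of the local database. All numbers are made up.
    REINDEX_MINUTES = 5
    PATCHES_BEHIND = 10   # user is on version 1, latest is version 11

    chained = PATCHES_BEHIND * REINDEX_MINUTES  # reindex after every step
    direct = 1 * REINDEX_MINUTES                # one cumulative patch
    print(f"chained: {chained} min, direct: {direct} min")  # 50 vs 5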
But if you're seeing huge updates, then more often than not it's data, not code, though the same story applies there.
It's not only binaries that get patched. Data files, which can run to multiple GB, are patched too. You can either delta-patch them, which costs significant CPU time, or replace them entirely. Most devs prefer the latter.
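For a feel of what the cheap end of delta patching looks like, here's a naive block-level diff (a sketch only; it assumes the old and new files are the same length, while real tools like bsdiff or xdelta find matches at arbitrary offsets, which is where the CPU time actually goes):

    # Naive block-level delta: ship only the 64 KiB blocks that changed.
    # Assumes old and new files have equal length; real delta formats
    # handle insertions/deletions and compress the result.
    BLOCK = 64 * 1024

    def make_delta(old: bytes, new: bytes) -> list[tuple[int, bytes]]:
        delta = []
        for off in range(0, len(new), BLOCK):
            if old[off:off + BLOCK] != new[off:off + BLOCK]:
                delta.append((off, new[off:off + BLOCK]))
        return delta

    def apply_delta(old: bytearray, delta: list[tuple[int, bytes]]) -> bytes:
        for off, block in delta:
            old[off:off + len(block)] = block
        return bytes(old)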
I gladly accept the occasional 10 GB update in return for a game that doesn't crash multiple times a day or week.