Hacker News | new | past | comments | ask | show | jobs | submit | login

Hard agree on the backwards compatibility. It appears to be some law of nature that "our next compiler must compile 20 year old code".

Instead, new platform libs and compilers could simply default to some reasonable cutoff date and say: "You need to install an ancient compiler to build this."

There is nothing that prevents me from building my old project with an older set of tools. If I want to make use of newer features then I'm happy to continuously update my source code.



>It appears to be some law of nature that "our next compiler must compile 20 year old code".

Some examples of companies/products that didn't maintain backwards compatibility are Delphi and Angular. Both are effectively dead. .NET Core wasn't backwards compatible with .NET Framework, but MS created .NET Standard to bridge that gap: a library targeting .NET Standard can be consumed from both .NET Core and .NET Framework. It's not perfect, but apparently it was good enough.

Companies usually won't knowingly adopt a technology that will be obsoleted in the future and require a complete rewrite. That's a disaster.


But that's .NET, not C#. Language and platforms are different. Libraries must be compatible (because you don't know if your library will be consumed in a newer app).

But the compiler only consumes syntax (C# 11, C# 12, C# 13, and so on), so I don't see why a compiler that eats C# 13 must necessarily swallow C# 5 without modification.
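For what it's worth, the C# compiler already gets close to this: you can pin a project to an older language version so newer syntax is rejected. A minimal project-file sketch (LangVersion is a real MSBuild property; the cutoff idea above would essentially make this mandatory):

```xml
<!-- In the .csproj: compile this project as C# 5, rejecting newer syntax -->
<PropertyGroup>
  <LangVersion>5</LangVersion>
</PropertyGroup>
```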


They made a breaking change in a recent C# version: with nullable reference types, a reference that may be null must be annotated with a ?. So old code is:

  public Patient Patient { get; set; }
The same thing in modern code would be

  public Patient? Patient { get; set; }
because with nullable reference types enabled, references are assumed non-null by default. Fortunately there is a project setting to turn this off, but it's enabled by default in new project templates.

As a guy who has worked in C# since 2005, a breaking change would make me pretty irate. Backwards compatibility has its benefits.

What issues do you have with backwards compatibility?


NRT wasn't really breaking, as it only produces warnings, which you control at the top level. There have been some real breaking changes in edge cases, but they are pretty few and far between. I think the language could be better if it were always structured in the best way possible, rather than in the most backwards-compatible way.
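To illustrate the "warnings, not errors" point, here is a minimal sketch (the Patient/Find names are made up for the example):

```csharp
#nullable enable
using System;

class Patient { public string Name { get; set; } = ""; }

class Demo
{
    // The '?' tells the compiler this can legitimately return null.
    static Patient? Find(string name) =>
        name.Length > 0 ? new Patient { Name = name } : null;

    static void Main()
    {
        Patient? p = Find("");
        // Writing "Console.WriteLine(p.Name);" here would produce warning
        // CS8602 (possible null dereference), but the build would still
        // succeed unless warnings are promoted to errors.
        Console.WriteLine(p is null);  // prints True
    }
}
```

So old code keeps compiling; it just gets noisier until you add the annotations.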

As a class library example (which is contrary to what I said earlier about .NET compatibility vs C# compatibility): it was a massive mistake to let double.ToString() use the current culture rather than the invariant culture. It should change either to require always passing a culture (a breaking API change) or to use InvariantCulture (a behaviour change requiring code changes to keep the old behaviour).
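A minimal sketch of the pitfall: the no-argument overload silently depends on the thread's current culture, which bites anyone serializing numbers on machines with comma-decimal locales.

```csharp
using System;
using System.Globalization;

class Demo
{
    static void Main()
    {
        double d = 1.5;

        // Simulate running on a machine with a comma-decimal locale.
        CultureInfo.CurrentCulture = new CultureInfo("de-DE");

        Console.WriteLine(d.ToString());                             // "1,5"
        Console.WriteLine(d.ToString(CultureInfo.InvariantCulture)); // "1.5"
    }
}
```

The first output can't be parsed back with double.Parse under the invariant culture, which is exactly how the round-tripping bugs happen.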


>a massive mistake to let double.ToString() use the current culture rather than the invariant culture.

I would imagine that's a carryover from the Win32/Client-Server days when that would have been a better choice.

Is that annoying? Yea. Is that annoying enough to force companies to collectively spend billions to look through their decades old codebases for double.ToString() and add culture arguments? Also keep in mind, this is a runtime issue, so the time to fix would be much more than if it were a compile issue. I would say no.


Nowadays you just apply https://learn.microsoft.com/en-us/dotnet/core/runtime-config... and call it a day. It is also added as a default to all AOT templates.
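Assuming the linked switch is invariant globalization mode (the URL is truncated, so this is a guess from context), it's a one-line project setting, and it is indeed what the Native AOT templates set by default:

```xml
<!-- .csproj: format and parse with the invariant culture everywhere -->
<PropertyGroup>
  <InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
```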


That's a great idea (and, after the fact, much better than changing the API). It should have been this easy from day 1, though.


> Delphi

Just the move to Unicode (i.e. from 2007 to 2009) took some work, but otherwise I can't think of any intentional breaking changes...? In fact, it's one of the most stable programming environments I know of – granted, in part because of being a little stagnant (but not dead).


I seem to recall some in Delphi 4, but it's been forever.


Ah yes, the version released in 1998. Let's ignore the 26 years since then...

I've been using Delphi since Delphi 3. The only really breaking change I can recall was the Unicode switch. And that was just a minor blip really. Our 300kloc project at work took a couple of days to clean up the compiler errors and it's been Unicode-handling ever since. It's file integration and database heavy, so lots of string manipulation.

Most of my hobby projects didn't need any code changes.

In fact, the reason Delphi was late to the Unicode party was precisely because they spent so much time designing it to minimize impact on legacy code.

Not saying there haven't been some cases, but the developers of Delphi have focused heavily on keeping existing code running fine. We have a fair bit of code in production that is decades old, some from before Y2K, and it just keeps on ticking without modification as we upgrade Delphi to newer versions.


>Let's ignore the 26 years since then...

The market has been ignoring Delphi for that long. It probably peaked with D5; once they changed their name from Borland to Inprise, it was over.

I hear it's still somewhat popular in Eastern European countries, but I heard that several years ago.


You don't need to rewrite an old .NET project to compile it on a new machine.

But it's also not a trivial task.



