> so high performance "python" is actually going to always rely on restricted subsets of the language that don't actually match language's "real" semantics.
I don't even understand what this means. If I write `def foo(x):` versus `def foo(x: int) -> float:`, one is a restricted subset of the other, but both are the language's "real" semantics. Restricted subsets of languages are wildly popular in programming languages, and for very varied reasons. Why should that be a barrier here?
Personally, if I have to annotate some of my code so that it runs with C-style semantics, but in return that part runs at C speed, for example, then I don't really mind. Different tools for different jobs.
> If I write `def foo(x):` versus `def foo(x: int) -> float:`, one is a restricted subset of the other, but both are the language's "real" semantics.
Either you're performing some wordplay here, or you don't understand: type hints are not part of the semantics at all. Since they are not processed at runtime, they do not affect the behavior of the function (and that is what semantics means).
EDIT: according to the language spec and current implementation
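A minimal sketch of this point, using the `foo` from upthread (this reflects CPython's default behavior; a type checker like mypy would flag the mismatched call, but the interpreter does not):

```python
# Annotations are recorded as metadata but never enforced at runtime.
def foo(x: int) -> float:
    return x * 2

# The hints are stored on the function object...
print(foo.__annotations__)  # {'x': <class 'int'>, 'return': <class 'float'>}

# ...yet calling with a "wrong" type raises no error: the hints do not
# change what the function does.
print(foo("ab"))  # 'abab' -- a str, despite the declared int -> float
```

So the annotated and unannotated versions of `foo` behave identically; the hints only matter to external tools that choose to read them.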