
Maybe it's just me, but there's something fundamentally unappealing about bolt-on static type systems on top of pre-existing dynamic languages. Sure, what's come about with e.g. TypeScript is pragmatic and very useful, but if I imagine a future utopia, TS/JS++ and similar would not feature in the dream.


It's a nightmare, not a dream, if the language is mutable. Mutability is the worst and least fixable issue with almost every popular language today.


I don't really like TypeScript; I think I would rather just code in JavaScript. TypeScript adds a lot of structure and tries to shoehorn in more of a Java-like programming style. A lot of the type checks seem unnecessary. If I am going to use types, I would rather use Fable or ReasonML, something that really just has types from the start.


Why not? I can't think of anything off the top of my head that I think is fundamentally broken about TypeScript, or even that I strongly dislike about it.


- Too many statements instead of expressions.

- Lacks structural pattern-matching.

- Lacks first-class sum-types (although they can be clumsily encoded as objects with "kind" properties).

- Static type-checking not directly (or at all) contributing to the execution performance of the code. This is particularly egregious with e.g. CPython.

The presence of the first three is the sort of thing people appreciate in Good static languages (as opposed to the status-quo static languages you alluded to, the ones Python/Ruby/JS/etc were rebelling against in the 2000s).

The last is the dead giveaway of not utopia but rather a bizarre compromised situation.
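The clumsy "kind"-property encoding of sum-types mentioned above can be sketched like this (Shape and area are hypothetical names for illustration):

```typescript
// A discriminated union: TypeScript's workaround for first-class sum-types.
type Shape =
  | { kind: "circle"; radius: number }
  | { kind: "rect"; w: number; h: number };

// No structural pattern-matching either; a switch on the discriminant
// property is the usual idiom.
function area(s: Shape): number {
  switch (s.kind) {
    case "circle": return Math.PI * s.radius ** 2;
    case "rect":   return s.w * s.h;
  }
}

console.log(area({ kind: "rect", w: 2, h: 3 })); // 6
```

It works, but every variant has to carry the tag manually, and the compiler only checks exhaustiveness indirectly through the inferred return type.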


Depending on who you ask, the fact that the type system is unsound could be considered a form of brokenness.


A small sample:

The existence of `any`; worse yet, the use of `any` in the standard library.

Mutable arrays are treated as covariant: the classic `cats: Cat[]; animals: Animal[] = cats; animals.push(dog);` problem.
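A minimal sketch of that hole (Animal/Cat/Dog are illustrative names); this compiles cleanly and is wrong at runtime:

```typescript
interface Animal { name: string }
interface Cat extends Animal { meow(): string }
interface Dog extends Animal { bark(): string }

const cats: Cat[] = [{ name: "Felix", meow: () => "meow" }];
const animals: Animal[] = cats; // accepted: Cat[] treated as subtype of Animal[]
const dog: Dog = { name: "Rex", bark: () => "woof" };
animals.push(dog);              // also accepted: Dog is an Animal

// cats[1] is statically typed Cat, but at runtime it's a Dog with no meow().
console.log("meow" in cats[1]); // false
```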

Methods are both co- and contravariant in their arguments by default, which is comically wrong. Member variables which are functions, meanwhile, are handled correctly.
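A sketch of the method hole (names are illustrative): with `strictFunctionTypes` on, a function-typed *property* would reject the unsafe direction below, but *method*-syntax parameters are still checked bivariantly, so this compiles:

```typescript
interface Animal { name: string }
interface Cat extends Animal { meow(): string }

const takesCat = (c: Cat) => c.meow();

// Method syntax: the parameter check is bivariant, so a Cat-handler is
// accepted where an Animal-handler is required...
const handler: { handle(a: Animal): void } = { handle: takesCat };

// ...and blows up at runtime when handed a plain Animal.
try {
  handler.handle({ name: "Rex" });
} catch (e) {
  console.log("runtime error:", (e as Error).message); // meow is not a function
}
```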

`readonly` is a lie:

    const test: {readonly a: number } = {a: 0}
    const test2: {a: number} = test
    test2.a = 5
`Record<string, string>` is actually `Record<string, string | undefined>`. There's no way for an interface to specify that it really does return a value at any string index (e.g. for a map with a default value).
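A quick sketch of the index-read hole, assuming the default compiler settings (`noUncheckedIndexedAccess` off); `env` is an illustrative name:

```typescript
const env: Record<string, string> = { HOME: "/root" };

// Statically typed `string`, but the key is absent, so the value is undefined.
const missing = env["NO_SUCH_KEY"];
console.log(typeof missing); // "undefined" at runtime, despite the static type
```

Turning on `noUncheckedIndexedAccess` types the read as `string | undefined`, but then there is no way back: an interface can't promise a defined value at every string index (e.g. a map with a default).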

`{...object1, ...object2}` is typed as the intersection `typeof object1 & typeof object2`, which is not correct.

    const a: {a: 5} = {a: 5}
    const b: {a: 4} = {a: 4}

    const spread = <L, R>(l: L, r: R): L & R => ({...l, ...r})
    const impossible: never = spread(a, b)
You have to resort to bizarre conditional type hackery to control when unions distribute and when they don't.
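The usual dance, sketched with illustrative type names: a naked `T` in `T extends ...` distributes over union members, and wrapping both sides in a one-tuple is the conditional-type trick that turns distribution off:

```typescript
// Distributive by default: applied member-by-member over the union.
type ToArray<T> = T extends unknown ? T[] : never;
type Distributed = ToArray<string | number>;           // string[] | number[]

// One-tuple wrapping disables distribution.
type ToArrayNonDist<T> = [T] extends [unknown] ? T[] : never;
type NonDistributed = ToArrayNonDist<string | number>; // (string | number)[]

// A mixed array is valid only under the non-distributed type:
const ok: NonDistributed = ["x", 1];
// const bad: Distributed = ["x", 1]; // error: neither string[] nor number[]
console.log(ok.length); // 2
```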

You can constrain generics with an upper bound, as in `foo = <T extends string>(t: T) => ...`, but there is no lower-bound form like `<string extends T>`, which makes many functions (e.g. Array.includes) much too strict about what they accept.
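The Array.includes case, sketched with illustrative names: `includes` takes the array's element type, so a wider value is rejected even though the question is perfectly sensible, and the common workaround is to widen the array instead:

```typescript
const dirs = ["north", "south"] as const;
const input: string = "east";

// dirs.includes(input); // compile error: `string` is not assignable to
//                       // the element type `"north" | "south"`

// Workaround: widen the array's element type before asking.
const found = (dirs as readonly string[]).includes(input);
console.log(found); // false
```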


Its type system is apparently Turing complete. That's a pretty major flaw, imo: it means static analysis isn't always possible, because type checking isn't guaranteed to terminate.


Turing complete type systems are extremely common. C#, C++, Java - it's hard to avoid if you have subtyping and generics. In practice it almost never comes up.



