Swift regret: inferred types for globals and stored properties— Jordan Rose (@UINT_MIN) September 24, 2021
Part of the Swift Regrets series.
In Swift, as in many other statically-typed languages these days, you can say `var x = 5` and the compiler figures out that `x` is an `Int`. Sometimes this is extremely useful, e.g. for types like `Slice<RingBuffer<Int>>`. (Though I can hear the non-generic language folks saying "types dreamed up by the utterly deranged!") Most of the time, a local's type is clear enough from its name and use.
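To make that concrete, here's a small sketch using only standard-library types (the post's `RingBuffer` isn't a standard type, so slices of arrays stand in for it):

```swift
let numbers = [10, 20, 30, 40]
// `middle`'s type, ArraySlice<Int>, is inferred; writing it out adds noise.
let middle = numbers[1..<3]

// Nested generics get unwieldy fast; inference keeps the declaration readable:
let rows = [[1, 2], [3, 4], [5, 6]]
let tail = rows[1...]   // inferred as ArraySlice<[Int]>
```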
Aside: another thing you can do is specify part of the type. This is useful for collection literals:
```swift
let items: Set = [foo, bar, baz] // Set<Item>, not Array<Item> or Set<Any>
```
All of this is a matter of expression type-checking, something you have to do anyway to initialize the binding. That seems like it should apply to globals and stored properties as well as local variables…except globals and stored properties can be used across files. Which means cross-file uses of a global might have to do type-checking just to figure out its type.
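A minimal sketch of the problem; the helper names are invented, and the two "files" are shown together for brevity:

```swift
// --- FileA.swift ---
func loadDefaults() -> [String: Any] { ["port": 8080] }
let overrides: [String: Any] = ["host": "localhost"]

// The type of `config` is never written down, so the compiler has to
// type-check this whole expression just to learn that it's [String: Any].
let config = loadDefaults().merging(overrides) { _, new in new }

// --- FileB.swift ---
// Any other file that merely *uses* `config` forces that work to happen.
func port() -> Int? { config["port"] as? Int }
```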
“Well, but expression type-checking shouldn’t be slow anyway.” I hear you, but it’s some wasted work regardless. If Swift only allowed you to omit the type for locals, it would never have to type-check expressions from other files (except for inlining in whole-module optimization). But there’s another argument here anyway: library evolution. It’s not surprising that changing the type of a public global or property would break source compatibility, but if the type is inferred, it’s easier to do that by accident. Most of Swift makes you write public things explicitly.
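For instance, with a hypothetical library constant:

```swift
// Version 1 of a hypothetical library:
public let defaultTimeout = 30        // inferred as Int

// Version 2 ships an innocent-looking tweak:
// public let defaultTimeout = 30.0   // silently becomes Double
//
// Any client that wrote `let t: Int = defaultTimeout`, or passed it to
// an API expecting Int, now fails to compile, and nothing in the
// library's source flagged the change as API-breaking.
```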
“Okay but writing `: String` on all my string constants seems redundant.” Again, I hear you. I think it’d be reasonable to allow a few specific exceptions: if you omit the type and the expression is syntactically a string, integer, floating-point, or boolean literal, infer the default type.
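Under that proposed exception, declarations like these (names are mine) could stay inferred, while anything else would need an annotation:

```swift
// Syntactically a literal, so the default type is unambiguous:
let name = "swift"     // String
let count = 42         // Int
let ratio = 0.5        // Double
let enabled = true     // Bool

// Not a literal, so the type would have to be spelled out:
let derived: Substring = name.dropFirst()
```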
There’s a bit of complication here with opaque result types, because you can’t spell those at all. I think it would work out okay, though? Do people actually store those by that type? If we had this requirement already I don’t think it would have blocked opaque result types.
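A sketch of the complication (`makeToken` is a made-up example):

```swift
import Foundation

// An opaque result type: callers know only that they get "some Equatable".
func makeToken() -> some Equatable { UUID() }

// There's no way to annotate this binding with the underlying type, and
// (at least when this was written) you couldn't write `some Equatable`
// on a binding either, so inference is the only option:
let token = makeToken()
```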
Finally, computed properties and defaulted parameters already can’t have inferred types, and I think that was the correct decision. That just leaves globals and stored properties, and I don’t think those are special enough to need inferred types, especially if there were an exception for literals.
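For comparison, here's a sketch of the existing rules the paragraph above refers to (the types themselves are my invention):

```swift
struct User {
    var first = "Ada"       // stored property: type inferred today
    var last = "Lovelace"

    // Computed properties already require the type to be spelled out:
    var fullName: String { "\(first) \(last)" }
}

// Defaulted parameters likewise can't infer their type from the default value:
func greet(name: String = "world") -> String { "Hello, \(name)!" }
```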