TypeScript: Suggestion: option to include undefined in index signatures

Created on 31 Jan 2017  ·  242 Comments  ·  Source: microsoft/TypeScript

Update: for my latest proposal see comment https://github.com/Microsoft/TypeScript/issues/13778#issuecomment-406316164

With strictNullChecks enabled, TypeScript does not include undefined in index signatures (e.g. on an object or array). This is a well known caveat and discussed in several issues, namely https://github.com/Microsoft/TypeScript/issues/9235, https://github.com/Microsoft/TypeScript/issues/13161, https://github.com/Microsoft/TypeScript/issues/12287, and https://github.com/Microsoft/TypeScript/pull/7140#issuecomment-192606629.

Example:

const xs: number[] = [1,2,3];
xs[100] // number, even with strictNullChecks

However, it appears from reading the above issues that many TypeScript users wish this wasn't the case. Granted, if index signatures did include undefined, code would likely require much more guarding, but, for some, this is an acceptable trade-off for increased type safety.

Example of index signatures including undefined:

const xs: number[] = [1,2,3];
xs[100] // number | undefined

I would like to know whether this behaviour could be considered as an extra compiler option on top of strictNullChecks. This way, we are able to satisfy all groups of users: those who want strict null checks with or without undefined in their index signatures.

Committed Suggestion

All 242 comments

With the exception of strictNullChecks, we do not have flags that change the type system behavior. Flags usually enable/disable error reporting.
You can always have a custom version of the library that defines all indexers with | undefined. It should work as expected.

@mhegazy That's an interesting idea. Any guidance on how to override the type signatures for array/object?

There is interface Array<T> in lib.d.ts. I searched by the regexp \[\w+: (string|number)\] to find other indexing signatures as well.

Interesting, so I tried this:

{
    // https://github.com/Microsoft/TypeScript/blob/1f92bacdc81e7ae6706ad8776121e1db986a8b27/lib/lib.d.ts#L1300
    declare global {
        interface Array<T> {
            [n: number]: T | undefined;
        }
    }

    const xs = [1,2,3]
    const x = xs[100]
    x // still number :-(
}

Any ideas?

copy lib.d.ts locally, say lib.strict.d.ts, change the index signature to [n: number]: T | undefined;, include the file in your compilation. you should see the intended effect.

Cool, thanks for that.

The issue with the suggested fix here is it requires forking and maintaining a separate lib file.

I wonder if this feature is demanded enough to warrant some sort of option out of the box.

On a side note, it's interesting that the type signature for the get method on ES6 collections (Map/Set) returns T | undefined when Array/Object index signatures do not.

this is a conscious decision. it would be very annoying for this code to be an error:

var a = [];
for (var i =0; i< a.length; i++) {
    a[i]+=1; // a[i] is possibly undefined
}

and it would be unreasonable to ask every user to use !, or to write

var a = [];
for (var i =0; i< a.length; i++) {
    if (a[i]) {
        a[i]+=1; // a[i] is possibly undefined
    }
}

For map this is not the case generally.

Similarly for your types, you can specify | undefined on all your index signatures, and you will get the expected behavior. But for Array it is not reasonable. You are welcome to fork the library and make whatever changes you need to, but we have no plans to change the declaration in the standard library at this point.

I do not think adding a flag to change the shape of a declaration is something we would do.
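As a minimal sketch of the opt-in described above (the Scores interface is a hypothetical user-defined type, not part of any library):

interface Scores {
  [name: string]: number | undefined;
}
declare const scores: Scores;
const s = scores["alice"]; // number | undefined under strictNullChecks
if (s !== undefined) {
  s.toFixed(2); // narrowed to number
}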

@mhegazy but for arrays with holes a[i] is actually possibly undefined:

let a: number[] = []
a[0] = 0
a[5] =0
for (let i = 0; i < a.length; i++) {
  console.log(a[i])
}

Output is:

0
undefined
undefined
undefined
undefined
0

We remain quite skeptical that anyone would get any benefit from this flag in practice. Maps and maplike things can already opt in to | undefined at their definition sites, and enforcing EULA-like behavior on array access doesn't seem like a win. We'd likely need to substantially improve CFA and type guards to make this palatable.

If someone wants to modify their lib.d.ts and fix all the downstream breaks in their own code and show what the overall diff looks like to show that this has some value proposition, we're open to that data. Alternatively if lots of people are really excited to use postfix ! more but don't yet have ample opportunities to do so, this flag would be an option.

We remain quite skeptical that anyone would get any benefit from this flag in practice. Maps and maplike things can already opt in to | undefined at their definition sites

Isn't one of the goals of TypeScript to allow errors to be caught at "compile" time rather than rely on the user to remember/know to do something specific? This seems to go against that goal; requiring the user to do something in order to avoid crashes. The same could be said for many other features; they're not needed if the developer always does x. The goal of TypeScript is (presumably) to make the job easier and eliminate these things.

I came across this bug because I was enabling strictNullChecks on existing code and I already had a comparison so I got the error. If I'd been writing brand new code I probably wouldn't have realised the issue here (the type system was telling me I was always getting a value) and ended up with a runtime failure. Relying on TS developers to remember (or worse, even know) that they're supposed to be declaring all their maps with | undefined feels like TypeScript is failing to do what people actually want it for.

We remain quite skeptical that anyone would get any benefit from this flag in practice. Maps and maplike things can already opt in to | undefined at their definition sites
Isn't one of the goals of TypeScript to allow errors to be caught at "compile" time rather than rely on the user to remember/know to do something specific?

Actually the goal is:

1) Statically identify constructs that are likely to be errors.

What is being discussed here is the likelihood of an error (low in the opinion of the TypeScript team) versus the everyday usability of the language. Some of the early changes to CFA have been to be less alarmist, or to improve the CFA analysis to more intelligently determine these things.

I think the question from the TypeScript team is, instead of arguing the strict correctness of it, to provide examples of where this sort of strictness, in common usage, would actually identify an error that should be guarded against.

I went into the reasoning a bit more at this comment https://github.com/Microsoft/TypeScript/issues/11238#issuecomment-250562397

Think of the two types of keys in the world: those which you know do have a corresponding property in some object (safe), and those which you don't know to have a corresponding property in some object (dangerous).

You get the first kind of key, a "safe" key, by writing correct code like

for (let i = 0; i < arr.length; i++) {
  // arr[i] is T, not T | undefined
}

or

for (const k of Object.keys(obj)) {
  // obj[k] is T, not T | undefined
}

You get the second kind of key, the "dangerous" kind, from things like user inputs, random JSON files from disk, or some list of keys which may be present but might not be.

So if you have a key of the dangerous kind and index by it, it'd be nice to have | undefined in here. But the proposal isn't "Treat dangerous keys as dangerous", it's "Treat all keys, even safe ones, as dangerous". And once you start treating safe keys as dangerous, life really sucks. You write code like

for (let i = 0; i < arr.length; i++) {
  console.log(arr[i].name);
}

and TypeScript is complaining at you that arr[i] might be undefined even though hey look I just @#%#ing tested for it. Now you get in the habit of writing code like this, and it feels stupid:

for (let i = 0; i < arr.length; i++) {
  // TypeScript makes me use ! with my arrays, sad.
  console.log(arr[i]!.name);
}

Or maybe you write code like this:

function doSomething(myObj: T, yourObj: T) {
  for (const k of Object.keys(myObj)) {
    console.log(yourObj[k].name);
  }
}

and TypeScript says "Hey, that index expression might be | undefined", so you dutifully fix it because you've seen this error 800 times already:

function doSomething(myObj: T, yourObj: T) {
  for (const k of Object.keys(myObj)) {
    console.log(yourObj[k]!.name); // Shut up TypeScript I know what I'm doing
  }
}

But you didn't fix the bug. You meant to write Object.keys(yourObj), or maybe myObj[k]. That's the worst kind of compiler error, because it's not actually helping you in any scenario - it's only applying the same ritual to every kind of expression, without regard for whether or not it was actually more dangerous than any other expression of the same form.

I think of the old "Are you sure you want to delete this file?" dialog. If that dialog appeared every time you tried to delete a file, you would very quickly learn to hit del y when you used to hit del, and your chances of not deleting something important reset to the pre-dialog baseline. If instead the dialog only appeared when you were deleting files that weren't going to the recycling bin, now you have meaningful safety. But we have no idea (nor could we) whether your object keys are safe or not, so showing the "Are you sure you want to index that object?" dialog every time you do it isn't likely to find bugs at a better rate than not showing it at all.

Statically identify constructs that are likely to be errors.

Perhaps this needs to be amended to say "Statically identify constructs that are more likely than others to be errors." :wink:. I'm reminded of when we get bugs that are essentially "I used * when I meant to use /, can you make * a warning?"

I understand, but index out of range is a real and common issue; forcing people to enumerate arrays in a way that prevents it would not be a bad thing.

The fix with ! I actually dislike too - what if someone comes along and makes a change such that the assumption is now invalid? You're back to square one (a potential runtime failure for something that the compiler should catch). There should be safe ways of enumerating arrays that do not rely on either lying about the types or using ! (e.g. can't you do something like array.forEach(i => console.log(i.name))?).

You already narrow types based on code, so in theory couldn't you spot patterns that are safe and narrow the type to remove | undefined in those cases, giving the best of both worlds? I'd argue that if you can't easily convey to the compiler that you're accessing a valid element, then maybe your guarantee is either invalid or could easily be accidentally broken in future.

That said, I only use TS on one project and that will ultimately be migrated to Dart so it's unlikely to make any real difference to me. I'm just sad that the general quality of software is bad and there's an opportunity to help eliminate errors here that is seemingly being ignored for the sake of convenience. I'm sure the type system could be made solid and the common annoyances addressed in a way that doesn't introduce these holes.

Anyway, that's just my 2 cents.. I don't want to drag this out - I'm sure you understand where we're coming from and you're far better placed to make decisions on this than me :-)

I think there are a few things to consider. There are a lot of patterns for iterating over arrays in common use that account for the number of elements. While it is possible to just randomly access indexes on arrays, in the wild that is a very uncommon pattern and is not likely to be a common source of errors. While there are modern ways to iterate, the most common would be something like:

for (let i = 0; i < a.length; i++) {
  const value = a[i];
}

If you assume sparse arrays are uncommon (they are), it is of little help to have value be | undefined. If there is a common pattern, in the wild, where this is risky (and likely an error) then I think the TypeScript team would listen and consider this, but for the patterns that are in general use, having to guard again and again against every value of an index access being undefined is clearly something that affects productivity and, as pointed out, can be opted into if you are in a situation where it is potentially useful.

I think there has been conversation before about improving CFA so that there is a way to express the co-dependency of values (e.g. Array.prototype.length relates to the index value) so that things like index out of bounds could be statically analysed. Obviously that is a significant piece of work, fraught with all sorts of edge cases and considerations I wouldn't like to fathom (though it is likely Anders wakes up in a cold sweat over some things like this).

So it becomes a trade-off... Without CFA improvements, you complicate 90% of code with red herrings to catch the potentially bad 10%. The alternative is investing in major CFA improvements, which might be fraught with their own stability issues and consequences, again only to find what would be unsafe code.

There is only so much TypeScript can do to save us from ourselves.

All this focus is on arrays and I agree it's less likely to be an issue there, but most of the original issues raised (like mine) were about maps where I don't think the common case is always-existing keys at all?

All this focus is on arrays and I agree it's less likely to be an issue there, but most of the original issues raised (like mine) were about maps where I don't think the common case is always-existing keys at all?

If this is your type, add | undefined to the index signature. It is already an error to index into a type with no index signature under --noImplicitAny.
ES6 Map is already defined with get as get(key: K): V | undefined;.

i rewrote all definitions of Arrays and Maps to make index signatures return | undefined, and i have never regretted it since; i found a few bugs, and it doesn't cause any discomfort because i work with arrays indirectly via a handmade lib that keeps the checks for undefined or ! inside of it

it would be great if TypeScript could control-flow the checks like C# does (to eliminate index range checks and save some processor time), for example:

declare var values: number[];
for (let index = 0, length = values.length; index < length; index++) {
   const value = values[index]; // always defined, because index is within the array range and only controlled by it
}

(to those who use sparse arrays - kill yourself with hot burning fire)

as for Object.keys, it would take a special type, say allkeysof T, to let the control flow analysis do safe narrowings

I think this would be a good option to have, because right now we are essentially lying about the type of the indexing operation, and it can be easy to forget to add | undefined to my object types. I think adding ! in the cases where we know we want to ignore undefineds would be a nice way to deal with indexing operations when this option is enabled.

There are (at least) two other problems with putting |undefined in your object type definitions:

  • it means you can assign undefined into these objects, which is definitely not intended
  • other operations, like Object.values (or _.values) will require you to handle undefined in the results
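Both problems can be seen in a short sketch (the Dict type here is hypothetical; types are as reported with strictNullChecks):

type Dict = { [key: string]: string | undefined };
const d: Dict = {};
d["a"] = undefined;              // compiles, even though we never intend to store undefined
const values = Object.values(d); // (string | undefined)[], so every consumer must re-check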

tslint reports a false-positive warning about a constant condition, because TypeScript returns the wrong type information (lacking | undefined).
https://github.com/palantir/tslint/issues/2944

One of the regularly skipped errors with the absence of | undefined in the array indexing is this pattern when used in place of find:

const array = [ 1, 2, 3 ];
const firstFour = array.filter((x) => (x === 4))[0];
// if there is no `4` in the `array`,
// if there is no `4` in the `array`,
// `firstFour` will be `undefined`, but TypeScript thinks `number` because of the indexer signature.

const array = [ 1, 2, 3 ];
const firstFour = array.find((x) => (x === 4));
// `firstFour` will be correctly typed as `number | undefined` because of the `find` signature.

I would definitely use this flag. Yes, old for loops will be annoying to work with, but we have the ! operator to tell the compiler when we know it's defined:

for (let i = 0; i < arr.length; i++) {
  foo(arr[i]!)
}

Also, this problem is not a problem with the newer, way better for of loops and there is even a TSLint rule prefer-for-of that tells you to not use old-style for loops anymore.

Currently I feel like the type system is inconsistent for the developer. array.pop() requires an if check or a ! assertion, but accessing via [array.length - 1] does not. ES6 map.get() requires an if check or a ! assertion, but an object hash does not. @sompylasar's example is also good.
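A hedged illustration of the inconsistencies listed above (types as the compiler reports them with strictNullChecks):

declare const array: number[];
const a = array.pop();              // number | undefined - must be checked
const b = array[array.length - 1];  // number - no check required

declare const map: Map<string, number>;
declare const hash: { [key: string]: number };
const c = map.get("x");             // number | undefined
const d = hash["x"];                // number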

Another example is destructuring:

const specifier = 'Microsoft/TypeScript'
const [repo, revision] = specifier.split('@') // types of repo and revision are string
console.log('Repo: ' + repo)
console.log('Short rev: ' + revision.slice(0, 7)) // Error: Cannot call function 'slice' on undefined

I would have preferred if the compiler forced me to do this:

const specifier = 'Microsoft/TypeScript'
const [repo, revision] = specifier.split('@') // types of repo and revision are string | undefined
console.log('Repo: ', repo || 'no repo')
console.log('Short rev:', revision ? revision.slice(0, 7) : 'no revision')

These are actual bugs I've seen that could have been prevented by the compiler.

Imo this shouldn't belong in the typings files, but should rather be a type system mechanic - when accessing anything with an index signature, it can be undefined. If your logic ensured that it isn't, just use !. Otherwise add an if and you're good.

I think a lot of people would prefer the compiler to be strict with some needed assertions than to be loose with uncaught bugs.
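A minimal sketch of the two escape hatches mentioned above, assuming an index signature that already carries | undefined:

declare const dict: { [key: string]: number | undefined };

const known = dict["known-key"]!;  // assert: my logic ensured this key exists
const maybe = dict["user-input"];
if (maybe !== undefined) {
  maybe.toFixed(2);                // narrowed by the if
}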

I'd really like to see this flag added. In my company's code base, array random access is the rare exception and for loops are code smells that we'd usually want to rewrite with higher-order functions.

@pelotom what is your concern then (since it seems you mostly got yourself out of trouble)?

@aleksey-bykov mostly object index signatures, which occur extensively in third-party libraries. I would like accessing a property on { [k: string]: A } to warn me that the result is possibly undefined. I only mentioned array indexing because it was brought up as a case for why the flag would be too annoying to work with.

you know you can rewrite them exactly the way you want? (given a bit of extra work)

Yes, I could rewrite everyone's typings for them, or I could switch on a compiler flag 😜

keep playing captain O...: you can rewrite your lib.d.ts today and be a happy owner of more sound codebase or you can wait for the flag for the next N years

@aleksey-bykov how can it be done by rewriting lib.d.ts?

declare type Keyed<T> = { [key: string]: T | undefined; }

then in the Array definition in lib.es2015.core.d.ts, replace

[n: number]: T;

with

[n: number]: T | undefined;

@aleksey-bykov maybe you missed the part where I said I don't care about arrays. I care about where third party libraries have declared something to be of type { [k: string]: T }, and I want accessing such an object to return something possibly undefined. There's no way to accomplish that by simply editing lib.d.ts; it requires changing the signatures of the library in question.

do you have control over 3rd party definition files? if so you can fix them

And now we're back to

Yes, I could rewrite everyone's typings for them, or I could switch on a compiler flag 😜

Time is a flat circle.

don't be silly, you don't use "everyone's typings" do you? it's literally a day of work max for a typical project, been there done it

Yes, I have to edit others' typings all the time and I'd like to do it less.

and you will in N years, maybe, for now you can suffer or man up

Thanks for your incredibly constructive input 👍

constructive input for this is as follows, this issue needs to be closed, because:
a. either the decision on whether [x] can be undefined or not is left to the developers by

  • letting them keep it all in their heads as they always did before
  • or by altering lib.d.ts and 3rd-party.d.ts as was suggested

b. or it takes special syntax / types / flow analysis / N years to mitigate something that can easily be done by hand in (a)

The issue is a proposal for (b), except no new syntax is being proposed, it's just a compiler flag.

What it comes down to is that the type { [x: string]: {} } is almost always a lie; barring the use of Proxy, there's no object which can have an infinite number of properties, much less every possible string. The proposal is to have a compiler flag which recognizes this. It may be that it's too hard to implement this for what is gained; I'll leave that call to the implementors.

the point is that neither

  • T | undefined
  • nor T

is right for the general case

in order to make it right for the general case you need to encode the information about the presence of values into the types of their containers, which calls for a dependent type system ... which by itself isn't a bad thing to have :) but might be as complex as the entire current TypeScript type system built to this day, for the sake of ... saving you some edits?

T | undefined is correct for the general case, for reasons I just gave. Gonna ignore your nonsensical ramblings about dependent types, have a nice day.

you can ignore me as much as you want but T | undefined is an overshoot for

declare var items: number[];
for (var index = 0; index < items.length; index ++) {
   void items[index];
}

I'd rather have T | undefined there by default and tell the compiler that index is a numeric index range of items and thus doesn't go out of bounds when applied to items; in simple cases, such as a set of frequently used for/while loop shapes, the compiler could infer that automatically; in complex cases, sorry, there can be undefineds. And yes, value-based types would be a good fit here; literal string types are so useful, why not have literal boolean and number and range/set-of-ranges types? As far as TypeScript goes, it tries to cover everything that can be expressed with JavaScript (in contrast to, for example, Elm, which limits that).

it's literally a day of work max for a typical project, been there done it

@aleksey-bykov, curious what was your experience after that change? how often do you have to use !? and how often do you find the compiler flagging actual bugs?

@mhegazy honestly i didn't notice much difference moving from T to T | undefined, neither did i catch any bugs, i guess my problem is that i work with arrays via utility functions which keep ! in them, so literally there was no effect for the outside code:

https://github.com/aleksey-bykov/basic/blob/master/array.ts

In which lib file can I find the index type definition for objects? I have located and updated Array from [n: number]: T to [n: number]: T | undefined. Now I would like to do the same thing for objects.

there is no standard interface (like Array for arrays) for objects with the index signature, you need to look for exact definitions per each case in your code and fix them

you need to look for exact definitions per each case in your code and fix them

How about a direct key lookup? E.g.

const xs = { foo: 'bar' }
xs['foo']

Is there any way to enforce T | undefined instead of T here? Currently I use these helpers in my codebase everywhere, as type safe alternatives to index lookups on arrays and objects:

// TS doesn't return the correct type for array and object index signatures. It returns `T` instead
// of `T | undefined`. These helpers give us the correct type.
// https://github.com/Microsoft/TypeScript/issues/13778
export const getIndex = function<X> (index: number, xs: X[]): X | undefined {
  return xs[index];
};
export const getKeyInMap = function<X> (key: string, xs: { [key: string]: X }): X | undefined {
  return xs[key];
};

@mhegazy As I write this, I am fixing a bug in production on https://unsplash.com that could have been caught with stricter index signature types.

i see, consider a mapped type:

const xs = { foo: 'bar' };
type EachUndefined<T> = { [P in keyof T]: T[P] | undefined; }
const xu : EachUndefined<typeof xs> = xs;
xu.foo; // <-- string | undefined

If a flag like --strictArrayIndex is not an option because flags are not designed to change the lib.d.ts files, maybe you could release strict versions of the lib.d.ts files, like "lib": ['strict-es6']?

It could contain multiple improvements, not just strict array index. For example, Object.keys:

interface ObjectConstructor {
    // ...
    keys(o: {}): string[];
}

Could be:

interface ObjectConstructor {
    // ...
    keys<T>(o: T): (keyof T)[];
}

Update from today's SBS: We yelled at each other for 30 minutes and nothing happened. 🤷‍♂️

@RyanCavanaugh What's an SBS, out of curiosity?

@radix "Suggestion Backlog Slog"

that's intriguing, because the solution is obvious:
both T and T | undefined are wrong; the only right way is to make the index variable aware of the capacity of its container, either by picking from a set or by enclosing it in a known numeric range

@RyanCavanaugh i've been thinking about this, it looks like the following trick would cover 87% of all cases:

  • values[index] gives T if index declared in for (HERE; ...)
  • values[somethingElse] gives T | undefined for all variables declared outside of for

@aleksey-bykov we discussed something even smarter - that there could be an actual type guard mechanism for "arr[index] was tested by index < arr.length". But that wouldn't help the case where you pop'd in the middle of a loop, or passed your array to a function that removed elements from it. It really doesn't seem like people are looking for a mechanism that prevents OOB errors 82.9% of the time -- after all, it's already the case that approximately that fraction of array-indexing code is correct anyway. Adding ceremony to correct code while not catching bugs in the incorrect cases is a bad outcome.

not to imply that Aleksey would ever mutate an array

I ported an app from Elm to Typescript recently and indexing operations being incorrectly typed is probably one of the biggest sources of bugs I've run into, with all the strictest settings of TS enabled (also stuff like this being unbound).

  • you cant track mutations to the container
  • you cant track index manipulations

with this said, there is little you can guarantee; and if so, why would you even try, when a typical use case is dumb iteration through all elements within < array.length

like i said usually

  • if there is an error, it likely originates from somewhere in the clauses of the for statement: initialization, increment, stop condition, and thus is not something you can verify, because it requires Turing-complete typechecking
  • outside of for clauses there is usually no place for error

so as long as index is declared somehow (nothing we can say about it with confidence) it is reasonably fair to believe that the index is correct within the loop body

But that wouldn't help the case where you pop'd in the middle a loop, or passed your array to a function that removed elements from it

This seems like a really weak argument against this. Those cases are already broken - using them as an excuse not to fix other cases makes no sense. Nobody here is asking for either of those cases to be handled correctly nor for it to be 100% correct. We just want accurate types in a really common case. There are many cases in TypeScript that aren't handled perfectly; if you're doing something weird, types might be wrong. In this case we're not doing anything weird and the types are wrong.

Adding ceremony to correct code

I'm curious to see a code example of where this is "adding ceremony to correct code". As I understand it, a basic for loop over an array is easy to detect/narrow. What's the real world code that isn't a simple for loop where this becomes inconvenient which is not potentially a bug? (either now, or could be from a trivial change in the future). I'm not saying there isn't any; I just can't picture it and haven't seen any examples (I've seen plenty of examples using for loops, but unless you're saying these can't be narrowed they seem irrelevant).

while not catching bugs

There have been examples of both working code that fails to compile because of this and code that throws at runtime because the types mislead the developer. Saying there is zero value in supporting this is nonsense.

Why can't we just keep it simple and not add _any_ magic behaviour around old-style for loops at all. You can always use ! to make things work. If your codebase is full of old-style for loops, don't use the flag. All modern codebases I've worked with use either forEach or for of to iterate arrays and those codebases would benefit from the additional type safety.

we're not doing anything weird and the types are wrong.

This paragraph, to me, reads like a good reason to not have this feature. Accessing an array out-of-bounds is a weird thing to do; the types are only "wrong" if you're doing an uncommon thing (OOBing). The vast majority of code reading from an array does so in-bounds; it would be "wrong" to include undefined in those cases by this argument.

... working code that fails to compile

I'm not aware of any - can you point to them specifically?

Why can't we just keep it simple and not add any magic behaviour around old-style for loops at all. You can always use ! to make things work.

What's the useful difference between this and a TSLint rule that says "All array access expressions must have a trailing !" ?

@RyanCavanaugh my assumption is that the TSLint rule would not be able to narrow types or use control-flow type analysis (e.g. wrapping the access in an if, throwing an exception, returning or continuing if it is not set, etc). It is also not just about array access expressions, it is also about destructuring. What would the implementation look like for the example in https://github.com/Microsoft/TypeScript/issues/13778#issuecomment-336265143? To enforce the !, it would essentially have to create a table to track the types of these variables as possibly undefined. Which to me sounds like something that the type checker should do, not a linter.

@RyanCavanaugh

This paragraph, to me, reads like a good reason to not have this feature. Accessing an array out-of-bounds is a weird thing to do; the types are only "wrong" if you're doing an uncommon thing (OOBing).

... working code that fails to compile

I'm not aware of any - can you point to them specifically?

In my case it wasn't an array, but it was closed as a dupe so I assume this issue is supposed to be covering these too. See #11186 for my original issue. I was parsing a file into a map and then checking the values against undefined. IIRC I got an error on the comparison saying they couldn't be undefined, even though they could be (and were).

It's always allowed to compare something to undefined

which is a shame

const canWeDoIt = null === undefined; // yes we can!

It's always allowed to compare something to undefined

It's been so long, I'm afraid I don't remember exactly what the error was. I definitely had an error for code that was working fine (without strictNullChecks) and it led me to the testing that resulted in the case above.

If I get some time, I'll see if I can figure it out exactly what it was again. It was definitely related to this typing.

Most of this discussion has been focused on arrays, but, in my experience, object access is where this feature would be most helpful. For example, this (simplified) example from my codebase looks reasonable and compiles fine, but is definitely a bug:

export type Chooser = (context?: Context) => number | string;
export interface Choices {
    [choice: number]: Struct;
    [choice: string]: Struct;
}

export const Branch = (chooser: Chooser, choices: Choices, context?: Context): Struct => {
    return choices[chooser(context)];  // Could be undefined
}

Regarding objects and simply changing the signature to include | undefined, @types/node does that for process.env:

    export interface ProcessEnv {
        [key: string]: string | undefined;
    }

but it doesn't allow narrowing the type at all:

process.env.SOME_CONFIG && JSON.parse(process.env.SOME_CONFIG)

gives

[ts]
Argument of type 'string | undefined' is not assignable to parameter of type 'string'.
  Type 'undefined' is not assignable to type 'string'.

@felixfbecker

I believe that is related to this bug: https://github.com/Microsoft/TypeScript/issues/17960

You can workaround it by assigning the env var to a variable, and then guarding that:

const foo = process.env.SOME_CONFIG
foo && JSON.parse(foo);

Adding the | undefined in some node typings was done to allow augmentation of these interfaces with frequently used properties for convenience, to get code completion, e.g.

interface ProcessEnv {
  foo?: string;
  bar?: string;
}

Without the | undefined in the index signature, TSC complains with Property 'foo' of type 'string | undefined' is not assignable to string index type 'string'.

I know that this has a cost if you have to work with types having the extra | undefined and ones which don't have it.

Would be really nice if we could get rid of this.

I'm trying to collect my thoughts on this issue.

Objects are a mixed bag in JavaScript. They can be used for two purposes:

  • dictionaries aka maps, where the keys are unknown
  • records, where the keys are known

For objects used as records, index signatures do not need to be used. Instead, known keys are defined on the interface type. Valid key lookups return the corresponding type; invalid key lookups fall back to the index signature. If there is no index signature defined (which there shouldn't be for objects used as records), and noImplicitAny is enabled, this will error as desired.

For objects used as dictionaries (aka maps) and arrays, index signatures are used, and we can choose to include | undefined in the value type. For example, { [key: string]: string | undefined }. All key lookups are valid (because keys are not known at compile time), and all keys return the same type (in this example, string | undefined).

Seeing as index signatures should only be used for the dictionary object pattern and arrays, it is desirable for TypeScript to enforce | undefined in the index signature value type: if the keys are not known, key lookups may return undefined.
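A sketch of the record vs. dictionary distinction described above (Config and Cache are hypothetical types):

interface Config {   // record: keys are known, no index signature
  host: string;
  port: number;
}
type Cache = { [key: string]: string | undefined }; // dictionary: keys are unknown

declare const config: Config;
declare const cache: Cache;
config.host;        // string - a valid key lookup
// config["hots"];  // error under noImplicitAny: no index signature to fall back to
cache["anything"];  // string | undefined - every lookup is allowed, but may miss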

There are good examples of bugs that may appear invisible without this, such as Array.prototype.find returning undefined, or key lookups such as array[0] returning undefined. (Destructuring is just sugar syntax for key lookups.) It is possible to write functions like getKey to correct the return type, but we have to rely on discipline to enforce use of these functions across a codebase.

If I understand correctly, the issue then becomes about reflection over dictionary objects and arrays, such that when mapping over the keys of a dictionary object or array, key lookups are known to be valid i.e. will not return undefined. In this case, it would be undesirable for value types to include undefined. It may be possible to use control flow analysis to fix this.

Is this the outstanding issue, or do I misunderstand?

Which use cases and problems haven't I mentioned?

There are (at least) two other problems with putting |undefined in your object type definitions:

  • it means you can assign undefined into these objects, which is definitely not intended
  • other operations, like Object.values (or _.values) will require you to handle undefined in the results

I think this is a very important point.

At the moment, I am experimenting with the following approach:

Define const safelyAccessProperty = <T, K extends keyof T>(object: T, key: K): T[K] | undefined => object[key];

Then access properties like safelyAccessProperty(myObject, myKey), instead of myObject[myKey].
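A hypothetical usage sketch of safelyAccessProperty, assuming a plain object used as a map:

interface User { name: string; }
declare const usersById: { [id: string]: User };

const user = safelyAccessProperty(usersById, "abc42"); // User | undefined
if (user) {
  console.log(user.name);
}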

@plul Good catch. The discussion is currently focused on read operations, but the indexer type definition is in fact two-fold, and adding | undefined would allow writing undefined values.

The safelyAccessProperty function you are experimenting with (mentioned above as getKey by @OliverJAsh) requires discipline and/or a linter rule to forbid indexing operations on all arrays and objects.

This can be made scalable if the function is provided on all array and object instances (every type that provides indexer operations), like in C++, where std::vector has .at(), which throws an exception at runtime for OOB access, and an unchecked [] operator which in the best case crashes with a SEGFAULT on OOB access and in the worst case corrupts memory.

I think the OOB access problem is not solvable in TypeScript/JavaScript at the type definition level alone, and requires language support to restrict potentially dangerous indexer operations if this strictness feature is enabled.

The two-fold nature of the indexer could be modeled as a property with get and set operations as functions, but that would be a breaking change for all existing indexer type definitions.

One idea that seems like it could be promising: what if you could use ?: when defining an index signature to indicate that you expect values to be missing sometimes under normal use? It would act like | undefined mentioned above, but without the awkward downsides. It would need to disallow explicit undefined values, which I guess is a difference from usual ?:.

It would look like this:

type NewWay = {[key: string]?: string};
const n: NewWay = {};

// Has type string | undefined
n['foo']

// Has type Array<string>
Object.values(n)

// Doesn't work
n['foo'] = undefined;

// Works
delete n['foo'];

Compared to the previous approach of | undefined:

type OldWay = {[key: string]: string | undefined};
const o: OldWay = {};

// Has type string | undefined
o['foo']

// Has type Array<string | undefined>
Object.values(o)

// Works
o['foo'] = undefined;

// Works
delete o['foo'];

I came here from rejecting adding | undefined in the DT PR above, as it would break all the existing users of that API - could this be better looked at as allowing the user to pick how fussy they want to be, rather than the library?


I'll note optional properties add the | undefined as well, and that has bitten me a few times - essentially TS doesn't distinguish between a missing property and a property set to undefined. I would just like { foo?: T, bar?: T } to be treated the same as { [name: 'foo' | 'bar']: T }, whichever way that goes (see also the process.env comments above)
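A small sketch of the missing-vs-undefined conflation mentioned above (Opts is hypothetical):

interface Opts { foo?: string }
const a: Opts = {};                  // foo is missing
const b: Opts = { foo: undefined };  // foo is present but explicitly undefined - also allowed
// a.foo and b.foo are both typed string | undefined, yet 'foo' in a and 'foo' in b differ at runtime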


Is TS against breaking the symmetry here on number and string indexers?

foo[bar] && foo[bar].baz() is a very common JS pattern; it feels clumsy when it's not supported by TS (in the sense of reminding you that you need the check if you don't add | undefined, and warning when it's obviously not required if you do).


Regarding mutating arrays during iteration breaking the guard expression guarantee, that's possible with the other guards, too:

class Foo {
    foo: string | number = 123

    bar() {
        this.foo = 'bar'
    }

    broken() {
        if (typeof this.foo === 'number') {
            this.bar();
            this.foo.toLowerCase(); // right, but type error
            this.foo.toExponential(); // wrong, but typechecks
        }
    }
}

but I guess that's a lot less likely in real code than old loops mutating the iteratee.

It's clear that there is demand for this feature. I really hope the TS team will find some solution. Not by just adding | undefined to the indexer, because that has its own issues (mentioned already), but in a more "clever" way (reading returns T | undefined, writing requires T, good compiler checking for for loops, etc.; good propositions were also mentioned already.)

We are fine with runtime errors when we mutate and work with arrays in a non-trivial way that is difficult for the compiler to verify. We just want error checking for most cases and are fine with using ! sometimes.

Having said that, if this stricter handling of arrays were implemented, then with #24897 it would now be possible to implement nice type narrowing when checking array length against a constant. We could just narrow the array to a tuple with a rest element.

let arr!: string[];
if (arr.length == 3) {
  //arr is of type [string, string, string]
}

if (arr.length > 3) {
  //arr is of type [string, string, string, string, ...string[]]
}

if (arr.length) {
  //arr is of type [string, ...string[]]
}

if (arr.length < 3) {
  //arr is of type [string?, string?, string?]
  if (arr.length > 0) {
    //arr is of type [string, string?, string?]
  }
}

It would be useful when you index by constant or destructure an array.

let someNumber = 55;
if (arr.length) {
  let el1 = arr[0]; //string
  let el2 = arr[1]; //string | undefined
  let el3 = arr[someNumber]; //string | undefined
}

if(arr.length >= 3){
    let [el1, el2, el3, el4] = arr;
    //el1, el2, el3 are string
    // el4 is string | undefined    
}

if (arr.length == 2){
    let [el1, el2, el3] = arr; //compiler error: "Tuple type '[string, string]' with length '2' cannot be assigned to tuple with length '3'.",
}

The other question is what we would do with big numbers like:

if(arr.length >= 99999){
    // arr is [string, string, ... , string, ...string[]]
}

We can't show the type of this huge tuple in IDE or compiler messages.

I guess we could have some syntax to represent "tuple of a certain length with the same type for all items". So for example the tuple of 1000 strings would be string[1000], and the type of the narrowed array from the above example could be [...string[99999], ...string[]].

The other concern is whether the compiler infrastructure can support such big tuples now, and if not, how hard it would be to do.

Objects

I always want an index type of [key: string (or number, symbol)]: V | undefined, only sometimes I forget about the undefined case. Whenever a dev has to explicitly tell the compiler "trust me, this is the type of that thing for real", you know it's unsafe.
It makes very little sense to type Map.get properly (strictly) but somehow plain objects get a free pass.
Still, this is easily fixable in user land, so it's not too bad. I don't have a solution, at any rate.

Arrays

Perhaps I'm missing something but it seems the argument that "you almost never access an Array in an unsafe way" can go both ways, especially with a new compiler flag.

I tend to think more and more people follow these two best practices:

  • Use functional native methods or libraries to iterate or transform Arrays. No bracket access here.
  • Don't mutate Arrays in place

With this in mind, the only remaining and rare cases where you need low level bracket access logic would really benefit from type safety.

This one seems like a no brainer and I don't think copy-pasting the entire lib.d.ts locally is an acceptable workaround.

When we explicitly index into an array/object, e.g. to get the first item of an array (array[0]), we want the result to include undefined.

This is possible by adding undefined to the index signature.

However, if I understand correctly, the issue with including undefined in the index signature is that, when mapping over the array or object, the values will include undefined even though they're known not to be undefined (otherwise we wouldn't be mapping over them).

Index types/signatures are used for both index lookups (e.g. array[0]) and mapping (e.g. for loops and Array.prototype.map), but we require different types for each of these cases.

This is not an issue with Map because Map.get is a function, and therefore its return type can be separate from the inner value type, unlike indexing into an array/object which is not a function, and therefore uses the index signature directly.

So, the question becomes: how can we satisfy both cases?

// Manually adding `undefined` to the index signature
declare const array: (number | undefined)[];

const first = array[0]; // number | undefined, as desired :-)
type IndexValue = typeof array[0]; // number | undefined, as desired! :-)

array.map(x => {
  x // number | undefined, not desired! :-(
})

Proposal

A compiler option which treats index lookups (e.g. array[0]) similarly to how Map.get is typed, by including undefined in the index value type (not the index signature itself). The actual index signature itself would not include undefined, so that mapping functions are not affected.

Example:

declare const array: number[];

// The compiler option would include `undefined` in the index value type
const first = array[0]; // number | undefined, as desired :-)
type IndexValue = typeof array[0]; // number | undefined, as desired :-)

array.map(x => {
  x // number, as desired :-)
})

This however won't solve the case of looping over array/objects using a for loop, as that technique uses index lookups.

for (let i = 0; i < array.length; i++) {
  const x = array[i];
  x; // number | undefined, not desired! :-(
}

For me and I suspect many others, this is acceptable because for loops are not used, instead preferring to use functional style, e.g. Array.prototype.map. If we did have to use them, we would be happy using the compiler option suggested here along with non-null assertion operators.

for (let i = 0; i < array.length; i++) {
  const x = array[i]!;
  x; // number, as desired :-)
}

We could also provide a way to opt-in or opt-out, e.g. with some syntax to decorate the index signature (please forgive the ambiguous syntax I came up with for the example). This syntax would just be a way of signalling which behaviour we want for index lookups.

Opt-out (compiler option enables by default, opt-out where needed):

declare const array: { [index: number]!!: string };

declare const dictionary: { [index: string]!!: string }

Opt-in (no compiler option, just opt-in where needed):

declare const array: { [index: number]??: string };

declare const dictionary: { [index: string]??: string }

I haven't read up on this issue or the pros and cons, various proposals, etc. (just found it in Google after being repeatedly surprised that array/object access isn't handled consistently with strict null checks), but I have a related suggestion: an option to make array type inference as strict as possible unless specifically overridden.

For example:

const balls = [1, 2, 3];

By default, balls would be treated as [number, number, number]. This could be overridden by writing:

const balls: number[] = [1, 2, 3];

Further, tuple element access would be handled consistently with strict null checks. It's surprising to me that in the following example n is currently inferred as number even with strict null checks enabled.

const balls: [number, number, number] = [1, 2, 3];
const n = balls[100];

I would also expect array mutation methods such as .push to not exist in the tuple type definition, since such methods change the run-time type to be inconsistent with the compile-time type.

Nice! Well that's interesting timing. I'd had the release announcement open, but hadn't read through it yet; only came here after running into a situation where I needed to do some weird casting ((<(T|undefined)[]> arr).slice(-1)[0]) to make TypeScript (2.9) do what I wanted it to do.

Just wanted to bring things back to this suggestion: https://github.com/Microsoft/TypeScript/issues/13778#issuecomment-383072468

This would fix the problems with indexed types that I've experienced. It'd be great if it was the default, but I understand that that would break plenty of things in the real world.

@mhegazy @RyanCavanaugh Any thoughts on my proposal? https://github.com/Microsoft/TypeScript/issues/13778#issuecomment-406316164

To me, there's a simple solution. Enable a flag which checks for this. Then, instead of:

const array = [1, 2, 3];
for (var i =0; i< array.length; i++) {
    array[i]+=1; // array[i] is possibly undefined
}

You do:

const array = [1, 2, 3];
array.forEach((value, i) => array[i] = value + 1);

Then, when doing random index accesses, you are required to check if the result is undefined, but not while iterating an enumerated collection.

I still think this warrants having an open issue.

As a programmer who is new to TypeScript, I found the situation around indexing objects in strict mode to be unintuitive. I would have expected the result of a lookup to be T | undefined, similarly to Map.get. Anyway, I just ran into this recently, and opened an issue in a library:

screepers/typed-screeps#107

I'm probably going to close it now, because it seems there's no good solution. I think I'm going to try "opting-in" to T | undefined by using a little utility function:

export function lookup<T>(map: {[index: string]: T}, index: string): T|undefined {
  return map[index];
}

There were some suggestions here to explicitly set T | undefined as the object index operation return type, however that does not seem to be working:

const obj: {[key: string]: number | undefined} = {
  "a": 1,
  "b": 2,
};

const test = obj["c"]; // const test: number

This is VSCode version 1.31.1

@yawaramin Make sure you have strictNullChecks enabled in your tsconfig.json (which is also enabled by the strict flag)

If your use case requires arbitrary indexing on arrays of unknown length, I think it is worth adding the undefined explicitly (if only to "document" that unsafeness).

const words = ... // some string[] that could be empty
const x = words[0] as string | undefined
console.log(x.length) // TS error

Tuples work for small arrays of known lengths. Maybe we could have something like string[5] as a shorthand for [string, string, string, string, string] ?

Very much in favor of this as an option. It's a noticeable hole in the type system, particularly when the strictNullChecks flag is enabled. Plain objects are used as maps all the time in JS, so TypeScript should support that use case.

Ran into this with array destructuring of a function parameter:

function foo([first]: string[]) { /* ... */ }

Here I'd expect first to be of type string | undefined but it's just string, unless I do

function foo([first]: (string | undefined)[]) { /* ... */ }

I don't think we have a single for loop in our code base, so I'd happily just add a flag to my tsconfig's compiler options (could be named strictIndexSignatures) to toggle this behavior for our project.

This is how I've been working around this issue: https://github.com/danielnixon/total-functions/

Good workaround, I really hope this gets sherlocked by the TypeScript team.

When a programmer makes assumptions in written code, and the compiler cannot deduce that it is safe, this should result in a compiler error unless silenced, IMHO.

This behavior would also be very helpful in combination with the new optional chaining operator.
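A brief sketch of how that combination could look, assuming an index signature that already includes | undefined (the names here are hypothetical):

declare const adsById: { [id: string]: { mainImageUrl: string } | undefined };

const url = adsById["42"]?.mainImageUrl; // string | undefined, no assertion needed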

I ran into an issue with using | undefined with map today while using Object.entries().

I have an Index type that's described fairly well by {[key: string]: string[]}, with the obvious caveat that not every possible string key is represented in the Index. However, I wrote a bug that TS didn't catch when trying to consume a value looked up from the Index; I didn't handle the undefined case.

So I changed it to {[key: string]: string[] | undefined} as recommended, but now this leads to issues with my use of Object.entries(). TypeScript now assumes (reasonably, based on the type specification) that the Index may have keys that have a value of undefined specified, and so assumes the result of calling Object.entries() on it may contain undefined values.

However, I know this to be impossible; the only time I should get an undefined result is looking up a key that doesn't exist, and that would not be listed when using Object.entries(). So to make TypeScript happy I either have to write code that has no real reason to exist, or override the warnings, which I'd prefer not to do.
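A minimal sketch of the tension described above, recreating the Index type as an assumption:

type Index = { [key: string]: string[] | undefined };
declare const index: Index;

const hit = index["missing"]; // string[] | undefined - the useful part
for (const [key, values] of Object.entries(index)) {
  console.log(key, values.join(", ")); // error: values is possibly undefined,
}                                      // even though entries never yields missing keys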

@RyanCavanaugh, I assume your original reply to this is still the TS team's current position? Both as a bump to that (because nobody reads the whole thread) and to check whether a couple more years of experience with more powerful JS collection primitives, and the several other strictness options introduced since, have changed anything.

(The examples there are still somewhat unconvincing to me, but this thread's already made all the arguments, another comment isn't going to change anything)

If someone wants to modify their lib.d.ts and fix all the downstream breaks in their own code and show what the overall diff looks like to show that this has some value proposition, we're open to that data.

@RyanCavanaugh Here are some of the cases from my code base where some runtime crashes are slumbering. Note that I already had cases where I had crashes in production because of this and needed to release a hotfix.

src={savedAdsItem.advertImageList.advertImage[0].mainImageUrl || undefined}
return advert.advertImageList.advertImage.length ? advert.advertImageList.advertImage[0].mainImageUrl : ''
birthYear: profileData.birthYear !== null ? profileData.birthYear : allowedYears[0].value,
upsellingsList.upsellingProducts[0].upsellingProducts[0].selected = true
const latitude = parseFloat(coordinates.split(',')[0])
const advert = Object.values(actionToConfirm.selectedItems)[0]
await dispatch(deactivateMyAd(advert))

In this case it would be annoying, as ArticleIDs extends articleNames[] would include undefined in the resulting values, while it should just allow completely defined subsets. Easily fixable by using ReadonlyArray<articleNames> instead of articleNames[].

export enum articleNames {
    WEB_AGB = 'web_agb',
    TERMS_OF_USE = 'web_terms-of-use',
}
export const getMultipleArticles = async <ArticleIDs extends articleNames[], ArticleMap = { [key in ArticleIDs[number]]: CmsArticle }>(ids: ArticleIDs): Promise<ArticleMap> => {...}

All in all I'd really like to have this extra type safety for preventing possible runtime crashes.

I think of the old "Are you sure you want to delete this file?" dialog. If that dialog appeared every time you tried to delete a file, you would very quickly learn to hit del y when you used to hit del, and your chances of not deleting something important reset to the pre-dialog baseline. If instead the dialog only appeared when the files weren't going to the recycling bin, now you have meaningful safety. But we have no idea (nor _could_ we) whether your object keys are safe or not, so showing the "Are you sure you want to index that object?" dialog every time you do it isn't likely to find bugs at a better rate than not showing it at all.

I do agree with the delete-file dialog analogy, but I think the analogy can also be extended to forcing the user to check something that is possibly undefined or null, so the explanation doesn't really hold: if it were true, the strictNullChecks option would induce the same behaviour, for example when getting an element from the DOM using document.getElementById.

But that is not the case: a lot of TypeScript users want the compiler to raise a flag about such code so that those edge cases can be handled appropriately, instead of throwing the Cannot access property X of undefined error, which is very hard to trace.

In the end, I hope these kinds of features can be implemented as extra TypeScript compiler options, because that's the reason users want to use TypeScript: they want to be warned about dangerous code.

As for the claim that accessing arrays or objects wrongly is unlikely to happen: do you have any data to back that up, or is it just based on gut feeling?
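For comparison with what strictNullChecks already does elsewhere, here is a minimal sketch; the map type at the bottom is just an illustration, not anyone's real code:

const el = document.getElementById('app'); // HTMLElement | null under strictNullChecks
if (el !== null) {
  el.textContent = 'ready'; // narrowed to HTMLElement
}

// ...but an equally fallible lookup on a plain map does not force a check:
const usersById: { [id: string]: string } = {};
const userName = usersById['missing'];  // typed string today, undefined at runtime
console.log(userName.toUpperCase());    // compiles fine, throws at runtime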

for (let i = 0; i < arr.length; i++) {
  console.log(arr[i].name);

TypeScript's Control Flow Based Type Analysis could be improved to recognize this case as safe and not require the !. If a human can deduce it is safe, a compiler can too.
I'm aware that this might be non-trivial to implement in the compiler though.

TypeScript's Control Flow Based Type Analysis could be improved to recognize this case as safe

It really can't.

declare function someFunc(arr: number[], i: number): void;
let arr = [1, 2, 3, 4];
for (let i = 0; i < arr.length; i++) {
  someFunc(arr, arr[i]);
}

Does this function pass an undefined to someFunc on the second pass through the loop, or doesn't it? There are a lot of things I could write in someFunc that would result in an undefined showing up later.

What about this?

declare function someFunc(arr: number[], i: number): void;
let arr = [1, 2, 3, 4];
let alias = arr;
for (let i = 0; i < arr.length; i++) {
  someFunc(alias, arr[i]);
}

@fabb let me give another example:

```
$ node
> const arr = []
undefined
> arr[7] = 7
7
> arr
[ <7 empty items>, 7 ]
> for (let i = 0; i < arr.length; i++) {
...   console.log(arr[i])
... }
undefined
undefined
undefined
undefined
undefined
undefined
undefined
7
undefined
```

@RyanCavanaugh since you're here, how about inferring item :: T for arr :: T[] in for (const item of arr) ..., and otherwise inferring arr[i] :: T | undefined when using some --strict-index? How about the case I care about, obj[key] :: V | undefined but Object.values(obj) :: V[] for obj :: { [key: string]: V }?

@yawaramin If you're using sparse arrays, Typescript already doesn't do the right thing. A --strict-index flag would fix that. This compiles:

const arr = []
arr[7] = 7

for (let i = 0; i < arr.length; i++) {
    console.log(Math.sqrt(arr[i]));
}

@RyanCavanaugh There's one very common example that I can show you where user are prone to accessing array incorrectly.

const getBlock = (unitNumber: string): string => unitNumber.split('-')[0]

The code above shouldn't pass the compiler checks under strictNullChecks, because some uses of getBlock will return undefined, for example getBlock('hello'); in such cases I seriously want the compiler to raise a flag so that I can handle the undefined case gracefully without exploding my application.

And this also applies to a lot of common idioms such as accessing the last element of an array with arr.slice(-1)[0], accessing the first element arr[0] etc.

Ultimately, I want TypeScript to annoy me for such errors rather than having to deal with exploded applications.
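Until something like this lands, the honest types have to be written out by hand; a small sketch of those idioms with hypothetical helper names, using a part index that can genuinely be missing:

// Writing the possible miss out by hand, since the compiler won't today:
const getUnitSuffix = (unitNumber: string): string | undefined =>
  unitNumber.split('-')[1] as string | undefined; // '-' may not be present

const last = <T>(arr: readonly T[]): T | undefined =>
  arr.length > 0 ? arr[arr.length - 1] : undefined;

console.log(getUnitSuffix('12-34') ?? 'no suffix'); // "34"
console.log(last([1, 2, 3]));                       // 3
console.log(last([]) ?? 'empty');                   // "empty"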

Does this function pass an undefined to someFunc on the second pass through the loop, or doesn't it? There are a lot of things I could write in someFunc that would result in an undefined showing up later.

@RyanCavanaugh Yes, JavaScript doesn't lend itself to immutability. In this case either a ! should be necessary, or a ReadonlyArray: someFunc(arr: ReadonlyArray<number>, i: number).

@yawaramin For sparse arrays, the element type probably needs to include undefined unless TypeScript can deduce that it's used like a tuple. In the code that @danielnixon linked to (https://github.com/microsoft/TypeScript/issues/13778#issuecomment-536248028), tuples are also treated specially and do not include undefined in the returned element type, since the compiler guarantees that only set indices are accessed.

this is a conscious decision. it would be very annoying for this code to be an error:

var a = [];
for (var i =0; i< a.length; i++) {
    a[i]+=1; // a[i] is possibly undefined
}

Hey, i recognize that syntax; i write loops like this maybe once per year!

I found that in most instances where i actually do index into an array, i actually want to check for undefined afterwards.

Fears of making this particular case harder to work with seem overblown. If you just want to iterate over the elements you can use for .. of. If you need the element index for some reason, use forEach or iterate over entries. In general it is extremely rare that you really need an index-based for loop.

If there are better reasons of why one would want the status quo, i would like to see them, but regardless: This is an inconsistency in the system and having a flag to fix it would be highly appreciated by many, it seems.
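For reference, the alternatives mentioned above look like this; none of them would pick up | undefined under the proposal, because the callback or iterator only visits elements that actually exist (assuming a non-sparse array):

const names: string[] = ['a', 'b', 'c'];

// Element-only iteration: no index, no undefined.
for (const n of names) {
  console.log(n.toUpperCase());
}

// Index plus element, without manual indexing:
for (const [i, n] of names.entries()) {
  console.log(i, n.toUpperCase());
}

names.forEach((n, i) => console.log(i, n.length));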

Hey everyone, I feel like much of the discussion to be had here has been
had. The package owners have been quite clear about their reasoning and
have considered most of these arguments already. If they plan to address
this, I’m sure they will make it known. Other than that, I don’t think this
thread is really productive.


I feel like, while there may not be movement from package owners, continued feedback from the community is still valuable, because it shows there is still a demand for better tooling around this issue.

@brunnerh I agree with you; nowadays I don't even need to use for loops unless it's for performance tuning, which happens 0% of the time in my company's codebase, because changing map/filter/reduce to a for loop rarely improves performance anyway: the real culprit is almost always inefficient logic, network issues or database connections.

I am surprised no one has talked about as const yet.

const test = [1, 2, 3] as const;

(test[100]).toFixed(5);
// Tuple type 'readonly [1, 2, 3]' of length '3' has no element at index '100'.

More generally, and not exactly related to the initial message of this issue, I've been using the following defensive programming patterns for the past months, and they have worked well (for me):

const xs: Array<number | undefined> = [1,2,3];

// for objects but kind of related as well
Record<string, User | undefined>

interface Something {
  [key: string]: User | undefined
}

Even though there is no short notation for it (like ?), I feel it's fine. Telling the compiler yourself that you may not be sure the value is defined is imho fine.
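A quick usage sketch of that pattern (with a hypothetical User type), including the trade-off discussed in the replies below, namely that iteration now also carries the undefined:

interface User { name: string } // hypothetical

const usersById: Record<string, User | undefined> = {};

const u = usersById['42']; // User | undefined, as intended
if (u !== undefined) {
  console.log(u.name);
}

Object.values(usersById).forEach(user => {
  console.log(user?.name); // user is also User | undefined here
});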

@martpie as const is great if you can use it, that is.

There are at least two reasons why i would rather not use Array<T | undefined>:

  1. It is one more thing you have to remember to do every time you use an array. Also, you cannot just use implicit typing any more, which is nice to have.
  2. It changes the signatures of forEach, map and filter, which do not pass undefined as element argument (unless that index is explicitly set that way). Depending on how much you use those functions that could be annoying to deal with.

I also want to chime in that this causes a lot of false positives in eslint/typescript now:

const a: string[] = [];
const foo = a[1000];
if (foo) { // eslint says this is an unnecessary conditional
  console.log(foo.length);
}

Eslint (correctly) infers from the type that this is an unnecessary check, because according to the type foo can never be undefined. So now not only do I have to consciously remember to do an undefined check on things I get back from an array, I _also_ have to add an eslint disable line! And accessing things outside of a for loop like this is virtually the only way we ever do array access, since (like probably most TS devs these days) we use forEach/map/filter/etc functions when looping over arrays.

I really think we should just have a new compiler flag that's set to true by default if strict is true, and if people don't like it, they can opt out. We've added new breaking strict compiler checks in recent versions anyways.

This is the source of quite likely the _only_ runtime prod bugs I've seen in recent memory that were a failure of the type system.

Considering the ergonomics of the now available optional chaining operator, maybe it's time to revisit the decision to _not_ make this behaviour available via a flag?

If I get an array of Thing from the server, and I need to display the last Thing in the array, something I run into commonly is that the array is actually empty, and accessing a property on that Thing crashes the app.

Example:

// `things` is `Thing[]`, but is empty, i.e., `[]`
const { things } = data; 

// We are accessing `things[-1]`, which is obviously `undefined`, 
// but TypeScript thinks `latestThing` is a `Thing`
const latestThing = things[things.length - 1];

// TypeError: Cannot read property 'foo' of undefined
return latestThing.foo; 

Maybe this is an issue with the design of our API, but it really feels like TS should understand that when you access something in the array, it might not actually be there.

Edit: Just want to clarify I meant "our API" as in "the API created by team I am working on"
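Until the compiler tracks this, the workaround is to state the possibility yourself; a sketch with a hypothetical Thing shape:

interface Thing { foo: string } // hypothetical shape

function latestFoo(things: Thing[]): string | undefined {
  // The assertion is only needed because the index signature hides the miss.
  const latest = things[things.length - 1] as Thing | undefined;
  return latest?.foo;
}

console.log(latestFoo([]) ?? 'no things yet');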

yes, everyone pretty much agrees it's a very bad design decision, one dating from a remote time when TS was completely unsafe.

As a workaround, I use a small dirty hack. I just add in package.json the simple postinstall script:

{
...
  "scripts": {
    "postinstall": "sed -i 's/\\[n: number\\]: T;/[n: number]: T | undefined;/g' node_modules/typescript/lib/lib.es5.d.ts",
    ...
  },
...
}

Of course, it won't work on windows, but that's better than nothing.

All the other issues that run adjacent to this one seem to get redirected here, so I'll assume this is the best place to ask: is a shorthand syntax for optional index signatures off the table? As @martpie points out, you have to write interface Foo { [k: string]: Bar | undefined; } which is less ergonomic than interface Foo { [k: string]?: Bar; }.

Can we get ?: operator support? If not, why not? That is, the ergonomic benefit was sufficient to add the feature for single property definition -- is it not "helpful enough" for index signatures?

type Foo = { [_ in string]?: Bar } also works. Not as pretty, but fairly terse. Could also make your own Dict type if you wanted. Wouldn't argue against a ?: extension, though

This is beginning to feel like one of those "Javascript: The Bad Parts" talks:

type Foo1 = { [_ in string]?: Bar }   // Yup
type Foo2 = { [_: string]?: Bar }     // Nope
interface Foo3 { k?: Bar }            // Yup
interface Foo4 { [_: string]?: Bar }  // Nope

Using T | undefined for the type signature is really not useful. We need a way to make it so that the index operator [n] has a T|undefined type, but e.g. using map on an array should not give us T|undefined values, because in that situation we should know they exist.

@radix For the actual functions on Array, I don't think this will be a problem because they all have their own type definitions that output the right type: e.g. map: https://github.com/microsoft/TypeScript/blob/master/lib/lib.es5.d.ts#L1331

The only common code usage where the | undefined construct poses an actual regression in experience is in for ... of loops. Unfortunately (from the perspective of this issue), those are rather common constructs.

@riggs I think you're talking about making interface Array<T> { } have [index: number]: T | undefined, but @radix is likely talking about what seems to be the current recommendation, which is to use Array<T | undefined> in your own code.

The latter is bad for several reasons, not least of which is that you don't control other packages' types, but the former also has some issues, namely that you can assign undefined to the array, and that it gives undefined even in known to be safe cases. 🤷‍♂️

Ahh, yes, my misunderstanding. I was, indeed, only referring to using the definition[index: number]: T | undefined. I completely agree that defining a type as Array<T | undefined> is a terrible work-around that causes more problems than it solves.

Is there a graceful way to override lib.es5.d.ts or is a postinstall script the best way to go?

@nicu-chiciuc https://www.npmjs.com/package/patch-package Totally fits into a heretic's toolbox alongside https://www.npmjs.com/package/yalc ✌️

Is there a graceful way to override lib.es5.d.ts or is a postinstall script the best way to go?

In our project we have a global.d.ts file that we use for 1) adding types definitions for built-in APIs that are not yet in typescript's default type definitions (e.g. WebRTC-related APIs which are constantly changing and inconsistent across browsers) and 2) overriding some of typescript's default type definitions (e.g. overriding the type of Object.entries so that it returns arrays containing unknown instead of any).

I'm not sure if this approach could be used to override typescript's array types to solve this issue, but it might be worth a try.
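For anyone curious what that looks like, here is a rough sketch of the mechanics (the file name and the exact overload are assumptions; whether the added overload wins resolution for your call sites depends on declaration-merge ordering, so treat this as a starting point, not a drop-in):

// global.d.ts (assumed name): augmenting built-in declarations from a module file
export {};

declare global {
  interface ObjectConstructor {
    // Extra overload so untyped objects yield unknown values instead of any.
    entries(o: {}): [string, unknown][];
  }
}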

The above works when declaration merging gives the result you want, but (to simplify) merging intersects interfaces, and here the less restrictive option is what you want.

You could instead try copy-pasting all the lib.*.d.ts files you are using into your project, included in tsconfig.json's files or include, then edit them to what you want. Since they include /// <reference no-default-lib="true"/>, you shouldn't need any other magic, but I'm not sure if you have to remove the /// <reference lib="..."/> they have to their dependencies. This is bad for all the obvious maintainability reasons, of course.

@butchler We're also using this approach but it seemed that overriding the indexing operator was not possible, although we've overridden some other Array functions (filter with type guards) successfully.

I'm having a strange case with overriding the indexing operator:

function test() {
  const arr: string[] = [];

  const [first] = arr;
  const zero = arr[0];

  const str1: string = first;
  const str2: string = zero;
}


The second assignment errs (as it should), yet the first one does not.
Even stranger is that hovering over first as it is being destructured shows that it has a type of string | undefined, yet hovering over it while it is being assigned shows that it has a type of string.

Is there a different type definition for array destructuring?

Looking for a solution to this as errors relating to missing indexes have been a frequent source of bugs in my code.

Specifying a type like { [index: string]: string | undefined } is not a solution, as it messes up typing for iterators like Object.values(x).forEach(...) which will never include undefined values.

I'd like to see TypeScript generate errors when I don't check for undefined after doing someObject[someKey], but not when doing Object.values(someObject).forEach(...).
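One way to approximate that split today is to route reads through a small helper while leaving the object type alone, so Object.values() keeps its element type; a sketch (the get helper here is mine, not a library API):

function get<V>(obj: { [key: string]: V }, key: string): V | undefined {
  return Object.prototype.hasOwnProperty.call(obj, key) ? obj[key] : undefined;
}

const scores: { [name: string]: number } = { alice: 3 };

const maybe = get(scores, 'bob'); // number | undefined
console.log(maybe ?? 0);

Object.values(scores).forEach(n => console.log(n.toFixed(1))); // still number[]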

@danielnixon That's not a solution, that's a workaround. There's nothing stopping you or another developer from mistakenly (if you can even call it that) using the language's built-in tools for the same objectives. I mean, I use fp-ts for this stuff, but this issue is still valid and needs fixing.

While I can follow the counterarguments from @RyanCavanaugh with:

for (let i = 0; i < arr.length; i++) {
  // TypeScript makes me use ! with my arrays, sad.
  console.log(arr[i]!.name);

I would like to drop in some short thoughts on it. At its core, TypeScript aims to make development, well... safer. Recall the first goal of TypeScript:

1. Statically identify constructs that are likely to be errors.

If we look at the compiler, we already have some magic behind the type inference. In this particular case the real excuse is: "we can't infer the right type on this construct", and that's totally fine. Over time we have fewer and fewer cases where the compiler is not able to infer the types. In other words, for me it's just a matter of time (and of spending time on this) to get more sophisticated type inference.

From a technical point of view, constructs like:

const x = ['a', 'b', 'c']
console.log(x[3]) // type: string, reality: undefined

break the first goal of TypeScript. But this doesn't happen if TypeScript knows the more precise types:

const x = ['a', 'b', 'c'] as const
console.log(x[3]) // compile error: Tuple type 'readonly ["a", "b", "c"]' of length '3' has no element at index '3'.ts(2493)

From a practical point of view, it's still a trade-off. This issue and multiple closed issues show that there is high demand in the community for this change: at the moment 238 upvoters vs. 2 downvoters. Of course it's a pity that the above for loop doesn't infer the "right" type, but I'm pretty sure that most of the upvoters can live with ! and the new ? as a sign of attention and can force it in the safe cases, while on the other side getting the right types on "dangerous" access.

Because I agree that it's a very sensitive change for big code bases, I vote for a configuration property, as was proposed here, at least for getting feedback from the community. If that's not possible, then, well, TS 4.0 should get it.

This is not a proposed solution, but just an experiment: I tried modifying lib.es5.d.ts inside my node_modules on an existing project that uses TypeScript just to see what it would be like if we _did_ have a compiler option for this. I modified Array and ReadonlyArray:

interface ReadonlyArray<T> {
  ...
  [n: number]: T | undefined; // was just T
}

interface Array<T> {
  ...
  [n: number]: T | undefined; // was just T
}

Because this is a very large project, this caused several hundred type errors. I went through just a few of them to get an idea of what kind of issues I'd run into and how hard they would be to work around if there was a compiler option for this. Here's a few of the issues I ran into:

  1. This caused type errors not only in our codebase, but in one of our dependencies: io-ts. Since io-ts allows you to create types that you use in your own code, I don't think it's possible to only apply this option to your own codebase and not also apply it to io-ts's types. This means that io-ts and probably some other libraries would still have to be updated to work with this option even if it were only introduced as a compiler option. At first I thought making this a compiler option would make this less controversial, but if people that do choose to use the option start complaining to a bunch of different library authors about incompatibilities, it may be even more controversial than it simply being a TS 4.0 breaking change.

  2. Sometimes I had to add some extra type guards to eliminate the possibility of undefined. This is not necessarily a bad thing, and is kind of the whole point of this proposal, but I'm just mentioning it for completeness.

  3. I had a type error inside a for (let i = 0; i < array.length; i++) loop that was looping over an arbitrary Array<T>. I couldn't just add a type guard to check for undefined, because T itself might include undefined. The only solutions I could think of were to A) use a type assertion or @ts-ignore to silence the type error or B) use a for-of loop instead. Personally I don't find this to be too bad (there are probably few cases where using an for-of loop to iterate over an array is not better anyway), but it may be controversial.

  4. There are many cases where the existing code was already making some assertion about the value of .length and then accessing an element in the array. These now caused type errors despite the .length check, so I had to either change the code to not depend on a .length check or just add a redundant !== undefined check. It would be really nice if TypeScript could somehow allow using a .length check to avoid the need for the !== undefined check. I assume actually implementing that would not be trivial, though.

  5. Some code was using A[number] to get the type of the elements of a generic array type. However, this now returned T | undefined instead of just T, causing type errors elsewhere. I made a helper to work around this:

    type ArrayValueType<A extends { [n: number]: unknown }> = (
      A extends Array<infer T> ? T :
      A extends ReadonlyArray<infer T> ? T :
      A[number] // Fall back to old way of getting array element type
    );
    

    but even with this helper to ease the transition, this is still a large breaking change. Perhaps TypeScript could have some kind of special case for A[number] to avoid this issue, but it would be nice to avoid weird special cases like that if possible.

I only went through a small handful of the hundreds of type errors, so this is probably a very incomplete list.

For anyone else interested in fixing this issue, it might be useful to try doing the same thing with some existing projects and share what other issues you run into. Hopefully the specific examples can provide some guidance to anyone who actually tries to implement this.

@butchler I've also run through some of these issues. I started viewing something[i] like something(i); that is, just as I cannot use something like if (Meteor.user() && Meteor.user()._id) {...}, I don't expect that approach to work for array indexing either; I have to first get the value I want to inspect out of the array. What I'm trying to say is that relying on TS to understand a check of the length property and make assertions based on it might put too much stress on the type system. One thing that made me put off the transition (besides the fact that some other libraries also have errors, as you've said) is that array destructuring is not yet working correctly and the destructured value will be T rather than T | undefined (see my other comment).
Other than that, I think overriding lib.es5.d.ts is a good approach. Even if things change in the future, reverting the change will not add additional errors, but will ensure that some edge cases are already covered.
We've already started to use patch-package to change some type definitions in react and this approach seems to work fine.

I've also run through some of these issues. I started viewing something[i] like something(i); that is, just as I cannot use something like if (Meteor.user() && Meteor.user()._id) {...}, I don't expect that approach to work for array indexing either; I have to first get the value I want to inspect out of the array.

Yeah, I also ended up using this approach when I was trying my experiment. For example, code that was previously written as:

if (array[i]) {
  array[i].doSomething(); // causes a type error with our modified Array types
}

had to be changed to something like:

const arrayValue = array[i]
if (arrayValue) {
  arrayValue.doSomething();
}

What I'm trying to say is that relying on TS to understand a check of the length property and make assertions based on it might put too much stress on the type system.

Probably true. It might actually be easier to write a codemod for automatically re-writing code that depends on assertions about .length to use some other approach, than to make TypeScript smart enough to infer more about the types based on assertions about the .length 😛

Even if things change in the future, reverting the change will not add additional errors, but will ensure that some edge cases are already covered.

This is a good point. While including undefined in index types is a huge breaking change, going in the other direction is not a breaking change. Code that is able to handle getting T | undefined should also be able to handle T without any changes. This means, for example, that libraries could be updated to handle the T | undefined case if there was a compiler flag and still be used by projects that don't have that compiler flag enabled.

Ignoring arrays, and focusing on Record<string, T> for a moment, my personal wishlist is that writing only allows T but reading may be T|undefined.

declare const obj : Record<string, T>;
declare const t : T;
obj["k"] = t; //ok
obj["k"] = undefined; //error, undefined not assignable to T

//T|undefined inferred,
//since we don't know if "k2" is an "ownProperty" of obj
const v = obj["k2"];

The only way for the above to be ergonomic would be some kind of dependent typing, and dependent type guards. Since we don't have those, adding this behavior would cause all kinds of problems.

//Shouldn't just be string[]
//Should also be something like (keyof valueof obj)[],
//A dependent type
const keys = Object.keys(obj);

Going back to arrays, the problem is that the index signature for arrays does not have the same intent as Record<number, T>.

So, you would require entirely different dependent type guards like,

for (let i=0; i<arr.length; ++i) {
  //i is not just number
  //i should also be something like keyof valueof arr 
}

So, the index signature for arrays isn't really Record<number, T>. It's more like Record<int & (0 <= i < this["length"]), T> (a range-restricted, number-backed integer type)


So, while the original post just talks about arrays, the title seems to suggest index signatures of "just" string or "just" number. And they're two completely different discussions.

TL;DR Original post and title talk about different things, implementing either looks heavily dependent (ha) on dependent typing.

Given that functions like forEach, map, filter, etc exist (and are IMHO much more preferable), I would not expect the TS team to greatly complicate their inference engine to support looping over an array with a regular for loop. I have a feeling that there is just too much unexpected complexity from trying to accomplish that. For instance, since i is not a constant, what happens if someone changes the value of i inside the loop? Sure, it's an edge case, but it's something they need to handle in a (hopefully) intuitive way.

Fixing index signatures, however, should be relatively simple(ish), as the above comments have shown.

4. There are many cases where the existing code was already making some assertion about the value of .length and then accessing an element in the array. These now caused type errors despite the .length check, so I had to either change the code to not depend on a .length check or just add a redundant !== undefined check. It would be really nice if TypeScript could somehow allow using a .length check to avoid the need for the !== undefined check. I assume actually implementing that would not be trivial, though.

@butchler
Out of curiosity, how many of these were accessing with a changing variable, vs accessing with a fixed number? Because, in the latter case, you're presumably using the array as a tuple, for which TS does keep track of array length. If you are frequently using arrays as tuples, I'd be curious what re-typing them as actual tuples does to the amount of errors.

If you are frequently using arrays as tuples, I'd be curious what re-typing them as actual tuples does to the amount of errors.

I did run into one case where I fixed a type error by just adding an as const to an array literal to turn the variable's type into a tuple. I have absolutely no idea how common that is in our codebase, though, since I only looked at about 10 errors out of several hundred.

I also hit an issue with this today.
We're accessing from an array via a computed index. That index could be out of range.
So we have something like this in our code:

const noNext = !items[currentIndex + 1];

This results in noNext being defined as false. Which is wrong. It can be true.
I also don't want to define items as Array<Item | undefined> because that gives a wrong expectation.
If an index is there it should never be undefined. But if you're using the wrong index, it _is_ undefined.
Sure, the above could probably be solved by using a .length check instead or defining noNext explicitly as boolean.
But in the end this is something that bothers me since I started using TypeScript and I never understood why | undefined is not included by default.
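For what it's worth, a length-based rewrite sidesteps the index probe entirely (a sketch with a hypothetical Item type):

interface Item { id: string } // hypothetical

function hasNoNext(items: Item[], currentIndex: number): boolean {
  // Compare against the length instead of probing the next slot,
  // so the result is honestly typed as boolean.
  return currentIndex + 1 >= items.length;
}

console.log(hasNoNext([{ id: 'a' }], 0)); // true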

Since I expect most to use typescript-eslint, is there any rule that could enforce that values have to be checked after indexing before they can be used? This might be helpful before support from TS is implemented.

In case it may be of interest to someone. In a previous comment I mentioned that the approach of patching the Array indexing definition in lib.es5.d.ts has some strange behavior on array destructuring since it seemed that it was not influenced by the change. It seems that it depends on the target option in tsconfig.json and it works correctly when it is es5 (https://github.com/microsoft/TypeScript/issues/37045).

For our project that's not a problem since we're using the noEmit option and the transpilation is handled by meteor. But for projects where that might be a problem, a solution that might work is the safe-array-destructuring (https://github.com/typescript-eslint/typescript-eslint/pull/1645).
It is still a draft and it might not be of value when the whole issue will be fixed in the TypeScript compiler but if you think it might be useful, feel free to address any concerns/improvements in the typescript-eslint rule PR

Sadly, the fix with eslint in your PR only supports Tuples and not arrays. The main issue and main impact that I have with this is in array destructuring.

const example = (args: string[]) => {
  const [userID, nickname] = args
}

I think this entire issue has drifted in a direction I disagree with. I don't think there should be enforced checks inside forEach, map etc. I also try to avoid all uses of for when possible, so the reasoning against this makes less sense to me. Nonetheless, I still think this is crucial for supporting array destructuring.

Well, it just errs for any array destructuring, so you're forced to do it using indexing.

Isn't that bad though? Array destructuring is an amazing feature. The issue is the typings aren't accurate. Disabling the feature isn't really a good solution. Using indexes is, in my opinion, uglier (harder to read and understand) code and more error-prone.

It is more ugly, but it forces the developer to check whether the element is defined or not. I've tried to come up with more elegant solutions, but it seems that this is the one that has the least downsides. At least for our usecase.

@caseyhoward As noted previously in this issue, this causes unwanted behavior with the various Array.prototype functions:

x.forEach( (i: string) => { ... } )  // Error because i has type string | undefined

This doesn't need fixing in array.prototype functions. This is a massive issue for array destructuring!

Ran into this again today. I still think the right compromise is this, which I'd love to get some feedback on.

Another use case for this issue—I ran into this problem when using typescript-eslint and enabling no-unnecessary-condition. In the past, when accessing an array by index to do some operation on the element at that index, we use optional chaining to ensure that that index is defined (in case the index is out of bounds), as in array[i]?.doSomething(). With the problem outlined in this issue, however, no-unnecessary-condition flags that optional chaining as unnecessary (as the type is not nullable according to typescript) and the autofix removes the optional chaining, leading to runtime errors when the array access at the index i actually is undefined.

Without this feature my application became very buggy, as I'm dealing with multi-dimensional arrays. I have to always remind myself to access elements using xs[i]?.[j] instead of xs[i][j], and I also have to explicitly cast the accessed element, like const element = xs[i]?.[j] as Element | undefined, to ensure type safety.

Encountering this was my first big wtf in Typescript. The language otherwise I have found to be wonderfully reliable and this let me down, and it is quite cumbersome to add "as T | undefined" to array accesses. Would love to see one of the proposals implemented.

Yep, same here. It forced us to roll out our own types that include | undefined to get type safety. Mostly for object access (that's probably a separate open issue), but the same logic applies to accessing array indexes.

Not sure what you mean; can you link an issue or show an example?


@alangpierce

One idea that seems like it could be promising: what if you could use ?: when defining an index signature to indicate that you expect values to be missing sometimes under normal use? It would act like | undefined mentioned above, but without the awkward downsides. It would need to disallow explicit undefined values, which I guess is a difference from usual ?:.

This is an interesting idea that points in the right direction. However, I would invert the logic: The default case should be that index signatures include | undefined. If you want to tell the compiler that the undefined case can never occur in your code (so you never access invalid keys), you can add !: to the signature, like this:

type AlwaysDefined = {[key: string]!: string};
const t: AlwaysDefined = {};

t['foo'] // Has type string

and the default case without !: would look like we know it, but would be safer:

type MaybeUndefined = {[key: string]: string};
const t: MaybeUndefined = {};

t['foo'] // Has type string | undefined

That way you have safety by default, and at the same time you don't have to explicitly include undefined, which would allow writing undefined by accident.

Sorry if that was already suggested, I could not read all the comments in this thread.

In general, I really think that the default behavior that we have now is not what users expect when using TypeScript. It's better to be safe than sorry, especially when an error can be caught as easily as this one by the compiler.

I couldn't get the index.d.ts to work with my project and didn't really want to deal with it.

I created this little hack that forces a type guard:

 const firstNode = +!undefined && nodes[0];

Although the type ends up as: 0 | T, it still does the job.

Wouldn't const firstNode = nodes?.[0] work?

@ricklove any reason you preferred +!undefined over simply 1?

Wouldn't const firstNode = nodes?.[0] work?

No, as typescript will (incorrectly, in my opinion) not treat it as optional (see #35139).

According to Flow's documentation the behavior with index signatures is the same as currently with TypeScript:

When an object type has an indexer property, property accesses are assumed to have the annotated type, even if the object does not have a value in that slot at runtime. It is the programmer’s responsibility to ensure the access is safe, as with arrays.

var obj: { [number]: string } = {};
obj[42].length; // No type error, but will throw at runtime

So in this regard both TS and Flow require the user to think about undefined keys themselves, instead of the compiler helping them :(

It would be an advantage for TS if the compiler prevented users from making these kinds of bugs. Reading all the comments in this thread, it seems that the vast majority here would love to have this feature. Is the TS team still 100% against it?

It seems silly that TS is doing its best to prevent access to undefined members, except this. This is by far the most common bug I've been fixing in our codebase.

[ moving discussion over here from #38470]
Accepting all the arguments for why this wouldn't be practical for arrays..
I think this should definitely be addressed for Tuples:

// tuple 
var str = '';
var num = 100
var aa = [str, num] as const

// Awesome
var shouldBeString = aa[0] // string
var shouldBeNumber = aa[1] // number
var shouldError = aa[10000]; // type error

// Not so awesome 
var foo = aa[num] // string | number

why not make foo string | number | undefined?

Giving this to tuples shouldn't affect the majority of users, and the users that it does affect need to have strictNullChecks enabled and declare their arrays as constants...
They are obviously looking for stricter type safety.

could also help out with these

var notString1 = aa[Infinity]; // no error, not undefined
var notString2 = aa[NaN]; // no error, not undefined

Which is what originally caused my runtime error: a number became NaN, which then returned undefined from a tuple,
all of which was considered type safe.

I'm not sure that tuples are really so different. They have to accommodate rest types (e.g. [str, ...num[]]), and technically aa[Infinity] is perfectly valid Javascript, so I could see it getting complicated to carve out a special case for them.

Your post also made me think about what it would look like if we did get some support for treating index-returns as undefined. If, as in your example, aa[num] actually did type as string | number | undefined, would I have to write for (let i = 0; i < 2; i++) { foo(aa[i]!); } even though I know the index will stay in bounds? For me, when strict null checks flag something I wrote, I like to be able to fix it by either typing things better in the first place, or using runtime guards -- if I have to resort to a non-null assertion, I typically view it as a failure on my part. I don't see a way around it in this case, though, and that bothers me.

If tuple[number] always typed to T | undefined, how would you want to handle the case where you know the index is bounded correctly?

@thw0rted I can't remember the last time I used a "classic" for loop in Typescript (.map and .reduce ftw). That's why a compiler option for this would be great imo (which can be off by default). Many projects don't ever encounter anything like the case you provided, and only use index signatures for array destructuring, etc.

(my original comment: https://github.com/microsoft/TypeScript/issues/13778#issuecomment-517759210)

Oh in all honesty, I also rarely index with a for-loop. I make extensive use of Object.entries or Array#map, but every once in a while I need to pass around keys to index into an object or (probably less often still) indexes into an array or tuple. The for loop was a contrived example, certainly, but it raises the point that either option (undefined or not) has downsides.

but it raises the point that either option (undefined or not) has downsides.

Yeah and it would be nice if the user of TypeScript could choose which downside they prefer :wink:

Oh in all honesty, I also rarely index with a for-loop. I make extensive use of Object.entries or Array#map, but every once in a while I need to pass around keys to index into an object or (probably less often still) indexes into an array or tuple.

In those cases where you do need to, you can use Array#entries:

for (const [index, foo] of array.entries()) {
    bar(index, foo)
}

I personally don't use Array#forEach all that much, and I never use Array#map for iteration (I use it all the time for mapping though). I prefer to keep my code flat, and I appreciate the ability to break out of a for...of loop.

If tuple[number] always typed to T | undefined, how would you want to handle the case where you know the index is bounded correctly?

I'm not sure there's that much for (let i...) iteration over tuples.
If the tuple has a single type then the user would probably use an Array.

In my experience tuples are great for mixed types, and if that's the general case then you are type checking anyway:

var str = '';
var num = 100
var aa = [str, num] as const
for (let i = 0; i < aa.length; i++) {
 aa[i] // needs some type check anyway to determine if 'string' or 'number'
}

Also, assuming this proposed change happens, what is stopping people from doing this?

// this sucks
for (let i = 0; i < 2; i++) { 
   foo(aa[i]!); // ! required
}

// this would work
for (let i = 0 as 0 | 1; i < 2; i++) { 
   foo(aa[i]); //  ! not required 
}

you could use keyof typeof aa instead of 0 | 1 when they fix this issue

Sure, there wouldn't be any type safety when doing arithmetic on i,
but it really allows people to choose this for type safety over the default "not strictly type safe but convenient" array

I think having the ?? and ?. operators means it is no longer overly cumbersome to do random access to Arrays.
But also, it is probably far more common to

  • iterate (for eg. with .map or .forEach, where there's no need for "T | undefined", the callback is never run at out-of-bounds indices) or
  • traverse the Array in loops, some of which could be statically determined to be safe (for...of almost, except for _sparse arrays_; for...in I believe is safe).

And also, a variable could be considered a "safe index" if it's checked with <index> in <array> first.


_Clarification_: I am talking about the original reason for Array<T> not having the index type [number]: T | undefined (but instead just [number]: T), which was that it would be too cumbersome to use.
I mean that it is no longer the case, so undefined could be added.
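To illustrate the "no longer cumbersome" point: if xs[i] were typed number | undefined, random access could lean on the newer operators and stay short. (Note that TypeScript does not currently narrow xs[i] based on an i in xs check, so that part really would need compiler support.) A small sketch:

const xs: ReadonlyArray<number> = [1, 2, 3];

// If xs[10] were number | undefined, these stay one-liners:
const doubled = (xs[10] ?? 0) * 2;
const asText = xs[10]?.toFixed(2) ?? 'out of range';

console.log(doubled, asText);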

@vp2177 The problem is not the features (yes, ?. works); the problem is the compiler not warning us about it and not preventing us from shooting ourselves in the foot.

Maybe a linting rule could do it, but still.

yes, ?. works

I beg to differ. ?. will work for accessing properties on the indexed value, but the compiler will still say the value is defined regardless of whether it is or isn't (see #35139).

To add to that, if you use eslint and want to use no-unnecessary-condition, such accesses will be flagged as unnecessary and removed because the compiler says it is never undefined and that would make ?. unnecessary.

@martpie Sorry I think you misunderstood my comment, added a clarification.

I mean... it seems like a good idea to have a --strictIndexChecks or similar that allows people to opt into this type of behavior. It then won't be a breaking change for everyone and it gives a stricter type environment.

I do believe Flow works this way by default for indexes and doesn't have the aforementioned issues, but it has been a while since I have used it, so I could be mistaken.

@bradennapier Flow doesn't use T | undefined either in the index signature for Arrays (so same as TypeScript, unfortunately.)

I have previously suggested an approach that could make everybody happy. Hope it's okay to repeat it in other words.

Basically the default behavior of the index signature would be changed to

  • Include | undefined when reading from an array or object
  • Do not include | undefined when writing (because we only want T values in our object)

Definition would be like that:

type MaybeUndefined = {[key: string]: string};
const t: MaybeUndefined = {};

const x = t['foo'] // Has type string | undefined
t['foo'] = undefined // ERROR! 
t['foo'] = "test" // Ok

For people who do not want undefined included in the type, they can either disable it using a compiler option or disable it only for some types using the ! operator:

type AlwaysDefined = {[key: string]!: string};
const t: AlwaysDefined = {};

const x = t['foo'] // Has type string
t['foo'] = undefined // ERROR! 
t['foo'] = "test" // Ok

To not break existing code, it might also be better to introduce a new compiler option that causes undefined to be included (as shown with the MaybeUndefined type above). If that option is not specified, all index signature types behave as if the ! operator was used within the declaration.

Another idea that comes up: One might want to use the same type at some places in the code with the two different behaviors (include undefined or not). A new Type mapping could be defined:

type MakeDefined<T> = {[K in keyof T]!: T[K]}

Would that be a good extension of TypeScript?

I like the idea but I suspect we'd have to get https://github.com/microsoft/TypeScript/issues/2521 first -- which, don't get me wrong, I still believe we absolutely should, but I'm not holding my breath.

@bradennapier Flow doesn't use T | undefined either in the index signature for Arrays (so same as TypeScript, unfortunately.)

Hmm, what about object indexes? I have never run into the need for arrays, but I'm fairly certain it makes you check when using objects.

@bradennapier Object indexes? As in, o[4] where o is type object? TypeScript doesn't allow you to do that.....

expression of type '4' can't be used to index type '{}'
Property '4' does not exist on type '{}'.ts(7053)

@RDGthree not to be white knighting for Microsoft of all things, but you apparently missed this bit from mhegazy in the first reply:

With the exception of strictNullChecks, we do not have flags that change the type system behavior. flags usually enable/disable error reporting.

and a bit later on RyanCavanaugh has:

We remain quite skeptical that anyone would get any benefit from this flag in practice. Maps and maplike things can already opt in to | undefined at their definition sites, and enforcing EULA-like behavior on array access doesn't seem like a win. We'd likely need to substantially improve CFA and type guards to make this palatable.

So basically - they are perfectly aware of the fact that a flag is being asked for, and so far they have not been convinced it's worth the work for the rather dubious benefit people have suggested they would get from this. Honestly, I'm not surprised, since everyone keeps coming in here with pretty much exactly the same suggestions and examples that were already addressed. Ideally, you should just use for...of or array methods, and Map instead of objects (or, less effectively, T | undefined object values).

I'm here because I maintain a popular DT package and people keep asking for |undefined to be added to things like http header maps, which is totally reasonable, except for the bit where it would break pretty much every existing usage, and the fact that it makes other safe usages like Object.entries() much worse to use. Other than complaining about how your opinion is better than that of Typescript's creators, what actually is your contribution here?

Simon, maybe I misread your comment but it sort of sounds like an argument in favor of the original suggestion. The "last mile" missing capability is to treat properties as being | undefined sometimes, depending on context, but not in other cases. That's why I drew an analogy to #2521 upthread.

In an ideal scenario, I could declare an array or object such that, given

const arr: Array<T>;
const n: number;
const obj: { [k: K]: V };
const k: K;

I can somehow wind up with

  • arr[n] types as T | undefined
  • arr[n] = undefined is an error
  • I have access to some kind of iteration on arr that gives me values typed as T, not T | undefined
  • obj[k] types as V | undefined
  • obj[k] = undefined is an error
  • Object.entries() for example should give me tuples of [K, V] and nothing can be undefined

It's the asymmetric nature of the problem that defines this issue in my view.
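Absent language support, the array half of that list can at least be approximated with a tiny helper today (a sketch; for what it's worth, the later Array.prototype.at() is declared this way, returning T | undefined, in the newer standard libs):

function at<T>(arr: readonly T[], n: number): T | undefined {
  return n >= 0 && n < arr.length ? arr[n] : undefined;
}

const nums: number[] = [1, 2, 3];

const maybe = at(nums, 5); // number | undefined
// nums[0] = undefined;    // rejected under strictNullChecks, since the element type is still number

for (const value of nums) {
  console.log(value.toFixed(0)); // iteration still yields plain number
}
console.log(maybe ?? 'out of range');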

I would say that framing this as a problem only for devs who use for loops fails to take into account devs who use array destructuring. It is broken in TypeScript and won't be fixed either, because of this issue; it provides inaccurate typings. IMO array destructuring being affected is a much bigger problem than the concerns of those still using for loops.

const example = (args: string[]) => {
  const [userID, nickname] = args
  // userID and nickname are automatically inferred to be `string` here.
  // However, if for whatever reason args is an empty array,
  // userID is actually `undefined` and NOT a `string`.

  // This is valid but it should not be, because userID could be undefined.
  userID.toUpperCase()
}

I don't want to get too far off from the original point of the issue, but you should really declare that example method using a tuple type for the function argument. If the function was declared as ([userId, duration, ...reason]: [string, number, ...string[]]) => {} then you wouldn't have to worry about this in the first place.

I mean, sure, I could just cast it, but that's not really a solution. Similarly, anyone here could simply cast the for-loop array indexing as well, making this entire issue moot.

for (let i = 0; i < array.length; i++) {
  const value = array[i] as string | undefined
}

The problem is that TypeScript doesn't warn about this behavior. As developers, we have to remember to manually cast these as possibly undefined, and a tool causing more work for devs like that is bad behavior. Also, to be fair, the cast would actually have to look something like this:

const example = (args: [string | undefined, string | undefined, ...(string | undefined)[]]) => {

}

That's not nice at all. But like I said above, the main issue isn't even needing to type it like this; it's that TS doesn't warn about this at all. That leads to devs who forget this pushing code, CI finding no issues on tsc, merging and deploying. TS gives a sense of confidence in the code, and having inaccurate typings provided gives a false sense of confidence. => Errors at runtime!

That's not a "cast", that's properly typing your formal parameters. My point was that if you type your method correctly, then anybody who was calling it unsafely with the old signature (args: string[]) will now get errors at compile-time, as they should, when they're not guaranteed to pass the right number of arguments. If you want to keep letting people get away with failing to pass all the arguments you need, and checking for them yourself at run-time, it's actually very easy to write as (args: [string?, number?, ...string[]]). That already works great and isn't really pertinent to this issue.

That would just move the problem from that function to another function which calls this function, as you still need to parse the user-provided string into separate values; and if you use array destructuring to parse it inside the commandHandler which calls this function, you end up with the same issue.

There is no way around having inaccurate typings provided for Array Destructuring.

I don't want to get too far off from the original point of the issue

Note: although I agree with you that they should be treated as separate issues, any time someone opens an issue showing the problems with array destructuring, it gets closed and users are forwarded here (#38259, #36635, #38287, etc.), so I don't think we are straying away from the topic of this issue.

Some of my frustrations with this issue would be resolved by some simple syntax to apply | undefined to the type which would otherwise be inferred from the object access. E.g.:

const complexObject: { [key: string]: ComplexType } = { ... };
const maybeKey = 'two';

// From:
const maybeValue = complexObject[maybeKey] as
  | (Some<Complex<Type>> & Pick<WithPlentyOf, 'Utility'> & Types)
  | undefined;

// To:
const maybeValue2 = complexObject[maybeKey] as ?;

So stricter type-checking remains opt-in. The below works but requires duplication (and quickly adds a similar amount of visual noise):

const maybeValue2 = complexObject[maybeKey] as
  | typeof complexObject[typeof maybeKey]
  | undefined;

I imagine this sort of "type assert to maybe" syntax (as ?) would also make it easier to implement a require-safe-index-signature-access eslint rule which isn't terribly confusing for new users. (E.g. "You can fix the lint error by adding as ? at the end.") The rule could even include a safe autofixer since it could only cause compilation to break, never runtime problems.

function eh<T>(v: T): T | undefined {
    return v;
}

const arr = [0, 1, 2, 3, 4, 5];
const thing1 = arr[1]; // number
const thing2 = eh(arr[1]); // number | undefined

I don't think there's a single application of this aside from this surprise TS feature though hehe

function eh<T>(v: T): T | undefined { return v; }

Thanks for the response. While a function like this would work (and will almost certainly be optimized away by most JS engines), I'd rather accept the extra syntax noise mentioned (as ComplexType | undefined) than have the compiled JS file retain this method "wrapper" anywhere it's used.

Another syntax option (which would be less use-case specific than the as ? above) could be to allow the use of the infer keyword in type assertions as a placeholder for the otherwise-inferred type, e.g.:

const maybeValue = complexObject[maybeKey] as infer | undefined;

Currently infer declarations are only permitted in the extends clause of a conditional type (ts(1338)), so giving the infer keyword a meaning in the context of as [...] type assertions wouldn't interfere.

That could still be reasonable to enforce using infer | undefined via an eslint rule while being general enough for other cases, e.g. where the resulting type can be refined with utility types:

// Strange example, maybe from an unusual compiler originally written in JS:
if(someRegularExpression.test(maybeKey)) {
  /**
  * If we are inside this code block and this `maybeKey` exists on `partialObject`, 
  * we know its `regexSuccess` must also be defined, though this knowledge 
  * cannot be encoded using TypeScript's type system.
  */
  const maybeValue = partialObject[maybeKey] as (Required<Pick<infer, 'regexSuccess'>> & infer) | undefined;
  // ...
}
// Here we don't know if `regexSuccess` is defined on `maybeValue2`:
const maybeValue2 = partialObject[maybeKey] as infer | undefined;

function eh<T>(v: T): T | undefined {
    return v;
}

const arr = [0, 1, 2, 3, 4, 5];
const thing1 = arr[1]; // number
const thing2 = eh(arr[1]); // number | undefined

It's useless if we have to keep remembering to use a different, safer pattern anyway. The point is for TypeScript to have our back, whether we're on a very good day, tired or new to a codebase.

Our current behavior is that the type expression Array<string>[number] is interpreted as "The type that occurs when you index an array by a number". You're allowed to write

const t: Array<string>[number] = "hello";

as a roundabout way of writing const t: string.

If this flag existed, what should happen?

If Array<string>[number] is string | undefined, then you have a soundness issue on writes:

function write<T extends Array<unknown>>(arr: T, v: T[number]) {
    arr[0] = v;
}
const arr = ["a", "b", "c"];
// Would be OK
write(arr, undefined);

If Array<string>[number] is string, then you have the same issue as is being described on reads:

function read<T extends Array<unknown>>(arr: T): T[number] {
    return arr[14];
}
const arr = ["a", "b", "c"];
// Would be OK
const k: string = read(arr);

It seems overcomplex to add type syntax for both "The index read type of" and "The index write type of". Thoughts?

I should add that changing the meaning of Array<string>[number] is really problematic since it would imply that this flag changes the interpretation of declaration files. This is nerve-wracking because this seems like the kind of flag that a substantial number of people would enable, but could cause errors to occur/not-occur in things published from DefinitelyTyped, so we'd probably have to make sure that everything builds cleanly in both directions. The number of flags we can add with this behavior is obviously extremely small (we can't CI test 64 different configurations of every types package) so I think we'd have to leave Array<T>[number] as T just to avoid that problem.

@RyanCavanaugh In your example with write(arr, undefined), the call to write would be accepted, but wouldn't the assignment arr[0] = v; no longer compile?

If the write function above came from a library, and was JS compiled without the flag, then sure, it would be unsound. However, that's already an existing problem, because libraries can be compiled with whatever flags they choose. If a library was compiled without strict null checks, a function in it could claim to return string when it really returns string|undefined. But most libraries these days seem to implement null checks, otherwise people complain. Likewise, once this new flag exists, hopefully libraries will start adopting it and eventually most will have it set.
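As a rough illustration of that existing gap (findUserName is a made-up declaration, not from any real library):

// A declaration compiled without strictNullChecks already hides undefined from callers:
declare function findUserName(id: string): string;

// In our own strict code this still type-checks, even though the
// library may actually return undefined at runtime:
const userName: string = findUserName("missing-id");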

Also, while I can't find any concrete examples off the top of my head, I've certainly had to set skipLibCheck in order to get my app to compile. It's a pretty standard tsconfig option in our boilerplate whenever we create a new TS repo, because we've had so many issues with it.

Personally (and maybe selfishly), I'm ok taking a hit on edge cases, if it makes _my_ code generally safer. I still believe I would run into null pointers from accessing arrays by index more often than I would run into other edge cases, though I could be convinced otherwise.

If Array[number] is string | undefined, then you have a soundness issue on writes:

In my opinion, Array<string>[number] should always be string|undefined. That reflects reality: if I index into an array with an arbitrary number, I will get either the array's item type or undefined. You can't really be any more specific there unless you are encoding array length, as you could with tuples. Your example for writing would not type check, because you cannot assign string|undefined to an array index.

It seems overcomplex to add type syntax for both "The index read type of" and "The index write type of". Thoughts?

This seems to be exactly what should exist, as they are two different things. No array will ever have every index defined, so you always have the potential to get undefined; the type of Array<string>[number] would be string|undefined. To refer to the T in an Array<T>, a utility type could be used (naming not great): ArrayItemType<Array<string>> = string. That doesn't help with Record types, which may need something like RecordValue<Record<string, number>, string> = string.
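A rough sketch of what such utility types could look like (names and exact shapes are only illustrative, not a concrete proposal):

type ArrayItemType<A> = A extends ReadonlyArray<infer T> ? T : never;
type RecordValue<R> = R extends Record<string, infer V> ? V : never;

type Item = ArrayItemType<Array<string>>;           // string
type Value = RecordValue<Record<string, number>>;   // number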

I agree that there are no great solutions here, but I'm pretty sure I'd prefer soundness on the index reads.

I don't feel particularly strongly about the need for this on arrays, since many other languages (including "safe" ones like Rust) leave the onus for bounds checks to the user, and so I and many developers are already accustomed to doing it. Also, the syntax tends to make it quite obvious in these cases, because they _always_ use bracket notation like foo[i].

That said, I _do_ feel strongly about adding this to string-indexed objects, because it's very difficult to tell whether the signature of foo.bar is correct (due to the field bar being explicitly defined), or possibly undefined (because it is part of an index signature). If this case (which I think is pretty high-value) gets solved, then the array case likely becomes trivial and probably worth doing too.
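A small sketch of why these reads are hard to audit at the use site (Config and its fields are made up):

interface Config {
    name: string;             // explicitly declared, always present
    [key: string]: string;    // everything else comes from the index signature
}

declare const config: Config;

const a = config.name;    // string, genuinely safe
const b = config.theme;   // also typed string today, but may be undefined at runtime

Both reads look identical, so you can't tell which one depends on the index signature without going back to the type definition.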


It seems overcomplex to add type syntax for both "The index read type of" and "The index write type of". Thoughts?

The getter and setter syntax from javascript could be extended to type definitions. For example, the following syntax is fully understood by TypeScript:

const foo = {
  _bar: "",
  get bar(): string {
    return this._bar;
  },
  set bar(value: string) {
    this._bar = value;
  }
}

However, TS currently "merges" these into a single type, since it doesn't track "get" and "set" types of a field separately. This approach works for most common cases, but has its own shortcomings, such as requiring getters and setters to share the same type, and incorrectly assigning a getter type to a field that only defines a setter.

If TS were to track "get" and "set" separately for all fields (which likely has some real performance costs) it would solve these quirks, and also provide a mechanism for describing the feature described in this issue:

type foo = {
  [key: string]: string
}

would essentially become shorthand for:

type foo = {
  get [key: string](): string | undefined;
  set [key: string](value: string);
}

I should add that changing the meaning of Array[number] is really problematic since it would imply that this flag changes the interpretation of declaration files.

If generated declaration files always used the full getter+setter syntax, this would _only_ be a problem for hand-written ones. Applying the current rules to remaining shorthand syntax _in definition files only_ might be a solution here. It would certainly solve the compatibility issues, but would also add to the mental cost of reading .ts files vs .d.ts files.


Aside:

Now of course, this doesn't address _all_ the subtle caveats with getters/setters:

foo.bar = "hello"

// TS assumes that bar is now a string, which technically isn't guaranteed when
// custom setters and getters are used.
const result: string = foo.bar

This is a further edge case, and it's probably worth considering if it's within the scope of TS goals at all... but either way it could probably be solved separately, since my proposal here is consistent with TypeScript's current behavior.

I really feel like I just have to pop in every couple pages of comments and drop a link to #2521 again.

It seems overcomplex to add type syntax for both "The index read type of" and "The index write type of". Thoughts?

Nope! And at least (clicks link...) 116 others here agree with me. I would be so happy to at least have the option of typing reads and writes differently. This is just yet another great use case for that feature.

I would be _so happy_ to at least have the option of typing reads and writes differently. This is just _yet another_ great use case for that feature.

I believe the intent was to question whether there should be a way to refer to the read and write types, e.g. should there be something like Array<string>[number]= to refer to the write type?

I would be happy with the resolution of both #2521 and this issue if two things happen:

  • First, type getters and setters differently, as per #2521
  • Then, create a flag as discussed in this issue, such that the "write type" of Array<string>[number] is string while the "read type" is string | undefined.

I (mistakenly?) assume that the former would lay the groundwork for the latter; see https://github.com/microsoft/TypeScript/issues/13778#issuecomment-630770947. I don't have any particular need for an explicit syntax to define the "write type" myself.

@RyanCavanaugh quick meta-question: would it be more useful to have my rather-specific suggestion above as a separate issue for the purpose of more clearly debating and tracking its strengths/weaknesses/costs (especially as perf costs will be nontrivial)? Or would this just add to issue noise?

@RyanCavanaugh

const t: Array<string>[number] = "hello";

as a roundabout way of writing const t: string.

I guess this is the same reasoning for tuples?

const t : [string][number] = 'hello' // const t: string

however tuples are aware of boundaries and properly return undefined for more specific number types:

const t0: [string][0] = 'hello' // ok
const t1: [string][1] = 'hello' // Type '"hello"' is not assignable to type 'undefined'.

// therefore this is already true!
const t2: [string][0|1] // string | undefined 

why would type number act any differently?

It seems like you have the mechanisms in place to do this: https://github.com/microsoft/TypeScript/issues/38779

This would fulfil my needs without adding an additional flag.

In case it helps anyone, here's an ESLint plugin that flags unsafe array access and array/object destructuring. It doesn't flag safe usages like tuples and (non-record) objects.

https://github.com/danielnixon/eslint-plugin-total-functions

Cheers, but I hope it's clear this still needs to be added at the language level: it's purely within the realm of type checking, and some projects do not have a linter or the right rules.

@RyanCavanaugh

we've also tested out what this feature looks like in practice and I can tell you that it's not pretty

Just curious, can you tell us what the not-pretty part is about? Perhaps we can sort it out if you share more details.

This shouldn't be an option, it should be the default.

That is, if TypeScript's type system is to describe at compile-time the types that JavaScript values can take on at run-time:

  • indexing an Array can of course return undefined in JavaScript (index out of bounds or sparse array), so the correct read signature would be T | undefined.
  • "causing" an index in an Array to be undefined is also possible via the delete keyword.

As TypeScript 4 prevents

const a = [1, 2]
delete a[1]

there is a good case for also preventing a[1] = undefined.
This suggests that Array<T>[number] should indeed be different for reads and writes.

It might be "complex" but it would allow to model run-time possibilities in JavaScript more accurately;
Which is what TypeScript was _made for_, right?

There is no point in discussing the idea of making | undefined on index reads the default behavior -- they will never release a version of Typescript that just explodes millions of lines of legacy code due to a compiler update. A change like that would be the single most breaking change in any project I've ever followed.

I do agree with the rest of the post though.

There is no point in discussing the idea of making | undefined on index reads the default behavior -- they will _never_ release a version of Typescript that just explodes millions of lines of legacy code due to a compiler update. A change like that would be the single most breaking change in any project I've ever followed.

I do agree with the rest of the post though.

Not if it's a configuration option.

Not if it's a configuration option.

That's fair, but at the very least I would expect a long "lead in" period where it's off by default, giving people a lot of time (6 months? a year? more?) to convert legacy code before it becomes on-by-default. Like, add the flag in v4.0, set it to on-by-default in v5.0, that sort of thing.

But that's far future, at this point, because we don't even have buy-in that the prerequisite feature (different types for setter/getter) is worth doing. So I stand by the statement that there's no point discussing default behavior for a long, long time.

To your point @thw0rted I proposed a compromise here: https://github.com/microsoft/TypeScript/issues/38779

It allows this functionality in a feature that was introduced much later and still isn't as widely used, and it seems like it's "almost" there already and just needs minor changes.

For "strict" consistency, something of type Array<T> should be viewed as "a list of type T that is infinitely long with no gaps". Anytime this is not desired you should use tuples.

Yes, there are drawbacks to this, but once again I feel like it is a fair "compromise" between this feature never getting implemented and it being implemented exactly as we're asking for here.

If you agree please upvote

I disagree that this needs a whole new option/configuration.

I don't think it should be the default behavior when the strict option is disabled, but when I enable the strict option, BE STRICT and enforce this even on older projects. Enabling strict is how people opt into the strictest, most accurate typings possible. If I enable a strict option, I would like to have confidence in my code, and that's the best part about TS. But these inaccurate typings, even with strict enabled, are just a headache.

In the interests of keeping this thread readable, I've hidden a number of tangential comments that weren't meaningfully adding to the conversation. In general, you can assume that comments in the following vein have been made before in the preceding 212 comments and don't need repeating:

  • TypeScript should introduce an impractically large breaking change in checking behavior for the first time ever - no, we haven't done that before, and don't intend to do that in the future, please be more engaged with the long-term design goals of the project before coming here to advocate for unprecedented changes
  • Arguments predicated on the idea that no one has given consideration to the fact that configuration flag exists - we're aware that putting things behind a flag means they don't need to be on by default, this is not anyone's first rodeo
  • What's the hold up, just do it already! - well now that you put it that way I'm convinced 😕

There are very real practical and technical challenges that need solving here and it's disappointing that we can't do meaningful work in this thread because it's filled with people dropping in to suggest that anything behind a flag has no real consequences or that any error that can be ts-ignore'd is not an upgrade burden.

@RyanCavanaugh

thank you for your summary.

please be more engaged with the long-term design goals of the project before coming here to advocate for unprecedented changes

I've deleted my comment, and if you wish, I can delete this comment too. But how should I understand the first goal of the project:

1. Statically identify constructs that are likely to be errors.

(Source: https://github.com/Microsoft/TypeScript/wiki/TypeScript-Design-Goals )

I don't understand your quoted comment, because unchecked index access can easily lead to a runtime error. In JavaScript you expect things like that, but in TypeScript you get dangerous, false confidence. Wrong types are worse than any.

well now that you put it that way I'm convinced

Of course, you are one of the core maintainers, and thank you, the team, and the contributors for TS. But it's not just about you anymore. Compare upvotes (365) with downvotes (6). Only 6! It shows a huge demand for type safety.

But let's talk about solutions. Is there any solution the team would make happen, or can think of?

Can you please elaborate a little bit more on what's wrong with ts-ignore in this case? It can be automated with tools like codemod and doesn't change the behavior of current code. Of course it's not a beautiful workaround, but it's one possible way to introduce this change without flags.

What do you think about automatically publishing a patched version of TS, for example 4.0.0-breaking? It introduces some (maybe much) work resolving conflicts, but it allows everyone to test the changes and prepare their code bases (not only for this feature request). This could be done for a limited time period, like 3-6 months. We would be the first to use this version.

Compare upvotes (365) with downvotes (6). Only 6! It shows a huge demand for type safety.
  - @Bessonov

ha-ha.
Not that I agree with @RyanCavanaugh's whole post... but c'mon.
There are, what, millions of TS users out there? 365 want this feature enough to comment and upvote in this thread...

There is no notification from GitHub, but for everyone who wants to play with undefined index access from this issue, take a look at the draft PR above my comment.

@RyanCavanaugh thank you very much for giving us an opportunity to play with it. I ran it against a pretty small (~105 ts(x) files) code base and played with it a little bit. I didn't find any important issues. Only one line had to be changed, from:

const refHtml = useRef(useMemo(() => document.getElementsByTagName('html')[0], []))

to:

const refHtml = useRef(useMemo(() => document.getElementsByTagName('html')[0] ?? null, []))

I will try it on a medium project next week.

@lonewarrior556 it's 60 times more and it's on the first page if you sort by upvotes :)

@Bessonov I think you should try it out on the Typescript compiler codebase, it will probably cause massive breakage due to non-trivial usage of for-loops.

You could even have flags to opt in/out to provide backward compatibility.

Every flag you add is another configuration matrix that needs to be tested and supported. For that reason TypeScript wants to keep flags to a minimum.

@MatthiasKunnen If this were some small edge case feature then I'd agree that that's a viable reason not to add a flag for this. But direct array access shows up in code quite frequently, and is both a glaring hole in the type system and also a breaking change to fix, so I feel a flag would be warranted.

A flag is the wrong approach to this, in part because of the combinatorial problem. We should just call this TypeScript v5... that kind of approach turns the number of testable combinations from 2^N to just N...

Just had an unexpected undefined error bring down my application for hundreds of users, an error which would have been caught at build time if this type check were in place. TypeScript has been an amazing way for us to deliver more reliable software, but its great utility is being undermined by this single omission.

How about the next 10 people who want to comment here just test out the draft PR #39560 instead?

This is very annoying if you want to enable ESLint's @typescript-eslint/no-unnecessary-condition rule because it then complains about all the instances of

if (some_array[i] === undefined) {

It thinks it is an unnecessary condition (because Typescript says it is!) but it isn't. I don't really want to have to add // eslint-disable-next-line @typescript-eslint/no-unnecessary-condition all over my codebase.

If fixing this properly is too much work for lazy JavaScript programmers, perhaps we could have an alternative array access syntax that would add undefined, e.g.

if (some_array[?i] === undefined) {

(random syntax suggestion; alternatives welcome)

@Timmmm please read the comment just above the one you made. There's a build you can try that implements a version of this suggestion -- you could let us know there if it resolves your eslint issue.

@the0rted I did read that comment. It does not implement a version of my suggestion. Maybe you should just read it again.

Sorry, when I said "this" suggestion I meant the capability in the OP, i.e. treating array_of_T[i] as T | undefined. I do see that you're asking about a syntax to mark a specific index operator as "maybe undefined", but if you use the implementation in Ryan's PR, you wouldn't need that, because all index operators would be "maybe undefined". Wouldn't that meet your need?

Yes it would - if/when it lands I will definitely use it. But I got the impression from this discussion that a lot of people were resistant to it because it adds an extra flag and makes code that looks like it should work more complicated (the simple for loop example).

So I wanted to offer an alternative suggestion, but apparently people hate it. :-/

All well intentioned input is appreciated, don't take it the wrong way @Timmmm. I think the downvotes only convey that this is already well-trodden ground in this thread at this point.

For me, adding a flag that improves type safety across the board is much lower cost than introducing new opt-in syntax.

I didn't mention it above, but a syntax for one-off overrides already exists: if ((some_array[i] as MyType | undefined) === undefined). It's not as terse as a new shorthand but hopefully it's not a construct you'd have to use very often.

_Originally posted by @osyrisrblx in https://github.com/microsoft/TypeScript/issues/40435#issuecomment-690017567_

While this seems like a much safer option, it would be nice to opt out of this behavior with a separate syntax for the rare occasion where you're 100% sure the value is always defined.

Maybe !: could assert always defined?

interface X {
    [index: string]!: number; // -> number
}

interface Y {
    [index: string]: number; // -> number | undefined
}

That suggestion mischaracterizes the problem. The issue isn't with TypeScript's understanding of the type itself; it comes from _accessing_ the data, which is where new syntax has already been proposed earlier in this thread.

You can try this out in the TypeScript 4.1 beta, coming out soon ™ (or tonight's nightly if you need it right now! 🙂)

Sorry if this was offered before, but could {[index: string]?: number} // -> number | undefined be supported? It's consistent with interface optional properties syntax - required by default, possibly undefined with "?".

Compiler option in 4.1 is cool, but having more granular control would be good too.

If you want to try it out in your repo (took me a bit to figure out):

  1. Install typescript@next yarn (add|upgrade) typescript@next
  2. Add flag (for me in tsconfig.json) "noUncheckedIndexedAccess": true
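Once the flag is on, element reads pick up | undefined while iteration is unchanged; a minimal sketch of the effect:

// With "noUncheckedIndexedAccess": true in tsconfig.json
const xs = [1, 2, 3];

const a = xs[0];            // number | undefined
const b: number = xs[0];    // error: 'number | undefined' is not assignable to 'number'
const c = xs[0] ?? 0;       // number

for (const x of xs) {
    console.log(x);         // for...of is unaffected: x is still number
}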

In the process of enabling this rule on my project, I came across this interesting error:

type MyRecord = { a: number; b: string };

declare const myRecord: MyRecord;

declare const key: 'a' | 'b';
const value = myRecord[key]; // string | number ✅

// ❌ Unexpected error
// Type 'MyRecord[Key] | undefined' is not assignable to type 'MyRecord[Key]'
const fn = <Key extends keyof MyRecord>(key: Key): MyRecord[Key] => myRecord[key];

In this case I did not expect myRecord[key] to return type MyRecord[Key] | undefined, because the key is constrained to keyof MyRecord.

Update: filed issue https://github.com/microsoft/TypeScript/issues/40666

That's probably a bug/oversight though!

In the process of enabling this rule on my project, I came across this interesting error:

type MyRecord = { a: number; b: string };

declare const myRecord: MyRecord;

declare const key: 'a' | 'b';
const value = myRecord[key]; // string | number ✅

// ❌ Unexpected error
// Type 'MyRecord[Key] | undefined' is not assignable to type 'MyRecord[Key]'
const fn = <Key extends keyof MyRecord>(key: Key): MyRecord[Key] => myRecord[key];

In this case I did not expect myRecord[key] to return type MyRecord[Key] | undefined, because the key is constrained to keyof MyRecord.

I'd say that's a bug. Basically if keyof Type only includes string/number literal types (as opposed to actually including string or number), then Type[Key] where Key extends keyof Type should not include undefined, I'd think.

cross-linking to #13195, which also looks at differences and similarities between "there's no property here" and "there is a property here but it's undefined"

Tracking some issues related to design limitations here

  • #41612

FYI: I caught two bugs in my codebase thanks to this flag, so big thank you!
It's a bit annoying that checking arr.length > 0 is not sufficient to guard arr[0], but that's a minor inconvenience (I can rewrite the check to make tsc happy) compared to the extra safety, IMO.

@rubenlg I agree about arr.length. For just checking the first element I've rewritten our code like

const firstEl = arr[0];
if (firstEl !== undefined) {
  ...
}

But there are a few places where we do if (arr.length > 2) or more, which are a bit awkward. However, I don't think checking .length would be totally type safe anyway, since you can just modify it:

const a: number[] = [];

a.length = 1;

if (a.length > 0) {
    const b: number = a[0];
    console.log(b);
}

Prints undefined.
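For the if (arr.length > 2) cases, one possible shape is a small helper that keeps the guard in a single place (elementAt is hypothetical, not part of any library, and it conflates out-of-bounds with elements that genuinely store undefined):

function elementAt<T>(arr: readonly T[], i: number): T {
    const v = arr[i];
    if (v === undefined) {
        throw new RangeError(`index ${i} out of bounds`);
    }
    return v;
}

const nums = [1, 2, 3];
if (nums.length > 2) {
    const third = elementAt(nums, 2); // number; throws instead of silently being undefined
    console.log(third);
}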

However, I don't think checking .length would be totally type safe anyway, since you can just modify it:

const a: number[] = [];

a.length = 1;

if (a.length > 0) {
    const b: number = a[0];
    console.log(b);
}

Prints undefined.

That would essentially be a sparse array, which is out of scope for this sort of thing. There are always non-standard or sneaky things that could be done to get around the type checker.

There's always a way to invalidate the previous assumption, and that's the problem. Analyzing every possible execution path would be way too costly.

const a: number[] = [1]
if (a.length > 0) {
    a.pop();
    console.log(a[0])
}