Announcing TypeScript 4.7 Beta

Daniel Rosenwasser

Today we are excited to announce the beta release of TypeScript 4.7!

To get started using the beta, you can use npm with the following command:

npm install typescript@beta

You can also get editor support for the beta in editors like Visual Studio and Visual Studio Code.

Here’s a quick list of what’s new in TypeScript 4.7!

ECMAScript Module Support in Node.js

For the last few years, Node.js has been working to support ECMAScript modules (ESM). This has been a very difficult feature, since the Node.js ecosystem is built on a different module system called CommonJS (CJS). Interoperating between the two brings large challenges, with many new features to juggle; however, support for ESM in Node.js was largely implemented in Node.js 12 and later. Around TypeScript 4.5 we rolled out nightly-only support for ESM in Node.js to get some feedback from users and let library authors ready themselves for broader support.

TypeScript 4.7 adds this functionality with two new module settings: node12 and nodenext.

{
    "compilerOptions": {
        "module": "nodenext",
    }
}

These new modes bring a few high-level features which we’ll explore here.

type in package.json and New Extensions

Node.js supports a new setting in package.json called type. "type" can be set to either "module" or "commonjs".

{
    "name": "my-package",
    "type": "module",

    "//": "...",
    "dependencies": {
    }
}

This setting controls whether .js files are interpreted as ES modules or CommonJS modules, and defaults to CommonJS when not set. When a file is considered an ES module, a few different rules come into play compared to CommonJS:

  • import/export statements (and top-level await in nodenext) can be used.
  • Relative import paths need full extensions (we have to write import "./foo.js" instead of import "./foo").
  • Imports might resolve differently from dependencies in node_modules.
  • Certain global-like values like require() and process cannot be used directly.
  • CommonJS modules get imported under certain special rules.

We’ll come back to some of these.
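
For example, the point above about require() means that an ES module which genuinely needs it has to create one itself. Here’s a minimal sketch using Node.js’s createRequire (the file and the CommonJS helper it loads are hypothetical):

// ./needs-require.mts - a hypothetical ES module
import { createRequire } from "module";

// `require` is not available as a global in ES modules, so we build one
// scoped to this file's URL.
const require = createRequire(import.meta.url);

// A hypothetical CommonJS-only helper, loaded the old way.
const legacy = require("./legacy-helper.cjs");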

To overlay the way TypeScript works in this system, .ts and .tsx files now work the same way. When TypeScript finds a .ts, .tsx, .js, or .jsx file, it will walk up looking for a package.json to see whether that file is an ES module, and use that to determine:

  • how to find other modules which that file imports
  • and how to transform that file if producing outputs

When a .ts file is compiled as an ES module, ECMAScript import/export statements are left alone in the .js output; when it’s compiled as a CommonJS module, it will produce the same output you get today under --module commonjs.

This also means paths resolve differently between .ts files that are ES modules and ones that are CJS modules. For example, let’s say you have the following code today:

// ./foo.ts
export function helper() {
    // ...
}

// ./bar.ts
import { helper } from "./foo"; // only works in CJS

helper();

This code works in CommonJS modules, but will fail in ES modules because relative import paths need to use extensions. As a result, it will have to be rewritten to use the extension of the output of foo.ts – so bar.ts will instead have to import from ./foo.js.

// ./bar.ts
import { helper } from "./foo.js"; // works in ESM & CJS

helper();

This might feel a bit cumbersome at first, but TypeScript tooling like auto-imports and path completion will typically just do this for you.

One other thing to mention is the fact that this applies to .d.ts files too. When TypeScript finds a .d.ts file in a package, that file is interpreted based on the containing package.

New File Extensions

The type field in package.json is nice because it allows us to continue using the .ts and .js file extensions which can be convenient; however, you will occasionally need to write a file that differs from what type specifies. You might also just prefer to always be explicit.

Node.js supports two extensions to help with this: .mjs and .cjs. .mjs files are always ES modules, and .cjs files are always CommonJS modules, and there’s no way to override these.

In turn, TypeScript supports two new source file extensions: .mts and .cts. When TypeScript emits these to JavaScript files, it will emit them to .mjs and .cjs respectively.

Furthermore, TypeScript also supports two new declaration file extensions: .d.mts and .d.cts. When TypeScript generates declaration files for .mts and .cts, their corresponding extensions will be .d.mts and .d.cts.
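
As a quick sketch of how these pieces fit together, consider a hypothetical ./math.mts compiled under --module nodenext with --declaration enabled:

// ./math.mts (hypothetical) - always treated as an ES module
export function add(a: number, b: number): number {
    return a + b;
}

// Compiling this file produces:
//   ./math.mjs   - the JavaScript output, with `export` left intact
//   ./math.d.mts - the corresponding declaration file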

Using these extensions is entirely optional, but will often be useful even if you choose not to use them as part of your primary workflow.

CommonJS Interop

Node.js allows ES modules to import CommonJS modules as if they were ES modules with a default export.

// ./foo.cts
export function helper() {
    console.log("hello world!");
}

// ./bar.mts
import foo from "./foo.cjs";

// prints "hello world!"
foo.helper();

In some cases, Node.js also synthesizes named exports from CommonJS modules, which can be more convenient. In these cases, ES modules can use a "namespace-style" import (i.e. import * as foo from "..."), or named imports (i.e. import { helper } from "...").

// ./foo.cts
export function helper() {
    console.log("hello world!");
}

// ./bar.mts
import { helper } from "./foo.cjs";

// prints "hello world!"
helper();

There isn’t always a way for TypeScript to know whether these named imports will be synthesized, but TypeScript will err on the side of being permissive and use some heuristics when importing from a file that is definitely a CommonJS module.

One TypeScript-specific note about interop is the following syntax:

import foo = require("foo");

In a CommonJS module, this just boils down to a require() call, and in an ES module, this imports createRequire to achieve the same thing. This will make code less portable on runtimes like the browser (which don’t support require()), but will often be useful for interoperability. In turn, you can write the above example using this syntax as follows:

// ./foo.cts
export function helper() {
    console.log("hello world!");
}

// ./bar.mts
import foo = require("./foo.cjs");

foo.helper()
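
To make the trade-off concrete, the ES module output for bar.mts looks roughly like the following sketch (the exact identifier names in TypeScript’s emit may differ):

// Rough sketch of the emitted ./bar.mjs
import { createRequire as _createRequire } from "module";
const __require = _createRequire(import.meta.url);
const foo = __require("./foo.cjs");
foo.helper();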

Finally, it’s worth noting that the only way to import ESM files from a CJS module is using dynamic import() calls. This can present challenges, but is the behavior in Node.js today.
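
For example, a CommonJS module that needs something from an ES module has to load it asynchronously. Here’s a minimal sketch, where both the .cts file and the ES module it loads are hypothetical:

// ./consumer.cts - a hypothetical CommonJS module
async function loadHelper() {
    // require() cannot load ES modules, so we use a dynamic import(),
    // which returns a Promise of the module's namespace object.
    const esm = await import("./esm-helper.mjs");
    esm.helper();
}

loadHelper();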

You can read more about ESM/CommonJS interop in Node.js here.

package.json Exports, Imports, and Self-Referencing

Node.js supports a new field for defining entry points in package.json called "exports". This field is a more powerful alternative to defining "main" in package.json, and can control what parts of your package are exposed to consumers.

Here’s a package.json that supports separate entry-points for CommonJS and ESM:

// package.json
{
    "name": "my-package",
    "type": "module",
    "exports": {
        ".": {
            // Entry-point for `import "my-package"` in ESM
            "import": "./esm/index.js",

            // Entry-point for `require("my-package")` in CJS
            "require": "./commonjs/index.cjs",
        },
    },

    // CJS fall-back for older versions of Node.js
    "main": "./commonjs/index.cjs",
}

There’s a lot to this feature, which you can read more about on the Node.js documentation. Here we’ll try to focus on how TypeScript supports it.

With TypeScript’s original Node support, it would look for a "main" field, and then look for declaration files that corresponded to that entry. For example, if "main" pointed to ./lib/index.js, TypeScript would look for a file called ./lib/index.d.ts. A package author could override this by specifying a separate field called "types" (e.g. "types": "./types/index.d.ts").

The new support works similarly with import conditions. By default, TypeScript overlays the same rules with import conditions – if you write an import from an ES module, it will look up the import field, and from a CommonJS module, it will look at the require field. If it finds them, it will look for a corresponding declaration file. If you need to point to a different location for your type declarations, you can add a "types" import condition.

// package.json
{
    "name": "my-package",
    "type": "module",
    "exports": {
        ".": {
            // Entry-point for `import "my-package"` in ESM
            "import": {
                // Where TypeScript will look.
                "types": "./types/esm/index.d.ts",

                // Where Node.js will look.
                "default": "./esm/index.js"
            },
            // Entry-point for `require("my-package")` in CJS
            "require": {
                // Where TypeScript will look.
                "types": "./types/commonjs/index.d.cts",

                // Where Node.js will look.
                "default": "./commonjs/index.cjs"
            },
        }
    },

    // Fall-back for older versions of TypeScript
    "types": "./types/index.d.ts",

    // CJS fall-back for older versions of Node.js
    "main": "./commonjs/index.cjs"
}

TypeScript also supports the "imports" field of package.json in a similar manner (looking for declaration files alongside corresponding files), and it supports packages referencing themselves by name. These features are generally not as involved, but are supported.
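
As a rough sketch of what the "imports" field looks like, a package can define private aliases that begin with #, and TypeScript will look for declaration files next to whatever those aliases resolve to (the alias and paths below are hypothetical):

// package.json
{
    "name": "my-package",
    "type": "module",
    "imports": {
        "#internal/*": "./dist/internal/*.js"
    }
}

// ./src/app.mts
// Resolves through the "imports" map to ./dist/internal/helpers.js;
// TypeScript looks for ./dist/internal/helpers.d.ts alongside it.
import { helper } from "#internal/helpers";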

Your Feedback Wanted!

As we continue working on TypeScript 4.7, we expect to see more documentation and polish go into this functionality. Supporting these new features has been an ambitious undertaking, and that’s why we’re looking for early feedback on it! Please try it out and let us know how it works for you.

For more information, you can see the implementing PR here.

Control over Module Detection

One issue with the introduction of modules to JavaScript was the ambiguity between existing "script" code and the new module code. JavaScript code in a module runs slightly differently and has different scoping rules, so tools have to decide how each file runs. For example, Node.js requires module entry-points to be written in an .mjs file, or to have a nearby package.json with "type": "module". TypeScript treats a file as a module whenever it finds any import or export statement, but otherwise assumes a .ts or .js file is a script file acting on the global scope.

This doesn’t quite match up with the behavior of Node.js, where the package.json can change the format of a file, or with the --jsx setting react-jsx, where any JSX file contains an implicit import of a JSX factory. It also doesn’t match modern expectations, where most new TypeScript code is written with modules in mind.

That’s why TypeScript 4.7 introduces a new option called moduleDetection. moduleDetection can take on 3 values: "auto" (the default), "legacy" (the same behavior as 4.6 and prior), and "force".

Under the mode "auto", TypeScript will not only look for import and export statements, but it will also check whether

  • the "type" field in package.json is set to "module" when running under --module nodenext/--module node12, and
  • check whether the current file is a JSX file when running under --jsx react-jsx

In cases where you want every file to be treated as a module, the "force" setting ensures that every non-declaration file is treated as a module. This will be true regardless of how module, moduleResolution, and jsx are configured.

Meanwhile, the "legacy" option simply goes back to the old behavior of only seeking out import and export statements to determine whether a file is a module.

You can read up more about this change on the pull request.

Control-Flow Analysis for Computed Properties

TypeScript 4.7 now analyzes the type of computed properties and narrows them correctly. For example, take the following code:

const key = Symbol();

const numberOrString = Math.random() < 0.5 ? 42 : "hello";

let obj = {
    [key]: numberOrString,
};

if (typeof obj[key] === "string") {
    let str = obj[key].toUpperCase();
}

Previously, TypeScript would not consider any type guards on obj[key], and would have no idea that obj[key] was really a string. Instead, it would think that obj[key] was still a string | number and accessing toUpperCase() would trigger an error.

TypeScript 4.7 now knows that obj[key] is a string.

This also means that under --strictPropertyInitialization, TypeScript can correctly check that computed properties are initialized by the end of a constructor body.

const key = Symbol();

class C {
    [key]: string;

    constructor(str: string) {
        // oops, forgot to set this[key]
    }

    screamString() {
        return this[key].toUpperCase();
    }
}

Under TypeScript 4.7, --strictPropertyInitialization reports an error telling us that the [key] property wasn’t definitely assigned by the end of the constructor.

We’d like to extend our gratitude to Oleksandr Tarasiuk who provided this change!

Improved Function Inference in Objects and Methods

TypeScript 4.7 can now perform more granular inferences from functions within objects and arrays. This allows the types of these functions to consistently flow in a left-to-right manner just like for plain arguments.

declare function f<T>(arg: {
    produce: (n: string) => T,
    consume: (x: T) => void }
): void;

// Works
f({
    produce: () => "hello",
    consume: x => x.toLowerCase()
});

// Works
f({
    produce: (n: string) => n,
    consume: x => x.toLowerCase(),
});

// Was an error, now works.
f({
    produce: n => n,
    consume: x => x.toLowerCase(),
});

// Was an error, now works.
f({
    produce: function () { return "hello"; },
    consume: x => x.toLowerCase(),
});

// Was an error, now works.
f({
    produce() { return "hello" },
    consume: x => x.toLowerCase(),
});

Inference failed in some of these examples because determining the type of their produce functions would indirectly request the type of arg before finding a good type for T. TypeScript now gathers functions that could contribute to the inferred type of T and infers from them lazily.

For more information, you can take a look at the specific modifications to our inference process.

Instantiation Expressions

Occasionally functions can be a bit more general than we want. For example, let’s say we had a makeBox function.

interface Box<T> {
    value: T;
}

function makeBox<T>(value: T) {
    return { value };
}

Maybe we want to create a more specialized set of functions for making Boxes of Wrenches and Hammers. To do that today, we’d have to wrap makeBox in other functions, or use an explicit type for an alias of makeBox.

function makeHammerBox(hammer: Hammer) {
    return makeBox(hammer);
}

// or...

const makeWrenchBox: (wrench: Wrench) => Box<Wrench> = makeBox;

These work, but wrapping a call to makeBox is a bit wasteful, and writing the full signature of makeWrenchBox could get unwieldy. Ideally, we would be able to say that we just want to alias makeBox while replacing all of the generics in its signature.

TypeScript 4.7 allows exactly that! We can now take functions and constructors and feed them type arguments directly.

const makeHammerBox = makeBox<Hammer>;
const makeWrenchBox = makeBox<Wrench>;

So with this, we can specialize makeBox to accept more specific types and reject anything else.

const makeStringBox = makeBox<string>;

// TypeScript correctly rejects this.
makeStringBox(42);

This logic also works for constructor functions such as Array, Map, and Set.

// Has type `new () => Map<string, Error>`
const ErrorMap = Map<string, Error>;

// Has type `Map<string, Error>`
const errorMap = new ErrorMap();

When a function or constructor is given type arguments, it will produce a new type that keeps all signatures with compatible type parameter lists, and replaces the corresponding type parameters with the given type arguments. Any other signatures are dropped, as TypeScript will assume that they aren’t meant to be used.
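
As a sketch of that signature filtering (the overloaded format function below is hypothetical, not part of any standard library):

interface Labeled<T> {
    label: string;
    value: T;
}

declare function format(value: string): string;                    // no type parameters
declare function format<T>(value: T, extra?: Labeled<T>): string;  // one type parameter

// Only the second signature has a type parameter list compatible with one
// type argument, so the first is dropped. `formatNumber` has the type
// `(value: number, extra?: Labeled<number>) => string`.
const formatNumber = format<number>;

formatNumber(42);          // works
// formatNumber("hello");  // error: 'string' is not assignable to 'number'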

For more information on this feature, check out the pull request.

extends Constraints on infer Type Variables

Conditional types are a bit of a power-user feature. They allow us to match and infer against the shape of types, and make decisions based on them. For example, we can write a conditional type that returns the first element of a tuple type if it’s a string-like type.

type FirstString<T> =
    T extends [infer S, ...unknown[]]
        ? S extends string ? S : never
        : never;

 // string
type A = FirstString<[string, number, number]>;

// "hello"
type B = FirstString<["hello", number, number]>;

// "hello" | "world"
type C = FirstString<["hello" | "world", boolean]>;

// never
type D = FirstString<[boolean, number, number]>;

FirstString matches against any tuple with at least one element and grabs the type of the first element as S. Then it checks if S is compatible with string and returns that type if it is.

Note that we had to use two conditional types to write this. We could have written FirstString as follows:

type FirstString<T> =
    T extends [string, ...unknown[]]
        // Grab the first type out of `T`
        ? T[0]
        : never;

This works, but it’s slightly more "manual" and less declarative. Instead of just pattern-matching on the type and giving the first element a name, we have to fetch out the 0th element of T with T[0]. If we were dealing with types more complex than tuples, this could get a lot trickier, so conditionals can simplify things.

Using nested conditionals to infer a type and then match against that inferred type is pretty common. To avoid that second level of nesting, TypeScript 4.7 now allows you to place a constraint on any infer type.

type FirstString<T> =
    T extends [infer S extends string, ...unknown[]]
        ? S
        : never;

This way, when TypeScript matches against S, it also ensures that S has to be a string. If S isn’t a string, it takes the false path, which in this case is never.

For more details, you can read up on the change on GitHub.

Optional Variance Annotations for Type Parameters

Let’s take the following types.

interface Animal {
    animalStuff: any;
}

interface Dog extends Animal {
    dogStuff: any;
}

// ...

type Getter<T> = () => T;

type Setter<T> = (value: T) => void;

Imagine we had two different instances of Getters. Figuring out whether any two different Getters are substitutable for one another depends entirely on T. In the case of whether an assignment of Getter<Dog> → Getter<Animal> is valid, we could just check whether Dog → Animal is valid. Because each type for T just gets related in the same "direction", we say that Getter is covariant on T. On the other hand, checking whether Setter<Dog> → Setter<Animal> is valid involves checking whether Animal → Dog is valid. That "flip" in direction is kind of like how in math, checking whether −x < −y is the same as checking whether y < x. We say that Setter is contravariant on T when we have to flip directions.
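
As a quick sketch of what that substitutability looks like in practice (reusing the Animal, Dog, Getter, and Setter declarations above):

declare let getAnimal: Getter<Animal>;
declare let getDog: Getter<Dog>;

// Covariance: a Getter<Dog> can stand in for a Getter<Animal>...
getAnimal = getDog;
// ...but not the other way around.
// getDog = getAnimal; // error

declare let setAnimal: Setter<Animal>;
declare let setDog: Setter<Dog>;

// Contravariance: a Setter<Animal> can stand in for a Setter<Dog>...
setDog = setAnimal;
// ...but not the other way around (under --strictFunctionTypes).
// setAnimal = setDog; // error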

With TypeScript 4.7, we’re now able to explicitly specify variance on type parameters.

So now, if we want to make it explicit that Getter is covariant on T, we can now give it an out modifier.

type Getter<out T> = () => T;

And similarly, if we also want to make it explicit that Setter is contravariant on T, we can give it an in modifier.

type Setter<in T> = (value: T) => void;

out and in are used here because a type parameter’s variance depends on whether it’s used in an output position or an input position. Instead of thinking about variance, you can just think about whether T is used in output positions, input positions, or both.

There are also cases for using both in and out.

interface State<in out T> {
    get: () => T;
    set: (value: T) => void;
}

When a T is used in both an output and input position, it becomes invariant. Two different State<T>s can’t be interchanged unless their Ts are the same. In other words, State<Dog> and State<Animal> aren’t substitutable for each other.

Now technically speaking, in a purely structural type system, type parameters and their variance don’t really matter – you can just plug in types in place of each type parameter and check whether each matching member is structurally compatible. So if TypeScript uses a structural type system, why are we interested in the variance of type parameters? And why might we ever want to annotate them?

One reason is that it can be useful for a reader to see at a glance how a type parameter is used. For much more complex types, it can be difficult to tell whether a type is meant to be read, written, or both. TypeScript will also help us out if we forget to mention how that type parameter is used. As an example, if we forgot to specify both in and out on State, we’d get an error.

interface State<out T> {
    //          ~~~~~
    // error!
    // Type 'State<sub-T>' is not assignable to type 'State<super-T>' as implied by variance annotation.
    //   Types of property 'set' are incompatible.
    //     Type '(value: sub-T) => void' is not assignable to type '(value: super-T) => void'.
    //       Types of parameters 'value' and 'value' are incompatible.
    //         Type 'super-T' is not assignable to type 'sub-T'.
    get: () => T;
    set: (value: T) => void;
}

Another reason is precision and speed! TypeScript already tries to infer the variance of type parameters as an optimization. By doing this, it can type-check larger structural types in a reasonable amount of time. Calculating variance ahead of time allows the type-checker to skip deeper comparisons and just compare type arguments which can be much faster than comparing the full structure of a type over and over again. But often there are cases where this calculation is still fairly expensive, and the calculation may find circularities that can’t be accurately resolved, meaning there’s no clear answer for the variance of a type.

type Foo<T> = {
    x: T;
    f: Bar<T>;
}

type Bar<U> = (x: Baz<U[]>) => void;

type Baz<V> = {
    value: Foo<V[]>;
}

declare let foo1: Foo<unknown>;
declare let foo2: Foo<string>;

foo1 = foo2;  // Should be an error but isn't ❌
foo2 = foo1;  // Error - correct ✅

Providing an explicit annotation can speed up type-checking at these circularities and provide better accuracy. For instance, marking T as invariant in the above example can help stop the problematic assignment.

- type Foo<T> = {
+ type Foo<in out T> = {
      x: T;
      f: Bar<T>;
  }

We don’t necessarily recommend annotating every type parameter with its variance; for example, it’s possible (but not recommended) to make variance a little stricter than is necessary, so TypeScript won’t stop you from marking something as invariant if it’s really just covariant, contravariant, or even independent. So if you do choose to add explicit variance markers, we would encourage thoughtful and precise use of them.

But if you’re working with deeply recursive types, especially if you’re a library author, you may be interested in using these annotations to the benefit of your users, providing wins in both accuracy and type-checking speed, which can even affect their code editing experience. Whether variance calculation is a bottleneck on type-checking time can be determined experimentally using tooling like our analyze-trace utility.

For more details on this feature, you can read up on the pull request.

typeof on #private Fields

TypeScript 4.7 now allows us to perform typeof queries on private fields.

class Container {
    #data = "hello!";

    get data(): typeof this.#data {
        return this.#data;
    }

    set data(value: typeof this.#data) {
        this.#data = value;
    }
}

This change was provided courtesy of Oleksandr Tarasiuk.

Resolution Customization with moduleSuffixes

TypeScript 4.7 now supports a moduleSuffixes option to customize how module specifiers are looked up.

{
    "compilerOptions": {
        "moduleSuffixes": [".ios", ".native", ""]
    }
}

Given the above configuration, an import like the following…

import * as foo from "./foo";

will try to look at the relative files ./foo.ios.ts, ./foo.native.ts, and finally ./foo.ts.

This feature can be useful for React Native projects where each target platform can use a separate tsconfig.json with differing moduleSuffixes.
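
As a sketch of what that might look like, each platform could get its own tsconfig.json extending a shared base configuration (file names here are hypothetical):

// tsconfig.ios.json
{
    "extends": "./tsconfig.base.json",
    "compilerOptions": {
        "moduleSuffixes": [".ios", ".native", ""]
    }
}

// tsconfig.android.json
{
    "extends": "./tsconfig.base.json",
    "compilerOptions": {
        "moduleSuffixes": [".android", ".native", ""]
    }
}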

The moduleSuffixes option was contributed thanks to Adam Foxman!

resolution-mode

With Node’s ECMAScript resolution, the mode of the containing file and the syntax you use determine how imports are resolved; however, it can be useful to reference the types of a CommonJS module from an ECMAScript module, or vice-versa.

TypeScript now allows /// <reference types="..." /> directives and import type statements to specify a resolution strategy.

import type can specify an import assertion to achieve this.

// Resolve `pkg` as if we were importing with a `require()`
import type { TypeFromRequire } from "pkg" assert {
    "resolution-mode": "require"
};

// Resolve `pkg` as if we were importing with an `import`
import type { TypeFromImport } from "pkg" assert {
    "resolution-mode": "import"
};

export interface MergedType extends TypeFromRequire, TypeFromImport {}

These import assertions can also be used on import() types.

export type TypeFromRequire =
    import("pkg", { assert: { "resolution-mode": "require" } }).TypeFromRequire;

export type TypeFromImport =
    import("pkg", { assert: { "resolution-mode": "import" } }).TypeFromImport;

export interface MergedType extends TypeFromRequire, TypeFromImport {}

Similarly, we can use resolution-mode on a /// <reference types="..." /> directive.

/// <reference types="pkg" resolution-mode="require" />

// or

/// <reference types="pkg" resolution-mode="import" />

You can see the respective changes for reference directives and for type import assertions.

Groups-Aware Organize Imports

TypeScript has an Organize Imports editor feature for both JavaScript and TypeScript. Unfortunately, it could be a bit of a blunt instrument, and would often naively sort your import statements.

For instance, if you ran Organize Imports on the following file…

// local code
import * as bbb from "./bbb";
import * as ccc from "./ccc";
import * as aaa from "./aaa";

// built-ins
import * as path from "path";
import * as child_process from "child_process"
import * as fs from "fs";

// some code...

You would get something like the following:

// local code
import * as child_process from "child_process";
import * as fs from "fs";
// built-ins
import * as path from "path";
import * as aaa from "./aaa";
import * as bbb from "./bbb";
import * as ccc from "./ccc";


// some code...

This is… not ideal. Sure, our imports are sorted by their paths, and our comments and newlines are preserved, but not in a predictable way. Much of the time, if we have our imports grouped in a specific way, then we want to keep them that way.

TypeScript 4.7 performs Organize Imports in a group-aware manner. Running it on the above code looks a little bit more like what you’d expect:

// local code
import * as aaa from "./aaa";
import * as bbb from "./bbb";
import * as ccc from "./ccc";

// built-ins
import * as child_process from "child_process";
import * as fs from "fs";
import * as path from "path";

// some code...

We’d like to extend our thanks to Minh Quy who provided this feature.

Object Method Snippet Completions

TypeScript now provides snippet completions for object literal methods. When completing members in an object, TypeScript will provide a typical completion entry for just the name of a method, along with a separate completion entry for the full method definition!

(Animation: completing a full method signature from an object literal.)
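
As a rough sketch of what that completion inserts (the Handler interface below is hypothetical):

interface Handler {
    onClick(event: MouseEvent): void;
}

const handler: Handler = {
    // Accepting the full-definition completion for `onClick` inserts the
    // method name, its parameter list, and an empty body to fill in.
    onClick(event) {
    },
};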

For more details, see the implementing pull request.

Breaking Changes

lib.d.ts Updates

While TypeScript strives to avoid major breaks, even small changes in the built-in libraries can cause issues. We don’t expect major breaks as a result of DOM and lib.d.ts updates, but there may be some small ones.

Type Parameters No Longer Assignable to {} in strictNullChecks

Originally, the constraint of all type parameters in TypeScript was {} (the empty object type). Eventually the constraint was changed to unknown which also permits null and undefined. Outside of strictNullChecks, these types are interchangeable, but within strictNullChecks, unknown is not assignable to {}.

In TypeScript 4.7, under strictNullChecks, the type-checker disables a type safety hole that was maintained for backwards-compatibility, where type parameters were considered to always be assignable to {} and object.

function foo<T>(x: T) {
    const a: {} = x;
    //    ~
    // Type 'T' is not assignable to type '{}'.

    const b: object = x;
    //    ~
    // Type 'T' is not assignable to type 'object'.
}

In such cases, you may need a type assertion on x, or a constraint of {} on T.

function foo<T extends {}>(x: T) {
    // Works
    const a: {} = x;

    // Works
    const b: object = x;
}

This behavior can come up in calls to functions like Object.keys:

function keysEqual<T>(x: T, y: T) {
    const xKeys = Object.keys(x);
    const yKeys = Object.keys(y);
    if (xKeys.length !== yKeys.length) return false;
    for (let i = 0; i < xKeys.length; i++) {
        if (xKeys[i] !== yKeys[i]) return false;
    }
    return true;
}

For the above, you might see an error message that looks like this:

No overload matches this call.
  Overload 1 of 2, '(o: {}): string[]', gave the following error.
    Argument of type 'T' is not assignable to parameter of type '{}'.
  Overload 2 of 2, '(o: object): string[]', gave the following error.
    Argument of type 'T' is not assignable to parameter of type 'object'.

For more information, take a look at the breaking PR here.

readFile Method is No Longer Optional on LanguageServiceHost

If you’re creating LanguageService instances, any LanguageServiceHost you provide will now need to implement a readFile method. This change was necessary to support the new moduleDetection compiler option.

You can read more on the change here.

readonly Tuples Have a readonly length Property

A readonly tuple will now treat its length property as readonly. This was almost never witnessable for fixed-length tuples, but was an oversight which could be observed for tuples with trailing optional and rest element types.

As a result, the following code will now fail:

function overwriteLength(tuple: readonly [string, string, string]) {
    // Now errors.
    tuple.length = 7;
}

You can read more on this change here.

What’s Next?

In the coming weeks, we’ll be polishing TypeScript 4.7 to get it ready for a Release Candidate, making sure that everything works just as you would expect. If you’d like to plan around our release schedule, our target release dates are available on the TypeScript 4.7 iteration plan.

We hope that this release brings some exciting and helpful features to make coding a joy.

Happy Hacking!

– Daniel Rosenwasser and the TypeScript Team

7 comments


  • Amit Beckenstein

    Why is `resolution-mode` necessary when using `type` keyword to import the types? It doesn’t emit actual `import` or `require` statements, hence the module type doesn’t matter. What am I missing?

  • Xavier Lefebvre

    While I’m always glad to see the new features bring to Typescript language, I’m kind of worried by the evolutions on control-flow and inference systems always getting “smarter and smarter”. I have a project that already takes more than 2 minutes to build in full and more than 20 seconds in incremental and I’m worried that it will take more and more time upon each Typescript release because of this kind of type-checking evolutions.

    But maybe the variance annotation system will save me some compiling time.

    Either way, I would really like Typescript to evolve their build engine to be as fast as some alternatives (esbuild, swc, etc.). It’s really the last missing brick for me to make Typescript really enjoyable to use on a daily basis.

  • Omri Luzon

    The

    -x < -y

    part was very confusing because in some resolutions the line breaks after the first `-`, I had to read like 4 times until I noticed it.

    • Daniel Rosenwasser (Microsoft)

      I’ll try to fix this in the RC.

  • Ali Altun

    I’m afraid that as new features (which make the language more complex) are added, a journey from the most loved to the most hated will take place.

  • Ivan Nikulin

    Can I suggest to change this example

    // never
    type D = FirstString<[boolean, number, number]>;
    

    to this

    // never
    type D = FirstString<[boolean, string, number]>;
    

    I didn’t read through the function definition thoroughly at first and had the false impression that it would detect any string argument from the tuple, instead of only the first one. This example would’ve made me double-check a bit earlier 🙂 Thanks!

    • Daniel Rosenwasser (Microsoft)

      Yes, I’ll do that for the RC post, and I’ll rename it to `FirstIfString`. Thank you!
