Language-tools: Enabling the Semantic Tokens feature on large files causes the Language Server to be unusably slow

Created on 18 Jan 2021  ·  4 Comments  ·  Source: sveltejs/language-tools

Describe the bug

The VS Code Svelte plugin v104.0.0 introduced support for Semantic Tokens. By default, this is enabled:

// The default
"svelte.plugin.typescript.semanticTokens.enable": true

... However, this leads to a significant slowdown in .svelte files whose <script lang="ts"> element contains a large amount of TypeScript (in my case, 2,600 lines). The Language Server becomes effectively unresponsive, with type-checking taking over a minute and tooltips stuck on "Loading..." indefinitely.

To Reproduce

With the VS Code Svelte plugin v104.0.0 installed, and the default option of "svelte.plugin.typescript.semanticTokens.enable": true applied, create a .svelte file with, say, 2,600 lines of TypeScript code. Deliberately create a compile-time error and watch as all type-checking takes a very long time to respond.

You can update the settings to "svelte.plugin.typescript.semanticTokens.enable": false and see it instantly become responsive again. No need to restart the Language Server, nor restart VS Code, to observe the change.
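For reference, this is what the workaround looks like in a VS Code user or workspace settings file (VS Code's settings.json accepts comments):

```json
{
  // Workaround: disable Semantic Tokens for the Svelte TypeScript plugin
  "svelte.plugin.typescript.semanticTokens.enable": false
}
```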

Expected behavior

.svelte files containing TypeScript scripts should remain responsive no matter how large the scripts get, and no extra configuration by the developer should be required to maintain usable performance.

System (please complete the following information):

  • OS: macOS 10.15.7
  • IDE: VS Code 1.52.1
  • Plugin/Package: "Svelte for VSCode"

Additional context

As discussed with @dummdidumm on the Svelte Discord in #language-tools. For the attention of @jasonlyu123.

From @dummdidumm on Discord:

dummdidumm: Yeah, Semantic Tokens can be slow on large files, especially since we need to do many sourcemap-mappings
dummdidumm: @jasonlyu09 maybe we should add some limits for large files/ranges, essentially not making the request if the requested range to analyze becomes too big. I think VS Code does something similar.

Fixed bug

All 4 comments

Yeah, VS Code's TypeScript extension capped it at 100,000. But I found that still very slow for us, maybe because of the source mapping; I think I would cap it at 50,000.
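The kind of cap discussed here can be sketched as a simple length check before doing any semantic-tokens work. This is an illustrative sketch, not the actual language-tools implementation; the function name and the 50,000 threshold are assumptions taken from the numbers mentioned in this thread.

```typescript
// Hypothetical size gate for semantic-tokens requests. If the document's
// text exceeds the cap, the server skips tokenization entirely rather
// than spending time on source mapping for a huge file.
const SEMANTIC_TOKENS_CHAR_CAP = 50_000;

function shouldComputeSemanticTokens(
  text: string,
  cap: number = SEMANTIC_TOKENS_CHAR_CAP
): boolean {
  // Cap is measured in characters, matching the discussion below.
  return text.length <= cap;
}
```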


@jasonlyu123 100,000 of what? My 2,600 line file has 128,292 characters, so if it's supposed to disable Semantic Tokens for files with more than 100,000 characters, it didn't seem to do so at all.

Though if it's about the length of some generated file (your PR mentions TSX, so maybe it is), rather than the source itself, I don't know what length my generated files are.

Characters. If a js/ts file exceeds that character count, the TypeScript extension won't process semantic tokens for the full file. I tried that number in the Svelte extension in debug mode and it still seems to be too slow.

There are multiple types of semantic token requests that a language server can implement. We currently implement both range and full-file, like the TypeScript extension. From my understanding, VS Code will only request ranged results when the full-file result is not available. The range request is why you'll see a sudden color change when scrolling down the document.
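The two request shapes described above can be sketched as follows. This is a simplified illustration, not the real server code: the helper names, the dummy per-line tokenizer, and the "return null above the cap" convention are all assumptions made for the example.

```typescript
interface LineRange { startLine: number; endLine: number }

// Stand-in tokenizer: emits one dummy token tuple per non-empty line.
// (Real semantic tokens use relative-delta encoding; ignored here.)
function tokenizeLine(line: string, lineNo: number): number[] {
  return line.length ? [lineNo, 0, line.length, 0, 0] : [];
}

// "Full" request: tokenize the whole document, but bail out above a
// character cap so huge files don't freeze the server. Returning null
// models the client falling back to range requests.
function fullTokens(lines: string[], cap: number): number[][] | null {
  const totalChars = lines.reduce((n, l) => n + l.length, 0);
  if (totalChars > cap) return null;
  return lines.map((line, i) => tokenizeLine(line, i));
}

// "Range" request: only tokenize the visible slice, which is why colors
// can pop in as you scroll when the full-file result is unavailable.
function rangeTokens(lines: string[], range: LineRange): number[][] {
  return lines
    .slice(range.startLine, range.endLine + 1)
    .map((line, i) => tokenizeLine(line, range.startLine + i));
}
```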

It's now capped. Let us know whether it's usable again with semantic tokens turned on.
