Terminal: Design Clickable Links & Link preview feature/extension

Created on 8 May 2019  ·  30 Comments  ·  Source: microsoft/terminal

The marketing video for Terminal included a part where they show links previewing on hover. Is this functionality currently available?

Area-Extensibility Issue-Feature Product-Terminal

Most helpful comment

I'm writing to respectfully request that this start small with eyes to the larger feature request. Getting http(s):// links to open in the default browser, when CTRL+clicking them, would be a great start! So many command line tools spit out web links that right now must be copied and pasted into a browser. If we only had that functionality, that could be enough for quite a while, in my opinion. Thank you for your consideration!

All 30 comments

Nope.

The marketing promo is more of a "what's to come" than a "what's available now". We appreciate your patience as we continue to work on this project :)

Also for linking, #555

@tackyunicorn - the terminal video is aspirational - its goal is to illustrate some of the key features we're working hard to ship in the Terminal. Some of the features illustrated are already in the Terminal. Most will ship in v1.0 by the end of 2019. Others may require more design, build & bake time, and will ship after v1.0.

Currently, the extensibility host and extensions like "Link Preview" are not yet designed or implemented, and are likely going to ship after v1.0. We want to spend the time & effort to make sure we build such crucial features "right" to minimize disruption later.

We're working on our future roadmap and will share with the community in the next couple of weeks.

Summary of the new feature/enhancement

Detect URLs and make them clickable (open the URL in the default browser). This is a convenience feature present in many other terminals.

I believe the value is self-evident, but here's a concrete use case example regardless: when I run yarn outdated (Yarn is a package manager) it outputs a repository/homepage URL for every listed package. I want to be able to click one of those URLs to open the page quickly/easily and have a look at what has changed in the new version of the package.

Implementation details

  • Probably don't need to support anything more than text starting with http:// or https://.
  • URLs that span multiple lines (due to being truncated by the window width) should be handled correctly.
  • There should probably be an indication that the URL is clickable, e.g. cursor change + underline on mouse hover.
  • Most other terminals require an alt or ctrl click, I presume to guard against accidental clicks when copying and so forth.

You can look at something like VS Code's terminal for inspiration. Again this is all probably self-evident.
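For the simple http/https-only scope suggested above, a first pass could be a short regex sweep over a line of text. This is only an illustrative sketch (the function name and pattern are my own, not from any actual Terminal code); later comments in this thread cover the trailing-punctuation cases a naive pattern like this gets wrong.

```cpp
#include <regex>
#include <string>
#include <vector>

// Hypothetical sketch: find http/https URL spans in one line of terminal text.
// Deliberately minimal, matching the "http:// or https:// only" scope above;
// trailing punctuation (parens, quotes, etc.) is not handled here.
std::vector<std::string> FindUrls(const std::string& line)
{
    static const std::regex urlPattern(R"(https?://[^\s]+)", std::regex::icase);
    std::vector<std::string> urls;
    for (auto it = std::sregex_iterator(line.begin(), line.end(), urlPattern);
         it != std::sregex_iterator(); ++it)
    {
        urls.push_back(it->str()); // each non-overlapping match in order
    }
    return urls;
}
```

Rendering-side concerns (underline on hover, ctrl+click) would sit on top of whatever produces these spans.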

Stretch goal (covered in #204)


IMO it should support all schemes for the sake of completeness, both to avoid a stream of follow-up issues asking for new schemes to be added and to keep the code/library reusable.

One thing that may be complex to get right is the handling of parentheses: most implementations I've encountered incorrectly skip the closing parenthesis character ()) if it's the last character in the link, which breaks many Wikipedia links. Some parenthesis-matching algorithm should probably be used to decide whether this last character is part of the link or not.

EDIT: typo


most implementations I've encountered incorrectly skip the closing parenthesis character ()) if it's the last character in the link, which breaks many Wikipedia links

On the other hand, URLs are often enclosed within parentheses in flowing text (see e.g. http://example.com/foobar) <- like here, and likewise in Markdown files.

In gnome-terminal we addressed these two contradicting requirements by allowing parentheses () and square brackets [] as long as they occur in balanced pairs. This was implemented using a recursive regular expression. See g-t 763980.

Another similar tricky case is the trailing apostrophe, see g-t 448044.
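A minimal sketch of the balanced-parentheses heuristic described above (the function name is hypothetical; gnome-terminal's actual implementation uses a recursive regex, per g-t 763980). The idea: keep a trailing `)` only when it has a matching unclosed `(` inside the candidate URL.

```cpp
#include <string>

// Hypothetical sketch: trim trailing ')' characters that belong to the
// surrounding prose rather than the URL, by counting paren balance.
std::string TrimTrailingParen(std::string url)
{
    while (!url.empty() && url.back() == ')')
    {
        int depth = 0;
        for (char c : url)
        {
            if (c == '(') ++depth;
            else if (c == ')') --depth;
        }
        // depth < 0 means the final ')' has no matching '(' inside the URL,
        // so it came from the enclosing text; strip it and re-check.
        if (depth < 0) url.pop_back();
        else break;
    }
    return url;
}
```

This keeps `https://en.wikipedia.org/wiki/Foo_(bar)` intact while stripping the stray `)` from `(see http://example.com/foobar)`.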


we can't figure this out without polling the server

Haha, this is also a possibility – I personally wouldn't go for it, though. I'm sure you share my concerns about data leakage as well as slowness.

It's all guesswork, there's no perfect solution (well, there's #204 to address this gap). We found the matching parens thingy to work well enough, we haven't received a report since we implemented that. At least it definitely turned out to be better than either always matching the paren or never matching. It was a bit tricky to implement, digging into rarely used and hardly known regex features of certain regex libraries (so I'm not surprised that many terminals don't bother with it), but I think it was worth it. :)

I think we need to detect folder paths and make them clickable too.
That makes it easier to debug code:
[screenshot: WindowsTerminal_2019-07-18_09-11-04]

The TeachingTip control could be used for the Link Preview feature

You can look at something like VS Code's terminal for inspiration. Again this is all probably self-evident.

Why not just share the same code?

Probably don't need to support anything more than text starting with http:// or https://.

I think you should add www. detection.

Why not just share the same code?

Well, it's written in TypeScript, a superset of JavaScript ... so it won't exactly work in our C++ project.

Can't wait for this feature!! I love being able to hold CTRL and see a link get underlined in ConEmu where left clicking it opens it in the default browser. Hopefully a simple regex implementation of URL highlighting makes it in before the end of the year :)

The TeachingTip control could be used for the Link Preview feature

[image: TeachingTip control]

You can imagine this control appearing over the link, with the Favicon, Site Name, and URL or Bing description for the link

@mdtauk & @awarmfastbear - You mean like the link preview pop up that we illustrated in the Terminal's sizzle video?

[image: link preview mockup from the sizzle video]

The feature was hinted at in that sizzle reel as you point out @bitcrazed - My comment is more about using a standard XAML control to implement it, and not to do it in a way that is inconsistent with Windows UI. Think of it as a pre-emptive strike 😉

@bitcrazed that is literally the first sentence of this issue, btw ;)

@mdtauk Agreed. We definitely don't want to unnecessarily re-invent the wheel. If we can use standard XAML controls to implement features like this, we will.

If not, we'll work with the XAML controls teams to figure out if changes to existing controls need to be made, or if a new control with the correct behavior is required.

To be honest though, the biggest challenge here is unlikely to be which control to use, it'll be how to build an extensibility mechanism that delivers all the required features, while remaining sufficiently fast, stable, and safe.

It would be wonderful if you could support https://gist.github.com/egmontkob/eb114294efbcd5adb1944c9f3cb5feda

At one of the Ignite 2019 sessions, it was indicated that this would be supported via an extension feature. So, we need to wait for extension support to be done.

@bitcrazed Right now, Windows Terminal is insanely fast on the "Rainbow mode" benchmark at https://github.com/p-e-w/ternimal. I definitely share the same worry that "clickable links" is going to introduce a lot of parsing latency every time characters are written to screen, and severely slow down Windows Terminal, since a proper parser would have to re-scan every character that is connected to the character that was written to screen, to see if any of them form valid links.

I have an idea: What if links are not parsed at all and NOT clickable UNTIL the user holds down Control (or perhaps Alt) AND hovers their mouse over the Windows Terminal. In that scenario, the code would scan the line (with support for lines that are wrapped at preceding/following lines), and look for a URL or FILE PATH exactly at the mouse cursor. IF so, then highlight the URL, show the preview, make it clickable (as long as Ctrl is still held), etc.

Furthermore, if the user right-clicks, parse the unbroken character sequence at that spot (stopping at the whitespace before/after the clicked sequence), thus highlighting the whole word. Then do link parsing. If a link is detected, show an "Open Link" menu item.

In both the ctrl-and-hover and the right-click scenarios, the parser is identical:

  • Get the hovered "row+column" of the mouse cursor.
  • If that column is whitespace, do nothing, since there is nothing to scan.
  • If that column ISN'T whitespace, collect all "connected" characters before and after it, by scanning until you encounter whitespace (newlines, spaces, etc.; both vertical and horizontal whitespace). The result will be a word, or a link, or anything else.
  • Do a simple link parsing of that, to detect whether it's a URL. Also do filesystem parsing, if something looks like a local file path (for example C:\Tools\foo.cpp:80 would be parsed from left to right and detect that the C:\Tools\foo.cpp is a file).
  • If a URL/file path is detected, handle the link.
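The expand-to-whitespace step in the list above could be sketched roughly like this (a hypothetical helper of my own; it ignores wrapped lines, which would need to be stitched into one logical row first):

```cpp
#include <cctype>
#include <string>

// Hypothetical sketch: given one logical row of text and the hovered column,
// expand left and right until whitespace to recover the "connected"
// character sequence under the cursor.
std::string TokenAtColumn(const std::string& row, size_t col)
{
    if (col >= row.size() || std::isspace(static_cast<unsigned char>(row[col])))
        return {}; // nothing under the cursor to scan

    size_t begin = col;
    while (begin > 0 && !std::isspace(static_cast<unsigned char>(row[begin - 1])))
        --begin; // walk left to the start of the sequence

    size_t end = col;
    while (end < row.size() && !std::isspace(static_cast<unsigned char>(row[end])))
        ++end;   // walk right to the end of the sequence

    return row.substr(begin, end - begin);
}
```

URL/file-path detection would then run only on this one token, which is what makes the hover-triggered approach cheap.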

These two solutions would be THE most high-performance answer, and holding Ctrl (or perhaps Alt) is no big deal. Many terminals deal with it that way, such as macOS terminal. In the macOS terminal, you hold Command + Double-click to open links, or alternatively you right-click to get a menu which "parses" the highlighted word/link (if you right click a link, that fact is detected and highlights the whole link), and then choose "Open Link".

All alternatives would be wasteful and much slower.

PS: In case it's helpful, the Alacritty terminal defines these characters as "character sequence Link-separators": ",│`|:\"' ()[]{}<>\t", which may be useful for defining how to split strings to detect paths, URLs, etc, in the clicked/hovered character sequence. If I was writing a parser, I'd use the "get the complete non-whitespace character sequence" technique I described above, followed by a pass through the "strip surrounding garbage such as <> or () after that, to detect the final link.
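The "strip surrounding garbage" pass could be sketched as a trim against an ASCII subset of that Alacritty-style separator set (the set below is my own selection; I've left out `:` so `http:` survives, and left out whitespace since the token is already whitespace-delimited). Note this naive version always strips a trailing `)`, so a real implementation would want to combine it with the parenthesis-balancing heuristic discussed earlier in the thread:

```cpp
#include <string>

// Hypothetical sketch: trim leading/trailing separator characters from a
// hovered token before running full URL/path detection.
std::string StripSurrounding(const std::string& token)
{
    // ASCII subset of the Alacritty separator list quoted above.
    const std::string seps = ",`|\"'()[]{}<>";
    const size_t first = token.find_first_not_of(seps);
    if (first == std::string::npos)
        return {}; // token was nothing but separators
    const size_t last = token.find_last_not_of(seps);
    return token.substr(first, last - first + 1);
}
```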

Hey @VideoPlayerCode - thanks for submitting your thoughts here.

I agree, only parsing for URLs when the user hits a key/chord would allow us to defer regex scanning, but it would still mean we couldn't color-highlight URLs embedded in text - something that a lot of users have already asked for.

Also, URLs are just one of a class of text patterns that we'll want to look for and identify: For example, I'd love to have the ability to tell Terminal to draw a red box with a yellow fill under text that matches a specified text-pattern/regex/etc. - imagine being able to highlight error messages this way in a web server log as it streams-by. 😁

I am pretty sure we'll be able to find a strategy for being able to identify patterns/regex matches in a pretty performant manner. Eventually. One day 😜

@bitcrazed

I agree, only parsing for URLs when the user hits a key/chord would allow us to defer regex scanning, but it would still mean we couldn't color-highlight URLs embedded in text - something that a lot of users have already asked for.

Ahh, I just realized that the "always-active scanner" feature was presented in the marketing video and is expected by users. Oops.

Also, URLs are just one of a class of text patterns that we'll want to look for and identify: For example, I'd love to have the ability to tell Terminal to draw a red box with a yellow fill under text that matches a specified text-pattern/regex/etc. - imagine being able to highlight error messages this way in a web server log as it streams-by. 😁

Whoa, I love that idea! Being able to set a regex in the user config and select how to highlight matches would be fantastic.

I am pretty sure we'll be able to find a strategy for being able to identify patterns/regex matches in a pretty performant manner. Eventually. One day 😜

Well, if there needs to be continuous scanning... here are some ideas:

  • When doing the scans, only do the currently visible screen buffer. If the screen buffer starts with a non-whitespace character, also include the text that was off-screen up until the previous whitespace (such as if the top line is hub.com/foo but is actually a wrapped continuation of https://git<wrap>hub.com/foo).
  • Running the regex on that buffer would need to be throttled in one of four ways (or combinations of all):
  • A: Only do a scan whenever nothing has been written to the screen (no new text/changed characters) for a sane amount of time, such as 300ms. That means that when rapidly scrolling logs or "cat" is active, they'll run uninhibited.
  • Or B: Run it at a pulsing interval such as 300ms, where you block the execution, do a quick scan, mark highlights where needed, then continue executing (consuming the incoming stdout data).
  • Or C: Scan in realtime when the scrolling is slow, but completely stop realtime scanning when the output rate is very high. For example "cat 10mbfile.txt" would disable realtime scanning since the terminal would notice an insane output rate. And as soon as the insane output stops, do a one-time pass through the old output buffer that was just streamed. I like this idea the most. And combine it with the technique of only scanning the visible screen buffer.
  • Or D: Scan all data live as it comes in (gonna suck for performance), and cache the results. Whenever lines are added or characters are replaced in the buffer, scan/re-scan only those lines.
  • Also consider only supporting links, at least for realtime scanning. Because custom regexps could literally be insane enough that they would match the whole buffer. Imagine how you would even BEGIN to match a regexp that matches from the words "Log Result:" all the way down to a line that says "End of Log", in your "custom regexp" example. That could be thousands of lines of data that should match. Heck, that regexp wouldn't even know how to match anything until it sees the "End of Log" clamping at the end of the match. Very crazy...
  • If you only support links, you can optimize like crazy by only scanning for "http:" and "https:" (case-insensitive) with a pretty simple, optimized pattern-matcher, and then doing a "full regexp engine" scan _only_ on the whitespace-delimited character sequence around that match.
  • Also be careful about people doing "cat" on files that have run-on characters endlessly, since it's going to kill performance.
  • For methods A, B and C: When the user scrolls backwards in the buffer to see old text, they'll be picky and won't want links to "pop" into view mid-way through a throttled pulse interval. So do faster scanning when scrolling manually with the scrollbars. But ouch, I can barely imagine how rough the code would be for scanning realtime flowing/self-modifying text in a performant way.

In short: Oh dear, good luck meeting user expectations of a pretty insane (in a bad performance way) feature. This problem is kinda like trying to write a fast regexp to scan a HTML webpage that constantly downloads more and more data and self-modifies old data... 😛 Hope at least some of this can inspire some lines of thought to an eventual solution!
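Throttling option A above (scan only once the screen has been quiet for a debounce window) could be sketched like this. All names here are hypothetical; a real terminal would drive this from its render loop rather than having the caller poll:

```cpp
#include <chrono>
#include <functional>
#include <utility>

// Hypothetical sketch of debounced link scanning: remember when output last
// changed, and run the (expensive) scan only after a quiet period.
class DebouncedScanner
{
public:
    explicit DebouncedScanner(std::function<void()> scan,
                              std::chrono::milliseconds quiet = std::chrono::milliseconds(300))
        : _scan(std::move(scan)), _quiet(quiet) {}

    // Call whenever new text hits the screen buffer.
    void OnOutput()
    {
        _lastOutput = Clock::now();
        _dirty = true;
    }

    // Call from a timer / render tick; scans at most once per quiet period.
    void MaybeScan()
    {
        if (_dirty && Clock::now() - _lastOutput >= _quiet)
        {
            _dirty = false;
            _scan();
        }
    }

private:
    using Clock = std::chrono::steady_clock;
    std::function<void()> _scan;
    std::chrono::milliseconds _quiet;
    Clock::time_point _lastOutput{};
    bool _dirty = false;
};
```

During a fast `cat`, `OnOutput()` keeps pushing the deadline forward, so the scan naturally waits until the burst ends - which is essentially option C's behavior falling out of option A.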

Thanks for sharing your thoughts :) Glad you like this feature as much as I do ;)

Whatever scanning mechanism we use, pretty sure it'll end up being:

1) Async
2) Efficient
3) Configurable

You hit the nail on the head tho re. your comments on perf - this is a major concern of a feature like this and one we need to pay close attention to. It simply wasn't possible to engineer a general solution atop the old Console engine, but we're nearing the point where the newly re-architected engine underlying the Terminal in particular should give us the access and features we need.

Now, if only we had more devs and/or sufficiently motivated community members with the skills and time to help out 😜

The second you have extension points, I'm sure there are plenty of us out there with the skills. And we'll try very hard to make the time :) Count me in at the very least.

Looking forward to that day :)

When this does land, it would be great to be able to override the default application on a per profile basis. The ultimate dream would be to be able to use different chords to launch a URL in different browsers.

I imagine you are already aware, but there are some security risks that have caused problems in the past; in particular, iTerm had issues doing DNS lookups when hovering over links to highlight them before clicking through. This was a fairly major issue for security analysts investigating live attacks, and it also caused problems by sending DNS lookups for things that merely resembled URLs (including sensitive info - e.g. passwords) in plaintext.

I'll not go so far as to say this behaviour is right or wrong, but it's definitely something to bear in mind when implementing this. For reference:
https://gitlab.com/gnachman/iterm2/issues/6050
https://gitlab.com/gnachman/iterm2/issues/3688
https://gitlab.com/gnachman/iterm2/issues/5303
https://gitlab.com/gnachman/iterm2/-/wikis/dnslookupissue
https://nvd.nist.gov/vuln/detail/CVE-2015-9231

I'm writing to respectfully request that this start small with eyes to the larger feature request. Getting http(s):// links to open in the default browser, when CTRL+clicking them, would be a great start! So many command line tools spit out web links that right now must be copied and pasted into a browser. If we only had that functionality, that could be enough for quite a while, in my opinion. Thank you for your consideration!

I like the windows terminal, but I hate switching to a different terminal every time I want to click links. That's all.

Jumping on the bandwagon here to ask for this feature. Coming from macOS, I really miss a quick way to click URLs without having to go through the awkward copy/paste ritual.

On behalf of some of the subscribers watching this for notable changes / updates: can we lock / mute this issue for now?

The additional asks and +1s don't provide value and just bring more noise to our inboxes. If someone wants to contribute to this, they can make a CR and reference this issue. :)

