UPNG.js: Use dithering

Created on 27 Mar 2018 · 5 Comments · Source: photopea/UPNG.js

It's important for colorful images.

Source image [174285 B]
[image: firefox-512]

After UPNG.js (web interface) [69927 B]
[image: firefox-512 upng]

The result of pngquant looks better. [69932 B]
[image: firefox-512 pngquant]

All 5 comments

I implemented Floyd-Steinberg dithering in the past, but it was not worth it.

I think we need some compression-friendly dithering. Do you know anybody who could help us?

pngquant uses Floyd-Steinberg modified for better color handling.
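
For reference, a minimal sketch of the classic (unmodified) Floyd-Steinberg error diffusion over an RGBA buffer. The `nearest` palette lookup is a hypothetical helper written for this sketch, not UPNG.js or pngquant code, and pngquant's modified variant weights the error differently:

```js
// Minimal Floyd-Steinberg sketch over an RGBA Uint8Array (4 bytes/pixel).
// `palette` is an array of [r, g, b] triples. Alpha is ignored here.
function nearest(palette, r, g, b) {
  let best = 0, bestD = Infinity;
  for (let i = 0; i < palette.length; i++) {
    const p = palette[i];
    const d = (p[0] - r) ** 2 + (p[1] - g) ** 2 + (p[2] - b) ** 2;
    if (d < bestD) { bestD = d; best = i; }
  }
  return best;
}

function floydSteinberg(data, width, height, palette) {
  const err = new Float32Array(data.length);    // diffused error per channel
  const out = new Uint8Array(width * height);   // palette indices
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      const r = data[i] + err[i];
      const g = data[i + 1] + err[i + 1];
      const b = data[i + 2] + err[i + 2];
      const pi = nearest(palette, r, g, b);
      out[y * width + x] = pi;
      const p = palette[pi];
      const dr = r - p[0], dg = g - p[1], db = b - p[2];
      // Spread the error to the 4 classic neighbours: 7/16, 3/16, 5/16, 1/16.
      const spread = (dx, dy, w) => {
        if (x + dx < 0 || x + dx >= width || y + dy >= height) return;
        const j = ((y + dy) * width + (x + dx)) * 4;
        err[j] += dr * w; err[j + 1] += dg * w; err[j + 2] += db * w;
      };
      spread( 1, 0, 7 / 16);
      spread(-1, 1, 3 / 16);
      spread( 0, 1, 5 / 16);
      spread( 1, 1, 1 / 16);
    }
  }
  return out;
}
```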

I believe that dithering will always increase the file size because of its random nature.
The only purpose of this feature is to please our eyes.
Dithering can be hidden behind a flag, just like in Photoshop. Users will decide.

> I think we need some compression-friendly dithering. Do you know anybody who could help us?

I think we may ask @kornelski.

I mean, I made three versions of the image:

  • A: 50 colors: 15 kB
  • B: 50 colors + Dithering: 23 kB
  • C: 100 colors: 22 kB

B looked as nice as C, but was slightly larger, so I thought that allowing more colors is better than dithering (both increase the file size).

I think we need dithering that consists of some repetitive patterns, i.e. it should be "friendly" to the Deflate algorithm, making B only 20 kB (so it is still as nice as C, but smaller).
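
One well-known family of repetitive-pattern dithering is ordered (Bayer) dithering: a small threshold matrix tiles the image, so the added pattern repeats every few pixels and should be easier for Deflate to model than diffused error. A minimal sketch follows; the `nearest` palette lookup is the same hypothetical helper as in the sketch above, and whether this actually wins on file size is exactly the open question here:

```js
// Ordered (Bayer) dithering sketch: add a periodic 4x4 threshold pattern
// before quantizing, so the injected "noise" repeats and stays
// Deflate-friendly. `strength` is in 8-bit units.
const BAYER4 = [
  [ 0,  8,  2, 10],
  [12,  4, 14,  6],
  [ 3, 11,  1,  9],
  [15,  7, 13,  5],
];

function orderedDither(data, width, height, palette, nearest, strength = 32) {
  const out = new Uint8Array(width * height);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const i = (y * width + x) * 4;
      // Map the 0..15 matrix entry to a signed offset in -0.5..+0.5,
      // then scale by `strength`.
      const t = (BAYER4[y & 3][x & 3] / 16 - 0.5) * strength;
      out[y * width + x] = nearest(
        palette,
        data[i] + t, data[i + 1] + t, data[i + 2] + t
      );
    }
  }
  return out;
}
```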

BTW, I also think that pngquant performs better Deflate compression (which also takes about 100x more time than UPNG.js: e.g. 30 ms vs. 3000 ms), so it can make B only 20 kB while using the same dithering as I did.

Oh, I see.
I don't know a dithering algorithm that can handle this case.

pngquant computes the MSE, has min and max quality settings, and doesn't write the file if it is too big or the quality degrades dramatically.

Maybe you'll find this thread useful:
https://encode.ru/threads/1757-Lossy-DEFLATE-lossy-PNG

And this project in particular:
https://github.com/foobaz/lossypng

Yes, pngquant calculates the mean square error and applies dithering only in areas with high error. This way, areas that don't need dithering don't get the extra noise.
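
A hedged sketch of that idea, assuming a hypothetical `remap` that returns the palette color chosen for a pixel (this is illustrative, not pngquant's actual code):

```js
// Sketch: build a per-pixel squared-error map from the undithered remap,
// then dither only where the error is high. `remap(data, i)` is a
// hypothetical function returning the [r, g, b] palette color chosen
// for the pixel at byte offset i.
function errorMap(data, width, height, remap) {
  const err = new Float32Array(width * height);
  for (let p = 0; p < width * height; p++) {
    const i = p * 4;
    const q = remap(data, i);
    err[p] = (data[i]     - q[0]) ** 2
           + (data[i + 1] - q[1]) ** 2
           + (data[i + 2] - q[2]) ** 2;
  }
  return err;
}

// Inside the dithering loop, the diffused error can then be scaled
// (or zeroed) per pixel, e.g.:
//   const strength = err[p] > THRESHOLD ? 1 : 0;
//   spread(1, 0, 7 / 16 * strength); // and likewise for the other taps
```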

pngquant also does edge detection (similar to the Prewitt algorithm) and disables dithering on the edges. This prevents anti-aliasing from looking like fur.
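
For completeness, a minimal Prewitt-style gradient sketch over a precomputed luma buffer; wherever the gradient magnitude exceeds a threshold, dithering would be switched off. Again, illustrative names only, not pngquant's internals:

```js
// Prewitt-style edge map over a grayscale (luma) buffer. Dithering can be
// disabled wherever edge[p] exceeds a threshold so anti-aliased edges
// don't turn "furry".
function prewittEdges(luma, width, height) {
  const edge = new Float32Array(width * height);
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      let gx = 0, gy = 0;
      for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          const v = luma[(y + dy) * width + (x + dx)];
          gx += v * dx; // Prewitt horizontal kernel: columns -1, 0, +1
          gy += v * dy; // Prewitt vertical kernel:   rows    -1, 0, +1
        }
      }
      edge[y * width + x] = Math.hypot(gx, gy);
    }
  }
  return edge;
}
```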

In pngquant, 90% of the time is spent on extra runs of K-means. If you use --speed 10, the whole recompression (on an i7 at 2.3 GHz) takes ~80 ms dithered, ~50 ms undithered.
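
For context, a rough sketch of what one such K-means pass does: assign every pixel to its nearest palette color, then move each color to the mean of its assigned pixels. Each pass scans the whole image, which is why extra passes dominate the runtime. `nearest` is the same hypothetical lookup as in the earlier sketches:

```js
// One K-means refinement pass over the palette. Returns the updated
// palette; colors with no assigned pixels are left unchanged.
function kmeansPass(data, pixelCount, palette, nearest) {
  const sums = palette.map(() => [0, 0, 0, 0]); // r, g, b, count
  for (let p = 0; p < pixelCount; p++) {
    const i = p * 4;
    const pi = nearest(palette, data[i], data[i + 1], data[i + 2]);
    const s = sums[pi];
    s[0] += data[i]; s[1] += data[i + 1]; s[2] += data[i + 2]; s[3]++;
  }
  return sums.map((s, k) =>
    s[3] ? [s[0] / s[3], s[1] / s[3], s[2] / s[3]] : palette[k]
  );
}
```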

(BTW, TinyPNG doesn't have its own algorithm. It's just a GUI for pngquant).
