Description
- I have searched the issues for my request and found nothing related and/or helpful
- I have searched the FAQ for help
- I have searched the Wiki for help
Is your feature request related to a problem? Please describe.
Some of the Nerd Fonts packages can be really large; for example, the Iosevka font clocks in at 244 MiB on the latest release.
This can be painful for people who have a slow connection and/or limited bandwidth.
Describe the solution you'd like
It'd be great if the project could provide alternative tarballs that use better compression methods. From my (somewhat limited) research, zstandard, lzip, and xz seem to be good contenders for that spot.
To test things out, I've decided to go with VictorMono, which is currently 66MiB with the zip format (and 121MiB uncompressed).
To create the tarballs, I'm using the commands below. I'm also using plzip and pxz, the parallel versions of lzip and xz, for better compression speed. For zstandard, I'm using the default/reference zstd package.
$ du -h VictorMono
121M VictorMono
$ time tar -I 'zstd --ultra -22 -T10' -cf VictorMono.tar.zst VictorMono
4.53s user 0.16s system 100% cpu 4.672 total
$ time tar -I 'plzip -9 -n10' -cf VictorMono.tar.lz VictorMono
39.18s user 0.24s system 185% cpu 21.217 total
$ time tar -I 'pxz -9 -T10' -cf VictorMono.tar.xz VictorMono
17.00s user 0.22s system 100% cpu 17.090 total
$ du -h VictorMono.*
67M VictorMono.zip
5.9M VictorMono.tar.lz
3.8M VictorMono.tar.xz
4.7M VictorMono.tar.zst

As you can see, the size difference is massive compared to the default zip file!
Some test for decompression speed:
$ time tar --use-compress-program="zstd" -xf VictorMono.tar.zst
0.04s user 0.08s system 134% cpu 0.093 total
$ time tar --use-compress-program="plzip" -xf VictorMono.tar.lz
0.82s user 0.09s system 187% cpu 0.488 total
$ time tar --use-compress-program="pxz" -xf VictorMono.tar.xz
0.26s user 0.07s system 115% cpu 0.284 total

zstd seems to perform really well, while xz and plzip require a noticeable amount of time to extract.
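For end users, extraction shouldn't require any of the parallel tools above. As far as I know, GNU tar (1.31 or newer, with zstd installed) auto-detects the compression format when extracting, so this should be all that's needed:

$ tar -xf VictorMono.tar.zst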
Given this data, it seems to me that zstd would be the ideal choice. It gives a compression ratio comparable to the alternatives while still being significantly faster to both compress and decompress.
I understand that providing multiple variants of the same archive can add some maintenance burden, but given that producing them is a one-time effort while thousands of people will be consuming the result, I believe the effort will be well worth it.
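To illustrate, here is a minimal sketch of what the extra packaging step could look like on the release side. The patched-fonts/ layout and the archives/ output directory are assumptions on my part, not how the actual release script works:

# Hypothetical packaging loop; adjust paths to match the real release process.
mkdir -p archives
for dir in patched-fonts/*/; do
  name=$(basename "$dir")
  # -T0 uses all available cores; --ultra -22 maximizes the compression ratio
  tar -I 'zstd --ultra -22 -T0' -cf "archives/$name.tar.zst" -C patched-fonts "$name"
done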
- Related article: Comparison of Compression Algorithms (the results in that article more or less line up with my tests as well).
Describe alternatives you've considered
(None)
Additional context
(None)