<p align="center"><img src="https://raw.githubusercontent.com/facebook/zstd/dev/doc/images/zstd_logo86.png" alt="Zstandard"></p>
__Zstandard__, or `zstd` for short, is a fast lossless compression algorithm,
targeting real-time compression scenarios at zlib-level and better compression ratios.
It's backed by a very fast entropy stage, provided by [Huff0 and FSE library](https://github.com/Cyan4973/FiniteStateEntropy).
The project is provided as an open-source dual [BSD](LICENSE) and [GPLv2](COPYING) licensed **C** library,
and a command line utility producing and decoding `.zst`, `.gz`, `.xz` and `.lz4` files.
Should your project require another programming language,
a list of known ports and bindings is provided on [Zstandard homepage](http://www.zstd.net/#other-languages).
**Development branch status:**
[![Build Status][travisDevBadge]][travisLink]
[![Build status][AppveyorDevBadge]][AppveyorLink]
[![Build status][CircleDevBadge]][CircleLink]
[![Build status][CirrusDevBadge]][CirrusLink]
[![Fuzzing Status][OSSFuzzBadge]][OSSFuzzLink]
[travisDevBadge]: https://travis-ci.org/facebook/zstd.svg?branch=dev "Continuous Integration test suite"
[travisLink]: https://travis-ci.org/facebook/zstd
[AppveyorDevBadge]: https://ci.appveyor.com/api/projects/status/xt38wbdxjk5mrbem/branch/dev?svg=true "Windows test suite"
[AppveyorLink]: https://ci.appveyor.com/project/YannCollet/zstd-p0yf0
[CircleDevBadge]: https://circleci.com/gh/facebook/zstd/tree/dev.svg?style=shield "Short test suite"
[CircleLink]: https://circleci.com/gh/facebook/zstd
[CirrusDevBadge]: https://api.cirrus-ci.com/github/facebook/zstd.svg?branch=dev
[CirrusLink]: https://cirrus-ci.com/github/facebook/zstd
[OSSFuzzBadge]: https://oss-fuzz-build-logs.storage.googleapis.com/badges/zstd.svg
[OSSFuzzLink]: https://bugs.chromium.org/p/oss-fuzz/issues/list?sort=-opened&can=1&q=proj:zstd
## Benchmarks
For reference, several fast compression algorithms were tested and compared
on a server running Arch Linux (`Linux version 5.5.11-arch1-1`),
with a Core i9-9900K CPU @ 5.0GHz,
using [lzbench], an open-source in-memory benchmark by @inikep
compiled with [gcc] 9.3.0,
on the [Silesia compression corpus].
[lzbench]: https://github.com/inikep/lzbench
[Silesia compression corpus]: http://sun.aei.polsl.pl/~sdeor/index.php?page=silesia
[gcc]: https://gcc.gnu.org/
| Compressor name | Ratio | Compression | Decompression |
| --------------- | ----- | ----------- | ------------- |
| **zstd 1.4.5 -1** | 2.884 | 500 MB/s | 1660 MB/s |
| zlib 1.2.11 -1 | 2.743 | 90 MB/s | 400 MB/s |
| brotli 1.0.7 -0 | 2.703 | 400 MB/s | 450 MB/s |
| **zstd 1.4.5 --fast=1** | 2.434 | 570 MB/s | 2200 MB/s |
| **zstd 1.4.5 --fast=3** | 2.312 | 640 MB/s | 2300 MB/s |
| quicklz 1.5.0 -1 | 2.238 | 560 MB/s | 710 MB/s |
| **zstd 1.4.5 --fast=5** | 2.178 | 700 MB/s | 2420 MB/s |
| lzo1x 2.10 -1 | 2.106 | 690 MB/s | 820 MB/s |
| lz4 1.9.2 | 2.101 | 740 MB/s | 4530 MB/s |
| **zstd 1.4.5 --fast=7** | 2.096 | 750 MB/s | 2480 MB/s |
| lzf 3.6 -1 | 2.077 | 410 MB/s | 860 MB/s |
| snappy 1.1.8 | 2.073 | 560 MB/s | 1790 MB/s |
[zlib]: http://www.zlib.net/
[LZ4]: http://www.lz4.org/
The negative compression levels, specified with `--fast=#`,
offer faster compression and decompression speed in exchange for some loss in
compression ratio compared to level 1, as seen in the table above.
Zstd can also offer stronger compression ratios at the cost of compression speed.
The speed vs. compression trade-off is configurable in small increments.
Decompression speed is preserved and remains roughly the same across all settings,
a property shared by most LZ compression algorithms, such as [zlib] or lzma.
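As a rough sketch of that trade-off, the `zstd` CLI can compress the same input at several levels; the file name and level choices below are arbitrary examples:

```shell
# Generate a compressible test file (arbitrary example data)
seq 1 100000 > data.txt

# Negative level: fastest, lowest ratio
zstd -q --fast=5 data.txt -o data.fast.zst

# Default level (3): balanced
zstd -q data.txt -o data.default.zst

# Level 19: slowest, highest ratio (without --ultra)
zstd -q -19 data.txt -o data.max.zst

# Compare sizes: higher levels typically yield smaller files
wc -c data.fast.zst data.default.zst data.max.zst
```

The CLI also ships a built-in benchmark mode (for instance `zstd -b1 -e19 data.txt`) that measures speed and ratio per level on your own data.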
The following tests were run
on a server running Linux Debian (`Linux version 4.14.0-3-amd64`)
with a Core i7-6700K CPU @ 4.0GHz,
using [lzbench], an open-source in-memory benchmark by @inikep
compiled with [gcc] 7.3.0,
on the [Silesia compression corpus].
Compression Speed vs Ratio | Decompression Speed
---------------------------|--------------------
![Compression Speed vs Ratio](doc/images/CSpeed2.png "Compression Speed vs Ratio") | ![Decompression Speed](doc/images/DSpeed3.png "Decompression Speed")
A few other algorithms can produce higher compression ratios at slower speeds, falling outside of the graph.
For a larger picture including slow modes, [click on this link](doc/images/DCspeed5.png).
## The case for Small Data compression
The previous charts provide results applicable to typical file and stream scenarios (several MB). Small data behaves differently.
The smaller the amount of data to compress, the more difficult it is to compress. This problem is common to all compression algorithms: they learn from past data how to compress future data, but at the beginning of a new data set, there is no "past" to build upon.
To solve this situation, Zstd offers a __training mode__, which can be used to tune the algorithm for a selected type of data.
Training Zstandard is achieved by providing it with a few samples (one file per sample). The result of this training is stored in a file called "dictionary", which must be loaded before compression and decompression.
Using this dictionary, the compression ratio achievable on small data improves dramatically.
The following example uses the `github-users` [sample set](https://github.com/facebook/zstd/releases/tag/v1.1.3), created from [github public API](https://developer.github.com/v3/users/#get-all-users).
It consists of roughly 10K records weighing about 1KB each.
Compression Ratio | Compression Speed | Decompression Speed
------------------|-------------------|--------------------
![Compression Ratio](doc/images/dict-cr.png "Compression Ratio") | ![Compression Speed](doc/images/dict-cs.png "Compression Speed") | ![Decompression Speed](doc/images/dict-ds.png "Decompression Speed")
These compression gains are achieved while simultaneously providing _faster_ compression and decompression speeds.
Training works if there is some correlation in a family of small data samples. The more data-specific a dictionary is, the more efficient it is (there is no _universal dictionary_).
Hence, deploying one dictionary per type of data will provide the greatest benefits.
Dictionary gains are mostly effective in the first few KB. Then, the compression algorithm will gradually use previously decoded content to better compress the rest of the file.
### Dictionary compression How To:
1. Create the dictionary
`zstd --train FullPathToTrainingSet/* -o dictionaryName`
2. Compress with dictionary
`zstd -D dictionaryName FILE`
3. Decompress with dictionary
`zstd -D dictionaryName --decompress FILE.zst`
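Putting the three steps together, the following sketch trains and uses a dictionary on a synthetic set of small, similar records (all file names and contents here are made up for illustration):

```shell
# Create a directory of small, similar samples (stand-ins for real records)
mkdir -p samples
for i in $(seq 1 200); do
  for j in $(seq 1 20); do
    printf '{"id": "%d-%d", "name": "user%d", "email": "user%d@example.com"}\n' "$i" "$j" "$i" "$i"
  done > "samples/$i.json"
done

# 1. Train a dictionary from the samples
zstd -q --train samples/* -o users.dict

# 2. Compress one record with the dictionary
printf '{"id": "9999-0", "name": "user9999", "email": "user9999@example.com"}\n' > user.json
zstd -q -D users.dict user.json -o user.json.zst

# 3. Decompress with the same dictionary
zstd -q -D users.dict --decompress user.json.zst -o user.json.out
cmp user.json user.json.out && echo "round-trip OK"
```

Note that training needs a reasonably sized sample set; on very few or very tiny samples, `zstd --train` may warn or fail.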
## Build instructions
### Makefile
If your system is compatible with standard `make` (or `gmake`),
invoking `make` in the root directory will generate the `zstd` CLI in the root directory.
Other available options include:
- `make install` : create and install the zstd CLI, library, and man pages
- `make check` : create and run `zstd`, testing its behavior on the local platform
### cmake
A `cmake` project generator is provided within `build/cmake`.
It can generate Makefiles or other build scripts
to create `zstd` binary, and `libzstd` dynamic and static libraries.
By default, `CMAKE_BUILD_TYPE` is set to `Release`.
### Meson
A Meson project is provided within [`build/meson`](build/meson). Follow
build instructions in that directory.
You can also take a look at the [`.travis.yml`](.travis.yml) file for an
example of how Meson is used to build this project.
Note that the default build type is **release**.
### VCPKG
You can build and install zstd using the [vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager:

```sh
git clone https://github.com/Microsoft/vcpkg.git
cd vcpkg
./bootstrap-vcpkg.sh
./vcpkg integrate install
./vcpkg install zstd
```
The zstd port in vcpkg is kept up to date by Microsoft team members and community contributors. If the version is out of date, please [create an issue or pull request](https://github.com/Microsoft/vcpkg) on the vcpkg repository.