Data compression algorithms and tools

This list covers the most popular data compression algorithms and tools. All of them are free and open source, which matters if you want to preserve data for a long time and still be able to decompress it in the future.

General purpose compression

7z

Gzip

  • http://www.gzip.org/
    • Not strong but fast and very widely supported.
    • Pre-installed on pretty much every Linux computer.
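
For reference, basic gzip usage looks like this (the file name is only an example):

  gzip -9 example.warc        # compress at the highest level; replaces the file with example.warc.gz
  gunzip example.warc.gz      # decompress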

Bzip2

LZMA/LZMA2

Aside from 7z, there are also:

  • xz
    • Based on LZMA SDK
    • Commonly included by default in Linux distros
  • lzip
    • Claims to be stronger yet faster than bzip2.
    • Not as widely supported as xz, bzip2, and gzip.
    • Well-defined file format and an emphasis on file integrity.
    • lziprecover can correct some bit-flip errors and merge damaged copies.
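
A rough sketch of how these tools are used (file names are examples; double-check the lziprecover options against its manual):

  xz -9 -k data.tar                                  # compress to data.tar.xz, keep the original
  lzip -9 -k data.tar                                # compress to data.tar.lz, keep the original
  lziprecover -m -o data.tar.lz copy1.lz copy2.lz    # merge two damaged copies into one good file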

Zip

  • Supported out of the box on every current Windows version, but if you need cross-platform tooling, use 7-Zip.
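
For example (archive and folder names are placeholders):

  zip -r archive.zip folder/        # Info-ZIP, available on most Unix-like systems
  7z a -tzip archive.zip folder/    # the same zip archive created with 7-Zip/p7zip
  7z a archive.7z folder/           # native .7z format, usually a better ratio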

Zstandard

  • https://facebook.github.io/zstd/
  • Very efficient in both time and compression ratio.
  • First-class support for custom dictionaries, which is particularly useful when compressing many small data units (e.g. a WARC file with many HTML pages from one website). Training a dictionary on representative samples and using it for compression massively improves the compression ratio in such scenarios.
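
A minimal sketch of dictionary-based compression with the zstd command-line tool (paths and the dictionary name are just examples):

  zstd --train samples/*.html -o html.dict     # train a dictionary on representative small files
  zstd -19 -D html.dict page.html              # compress one small file using the dictionary
  zstd -d -D html.dict page.html.zst           # decompression requires the same dictionary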

Heavy duty compression

These programs often use large amounts of memory to get the best possible compression ratio.

lrzip

"This is a compression program optimised for large files" -lrzip readme

lrzip is fantastic for archiving: the compression ratio improves as the input file grows, although it is a terribly slow compressor. lrzip really shines when a large input contains redundant data that is widely separated and otherwise unconnected; general-purpose compressors never see that redundancy because their compression windows are tiny.
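
Typical usage looks something like this (the file name is an example):

  lrzip -L 9 big-dataset.tar       # LZMA backend by default, produces big-dataset.tar.lrz
  lrzip -z big-dataset.tar         # -z selects the ZPAQ backend for a better (and much slower) ratio
  lrunzip big-dataset.tar.lrz      # decompress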

lrzip benchmarks

ZPAQ

  • http://mattmahoney.net/dc/zpaq.html
  • Uses deduplication, journaling, and several different compression algorithms (LZ77, BWT, and PAQ context mixing)
  • Supported by lrzip
  • EXTREMELY slow
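
A sketch of basic zpaq usage (archive and directory names are examples; option syntax may differ slightly between zpaq versions):

  zpaq add backup.zpaq /data -method 5    # create or update a journaled archive at the slowest, best-compressing level
  zpaq list backup.zpaq                   # list archive contents and versions
  zpaq extract backup.zpaq                # restore the files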

KGB

Uses the PAQ6 compression algorithm. Excellent compression ratio (better than 7z), but a bit slow.

You can install it in Ubuntu with: sudo apt-get install kgb

How to:

  • kgb -m file.kgb originalfile
  • m is a compression level from 0 to 9 (0 gives the lowest ratio and 9 the highest; higher levels use up to 1616 MB of RAM, a lot of CPU and time)

Not recommended

LZO

A format best avoided is LZO: its developer, Markus Oberhumer, has excluded his site hosting the source code (Oberhumer.com) from the Wayback Machine. Unlike the developers of lzip, Oberhumer is clearly not interested in having the source code for his format preserved.

While we can't prevent people from requesting exclusions from the Wayback Machine, we can distrust them and avoid using their software.

In addition, Oberhumer sells a proprietary compression format called "LZO Professional", advertised with improved compression ratio and speed[1], but without saying which format it is being compared against; it is unclear how it fares against existing freely licensed open-source formats.

Given that "LZO Professional" is both obscure and proprietary, it is prone to digital obsolescence, and its use is strongly discouraged.

External links