First results of LZ_XOR on enwik8 (the standard 100 MB Wikipedia test file; a higher ratio means a smaller compressed file):
LZ4: 58.09%, 2.369 GiB/sec decompress
Zstd: 69.03%, 0.639 GiB/sec decompress
LZ_XOR: 63.08%, 1.204 GiB/sec decompress (36,922,969 bytes)
lzham_codec_devel: 74.93%, 0.205 GiB/sec decompress
In this run LZ_XOR used a 128KB dictionary, and I limited its parsing depth to speed up encoding.
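To make the statistics below concrete, here is a minimal sketch of what an LZ_XOR-style decode loop could look like, assuming three instruction types: LIT emits raw bytes from the stream, COPY is an ordinary LZ match at a back-distance, and XOR fetches bytes at a back-distance and XORs them with delta bytes from the stream. The function name, tuple layout, and instruction encoding are all illustrative, not the actual LZ_XOR bitstream format.

```python
def decode(instructions):
    """Apply a list of (hypothetical) LZ_XOR-style instructions.

    Instruction forms assumed here:
      ("LIT",  raw_bytes)             - emit literal bytes
      ("COPY", distance, length)      - copy from `distance` bytes back
      ("XOR",  distance, delta_bytes) - copy from `distance` bytes back,
                                        XORed with the delta bytes
    """
    out = bytearray()
    for op in instructions:
        kind = op[0]
        if kind == "LIT":
            out += op[1]
        elif kind == "COPY":
            _, dist, length = op
            for _ in range(length):          # byte-wise, so overlapping
                out.append(out[-dist])       # matches work (RLE-style)
        elif kind == "XOR":
            _, dist, delta = op
            start = len(out) - dist
            for i, d in enumerate(delta):
                out.append(out[start + i] ^ d)
        else:
            raise ValueError(f"unknown instruction {kind!r}")
    return bytes(out)
```

The appeal of the XOR instruction is that it lets the encoder use near-matches: where plain LZ would have to break a run into short COPYs and literals, an XOR instruction can reference an almost-identical earlier span and pay only for the (mostly zero) delta bytes.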
Options:
LZ4_HC: level LZ4HC_CLEVEL_MAX
Zstd: level 11
lzham_codec_devel: defaults
LZ_XOR: 128KB dictionary, GPU-assisted, exhaustive partial match searching disabled
Some LZ_XOR statistics:
Total LIT: 263887
Total XOR: 2380411
Total COPY: 9544076
Total used distance history: 1238007
Total LIT bytes: 755210
Total XOR bytes: 15157561
Total COPY bytes: 84087229
Total instructions: 12188374
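These numbers are internally consistent: the three instruction counts sum to the reported instruction total, and the three byte totals sum to the 100,000,000 bytes of enwik8. A quick derived-arithmetic check (the per-instruction averages are my computation, not output from the original run):

```python
# Figures copied from the statistics above.
lit_n, xor_n, copy_n = 263_887, 2_380_411, 9_544_076
lit_b, xor_b, copy_b = 755_210, 15_157_561, 84_087_229

# Counts sum to "Total instructions"; bytes cover all of enwik8.
assert lit_n + xor_n + copy_n == 12_188_374
assert lit_b + xor_b + copy_b == 100_000_000

# The 63.08% ratio matches the 36,922,969-byte output.
assert round(100 * (1 - 36_922_969 / 100_000_000), 2) == 63.08

# Average bytes produced per instruction type.
for name, n, b in [("LIT", lit_n, lit_b),
                   ("XOR", xor_n, xor_b),
                   ("COPY", copy_n, copy_b)]:
    print(f"{name}: {b / n:.2f} bytes/instruction")
```

So COPY instructions average roughly 8.8 output bytes each, XORs about 6.4, and LIT runs under 3, which fits the intuition that XOR picks up the near-matches plain LZ would otherwise emit as literals.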