Data compression is the reduction in size of data by encoding it with fewer bits for storage or transmission. Compressed data occupies less disk space than the original, so more content can be stored in the same amount of space. Compression algorithms work in different ways: lossless algorithms remove only redundant bits, so once the data is decompressed there is no loss of quality, while lossy algorithms discard bits deemed expendable, so the decompressed data is of lower quality than the original. Compressing and decompressing content consumes a significant amount of system resources, CPU time in particular, so any Internet hosting platform that compresses data in real time needs sufficient processing power to support the feature. A simple example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
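The 6x1 example above is the idea behind run-length encoding, one of the simplest lossless schemes. Here is a minimal sketch in Python; the function names are illustrative, not part of any particular library:

```python
from itertools import groupby

def rle_encode(bits: str) -> str:
    # Replace each run of identical symbols with "<count>x<symbol>",
    # so "111111" becomes "6x1", as in the example above.
    return " ".join(f"{len(list(run))}x{sym}" for sym, run in groupby(bits))

def rle_decode(encoded: str) -> str:
    # Reverse the encoding: expand every "<count>x<symbol>" token
    # back into its run of repeated symbols.
    return "".join(sym * int(count)
                   for count, sym in (tok.split("x") for tok in encoded.split()))
```

Because no information is discarded, decoding always restores the exact original string, which is what makes the scheme lossless.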

Data Compression in Web Hosting

The ZFS file system used on our cloud web hosting platform employs a compression algorithm called LZ4. LZ4 is one of the fastest compression algorithms available, and it performs particularly well when compressing and decompressing non-binary data such as web content. LZ4 can often decompress data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backup copies of all the content kept in the web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 are very fast, generating the backups does not affect the performance of the hosting servers where your content is stored.
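On ZFS, compression is a per-dataset property rather than something applications have to manage themselves. As a rough sketch of how an administrator would enable and inspect it (the pool and dataset names here are hypothetical):

```shell
# Enable LZ4 compression on a ZFS dataset; data written from this
# point on is compressed transparently on the way to disk.
zfs set compression=lz4 tank/webhosting

# Inspect the setting and the compression ratio ZFS has achieved
# for the data already stored in the dataset.
zfs get compression,compressratio tank/webhosting
```

Because compression happens inside the file system, websites and backup jobs read and write files normally and still benefit from the space savings.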