The term data compression refers to reducing the number of bits that need to be stored or transmitted. This can be done with or without loss of information, meaning that what is removed during compression is either redundant data or data deemed unneeded. In the first case, the content and its quality are identical after decompression, while in the second case the quality is degraded. Different compression algorithms are more effective for different kinds of data. Compressing and decompressing data usually takes considerable processing time, so the server performing the operation must have adequate resources to process your data fast enough. A simple example of how information can be compressed is run-length encoding: instead of storing each individual 1 and 0 in a binary sequence, you store only how many consecutive 1s and how many consecutive 0s appear.
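
To make the idea concrete, here is a minimal run-length encoding sketch in Python. The function names `rle_encode` and `rle_decode` are illustrative, not part of any particular library; the round trip shows why this kind of compression is lossless.

```python
from itertools import groupby

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a string of 1s and 0s into (symbol, run length) pairs."""
    return [(symbol, len(list(run))) for symbol, run in groupby(bits)]

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the run-length pairs."""
    return "".join(symbol * count for symbol, count in pairs)

bits = "0000001111100000"
encoded = rle_encode(bits)          # [('0', 6), ('1', 5), ('0', 5)]
assert rle_decode(encoded) == bits  # lossless: the round trip is exact
```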

Data Compression in Shared Hosting

The ZFS file system that runs on our cloud Internet hosting platform uses a compression algorithm called LZ4. LZ4 is considerably faster than most other algorithms, particularly at compressing and decompressing non-binary data such as web content. In fact, LZ4 can decompress data faster than it can be read from a hard drive, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we are able to generate several backups per day of all the content kept in the shared hosting accounts on our servers. Both your content and its backups take up less space, and since both ZFS and LZ4 work extremely fast, the backup generation does not affect the performance of the web hosting servers where your content is stored.
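
On a ZFS platform, LZ4 compression happens transparently at the filesystem level, so there is nothing you need to run yourself. Still, a small sketch can show how well LZ4 handles repetitive web content; this one assumes the third-party `lz4` Python package (`pip install lz4`) is installed, and the sample HTML is made up for illustration.

```python
import lz4.frame  # third-party bindings: pip install lz4

# Repetitive markup, typical of web content, compresses very well.
html = ("<html><body>" + "<p>Hello, world!</p>" * 500 + "</body></html>").encode()

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html  # lossless: the original page comes back intact
print(f"{len(html):,} bytes compressed to {len(compressed):,} bytes")
```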