The term data compression refers to reducing the number of bits needed to store or transmit information. This can be done with or without losing information: lossless compression removes only redundant data, while lossy compression also discards data judged unnecessary. When the data is later uncompressed, the content is identical to the original in the first case, whereas in the second case its quality is lower. Different compression algorithms are more effective for different kinds of data. Compressing and uncompressing data usually takes considerable processing time, so the server performing the task must have sufficient resources to handle the data quickly enough. One simple example of compression is run-length encoding: instead of storing the individual 1s and 0s of binary data, store how many consecutive positions contain a 1 and how many contain a 0.
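
As a rough illustration of that idea, here is a minimal run-length encoding sketch in Python; the function names are illustrative, not part of any particular product or library.

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical characters into (character, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs


def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (character, count) pairs back into the original string."""
    return "".join(bit * count for bit, count in runs)


encoded = rle_encode("1111100000001111")
print(encoded)  # [('1', 5), ('0', 7), ('1', 4)]
assert rle_decode(encoded) == "1111100000001111"  # lossless round trip
```

Sixteen characters shrink to three (character, count) pairs; the longer the runs, the greater the saving, which is why this technique works well for highly repetitive data.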

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm called LZ4. It is considerably faster than most widely used algorithms, particularly for compressing and uncompressing non-binary data such as web content. LZ4 even uncompresses data faster than it can be read from a hard disk, which improves the overall performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so quickly, we can generate several backup copies of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups take up less space, and since both ZFS and LZ4 work very fast, the backup generation does not affect the performance of the hosting servers where your content is stored.
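
For a sense of what LZ4's lossless round trip looks like in application code, here is a minimal sketch using the third-party lz4 package for Python (an assumption for illustration; on our platform ZFS applies LZ4 transparently at the file-system level, with no application code involved):

```python
# Requires the third-party "lz4" package: pip install lz4
import lz4.frame

# Repetitive, text-like data such as HTML compresses particularly well.
html = b"<html><body>" + b"<p>Hello, world!</p>" * 500 + b"</body></html>"

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html  # lossless: the original content is fully recovered
print(f"original: {len(html)} bytes, compressed: {len(compressed)} bytes")
```

The same trade-off described above applies here: LZ4 favors speed over maximum compression ratio, which is exactly what makes it suitable for compressing live web content and frequent backups without slowing the servers down.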