Data compression is the reduction of the number of bits needed to store or transmit information. Compressed data takes up less disk space than the original, so more content can be stored in the same amount of space. There are many compression algorithms that work in different ways. With some of them, only redundant bits are removed, so no quality is lost when the data is uncompressed. Others discard excess bits, so uncompressing the data at a later time results in lower quality than the original. Compressing and uncompressing content consumes considerable system resources, in particular CPU processing time, so any web hosting platform that uses real-time compression must have enough processing power to support the feature. One example of how data can be compressed is to replace a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence.
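The "6x1" example above is a form of run-length encoding, one of the simplest lossless compression schemes. The sketch below (function names are our own, for illustration) shows how a bit string can be encoded into run counts and decoded back with no loss:

```python
def rle_encode(bits: str) -> str:
    """Replace runs of repeated bits with "<count>x<bit>" tokens."""
    runs = []
    i = 0
    while i < len(bits):
        j = i
        # Walk to the end of the current run of identical bits.
        while j < len(bits) and bits[j] == bits[i]:
            j += 1
        runs.append(f"{j - i}x{bits[i]}")
        i = j
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Expand "<count>x<bit>" tokens back into the original bit string."""
    out = []
    for run in encoded.split(","):
        count, bit = run.split("x")
        out.append(bit * int(count))
    return "".join(out)

# "111111" shrinks to "6x1", and decoding restores it exactly,
# which is what makes this kind of compression lossless.
print(rle_encode("111111"))                 # 6x1
print(rle_decode(rle_encode("00011110")))   # 00011110
```

Real-world algorithms such as LZ4 are far more sophisticated, but the principle is the same: repeated patterns are stored once along with instructions for reconstructing them.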

Data Compression in Shared Web Hosting

The compression algorithm used on the cloud hosting platform where your new shared web hosting account will be created is called LZ4, and it is applied by the advanced ZFS file system that powers the platform. LZ4 outperforms the algorithms other file systems use: its compression ratio is higher and it processes data much faster. The speed advantage is most noticeable when content is uncompressed, as this happens faster than data can be read from a hard drive. Consequently, LZ4 improves the performance of any website hosted on a server that uses the algorithm. We take advantage of LZ4 in one more way: its speed and compression ratio allow us to generate a couple of daily backups of the full content of all accounts and keep them for 30 days. Not only do the backups take up less space, but generating them does not slow the servers down the way it often does with other file systems.
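For readers administering their own ZFS systems, compression is a per-dataset property. The sketch below uses a hypothetical pool/dataset name (`tank/hosting`) purely for illustration; it is a configuration example, not a description of our platform's internal setup:

```shell
# Enable LZ4 compression on a ZFS dataset (hypothetical name).
zfs set compression=lz4 tank/hosting

# Inspect the setting and the achieved compression ratio.
# ZFS compresses transparently, so applications need no changes.
zfs get compression,compressratio tank/hosting
```

The `compressratio` property reports how much space compression is actually saving, which is a quick way to see LZ4's effect on real data.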