The term data compression refers to reducing the number of bits of information that has to be stored or transmitted. Compression can be lossless or lossy: in the first case only redundant data is removed, so the information is identical once it is uncompressed; in the second case some data is discarded for good and the quality of the uncompressed result is lower. Different compression algorithms are more efficient for different types of information. Compressing and uncompressing data normally takes a lot of processing time, so the server carrying out the operation must have sufficient resources to process the information quickly enough. One simple example of how information can be compressed is to store how many consecutive positions in the binary code should contain 1 and how many should contain 0, rather than storing the actual 1s and 0s.
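The technique described in that last example is essentially run-length encoding. The following Python sketch is purely illustrative (the function names are made up for this example) and shows how storing run lengths instead of individual bits can shrink repetitive data without losing anything:

```python
# A minimal run-length encoding sketch: instead of storing every 1 and 0,
# store how many identical bits appear in a row.

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a bit string like '1110000' into [('1', 3), ('0', 4)]."""
    runs = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Rebuild the original bit string from the stored runs."""
    return "".join(bit * count for bit, count in runs)

original = "111111110000001111"
encoded = rle_encode(original)
print(encoded)                           # [('1', 8), ('0', 6), ('1', 4)]
assert rle_decode(encoded) == original   # lossless: nothing is lost
```

Because nothing is discarded, decoding the runs reproduces the original data exactly, which is what distinguishes lossless compression from the lossy kind.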

Data Compression in Shared Web Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. It is considerably faster and more efficient than most alternative algorithms, particularly for compressing and uncompressing non-binary data, i.e. web content. LZ4 can even uncompress data quicker than it can be read from a hard disk, which improves the performance of websites hosted on ZFS-based platforms. Because the algorithm compresses data well and does so very quickly, we can generate several backup copies of all the content stored in the shared web hosting accounts on our servers every day. Both your content and its backups will take up less space, and since both ZFS and LZ4 work very fast, the backup generation will not affect the performance of the web servers where your content is stored.
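As a rough illustration of how LZ4 behaves on text-heavy web content, the sketch below uses the third-party lz4 Python package (an assumption: it has to be installed separately, for example with pip install lz4) to compress and uncompress a piece of HTML and report the ratio. It is not the ZFS integration itself, which happens transparently at the file system level, but it shows the same round trip:

```python
import lz4.frame  # third-party package, assumed installed: pip install lz4

# Repetitive, text-heavy data similar to typical web content.
html = ("<div class='post'><h2>Title</h2>"
        "<p>Lorem ipsum dolor sit amet.</p></div>\n" * 2000).encode()

compressed = lz4.frame.compress(html)
restored = lz4.frame.decompress(compressed)

assert restored == html  # lossless round trip: the content is unchanged
print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(html) / len(compressed):.1f}x")
```

The exact ratio depends on the content, but markup and other text-based files typically shrink several times over, which is why both the live content and its daily backups need noticeably less space.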

Data Compression in Semi-dedicated Servers

The ZFS file system that runs on the cloud platform where your semi-dedicated server account will be created uses a powerful compression algorithm called LZ4. It is one of the most efficient algorithms available for compressing and uncompressing website content, as its compression ratio is very high and it uncompresses data faster than the same data could be read from a hard disk drive if it were stored uncompressed. As a result, LZ4 boosts every site that runs on a platform where the algorithm is used. This high performance requires plenty of CPU processing time, which is provided by the numerous clusters working together as part of our platform. In addition, LZ4 enables us to generate several backup copies of your content every day and keep them for a month, as they take up much less space than typical backups and are generated considerably quicker without putting load on the servers.
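To make the ratio claim more concrete, here is a small hypothetical comparison, again using the third-party lz4 Python package (assumed installed via pip install lz4): the same amount of text-like website content and of random binary data is compressed, and the text shrinks dramatically while the random data barely does, which is why LZ4 pays off so well for website files and their backups:

```python
import os
import lz4.frame  # third-party package, assumed installed: pip install lz4

# Text-like web content vs. already-dense binary data of the same size.
web_content = ("<li><a href='/page'>Example navigation entry</a></li>\n" * 5000).encode()
binary_blob = os.urandom(len(web_content))  # pseudo-random bytes, essentially incompressible

for label, payload in (("web content ", web_content), ("random binary", binary_blob)):
    compressed = lz4.frame.compress(payload)
    print(f"{label}: {len(payload)} -> {len(compressed)} bytes "
          f"({len(payload) / len(compressed):.1f}x)")
```

The actual figures will vary with the content, but the pattern is the same: website files compress very well, so storing a month of daily backups costs far less space and time than it would without compression.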