Data compression is the reduction of the number of bits needed to store or transmit information. Compressed data takes up less disk space than the original, so more content can fit in the same amount of storage. Different compression algorithms work in different ways: lossless algorithms remove only redundant bits, so when the information is decompressed there is no loss of quality, while lossy algorithms discard bits deemed unnecessary, so decompressing the data later yields lower quality than the original. Compressing and decompressing content consumes a significant amount of system resources, in particular CPU time, so any Internet hosting platform that employs real-time compression should have enough processing power to support that feature. One simple example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. to "remember" how many consecutive 1s or 0s there are instead of storing the entire sequence.
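The 6x1 substitution described above is the idea behind run-length encoding, one of the simplest lossless schemes. Below is a minimal sketch in Python; the function names and the count-x-character notation are illustrative choices, not part of any particular standard.

```python
import re
from itertools import groupby

def rle_encode(bits: str) -> str:
    # Replace each run of identical characters with "<count>x<character>",
    # e.g. "111111" becomes "6x1".
    return "".join(f"{len(list(run))}x{ch}" for ch, run in groupby(bits))

def rle_decode(encoded: str) -> str:
    # Expand every "<count>x<character>" pair back into the original run.
    return "".join(ch * int(count) for count, ch in re.findall(r"(\d+)x(.)", encoded))

print(rle_encode("111111"))                     # -> 6x1
print(rle_decode(rle_encode("1110000011")))     # -> 1110000011
```

Note that this pays off only when long runs dominate: a sequence with no repetition, such as 10, would encode to the longer string 1x11x0, which is why practical compression algorithms are considerably more elaborate.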