Data compression is the process of encoding information using fewer bits than the original representation. The compressed data occupies substantially less disk space than the original, so additional content can be stored in the same amount of space. Compression algorithms work in different ways: some remove only redundant bits, so no quality is lost once the information is uncompressed (lossless compression), while others discard bits considered expendable, so uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes considerable system resources, especially CPU time, so any hosting platform that compresses data in real time must have sufficient processing power to support this feature. A simple example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the entire sequence, a technique known as run-length encoding.
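
The substitution described above (replacing 111111 with 6x1, a form of run-length encoding) can be sketched in a few lines of Python. This is a minimal illustration, not a production codec; the function names and the "count x character" token format are assumptions chosen to match the example in the text, and the comma separator is an arbitrary choice for readability.

```python
def rle_compress(data: str) -> str:
    """Replace each run of a repeated character with 'NxC' (count, 'x', character).

    Intended for binary strings like '1110001111'; tokens are joined with commas.
    """
    if not data:
        return ""
    parts = []
    run_char = data[0]
    run_len = 1
    for ch in data[1:]:
        if ch == run_char:
            run_len += 1          # extend the current run
        else:
            parts.append(f"{run_len}x{run_char}")  # close the finished run
            run_char = ch
            run_len = 1
    parts.append(f"{run_len}x{run_char}")          # close the final run
    return ",".join(parts)


def rle_decompress(encoded: str) -> str:
    """Reverse the encoding: expand each 'NxC' token into N copies of C."""
    if not encoded:
        return ""
    out = []
    for token in encoded.split(","):
        count, _, char = token.partition("x")      # split at the first 'x'
        out.append(char * int(count))
    return "".join(out)


print(rle_compress("111111"))         # 6x1
print(rle_compress("1110001111"))     # 3x1,3x0,4x1
print(rle_decompress("3x1,3x0,4x1"))  # 1110001111
```

Because every run is recorded exactly, decompressing always reproduces the original string bit for bit, which is what makes this a lossless scheme. Note that run-length encoding only pays off when the input contains long runs; for data with few repeats, the "NxC" tokens can actually be longer than the raw input.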