Data compression is the process of reducing the number of bits needed to store or transmit information. Compressed data occupies less disk space than the original, so more content can fit in the same amount of storage. Different compression algorithms work in different ways: some remove only redundant bits, so no quality is lost when the data is uncompressed (lossless compression), while others discard less important bits, so uncompressing the data later yields lower quality than the original (lossy compression). Compressing and uncompressing content consumes significant system resources, particularly CPU time, so any hosting platform that performs compression in real time must have adequate processing power to support that feature. A simple example of how information can be compressed is run-length encoding: a binary sequence such as 111111 can be replaced with 6x1, i.e. "remembering" the number of consecutive 1s or 0s instead of storing the entire sequence.
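
The run-length idea described above (storing 111111 as 6x1) can be sketched in a few lines of Python; the function names and the comma-separated output format are illustrative choices, not part of any standard:

```python
def rle_encode(bits: str) -> str:
    """Run-length encode a bit string, e.g. '111111' -> '6x1'."""
    if not bits:
        return ""
    runs = []
    current, count = bits[0], 1
    for b in bits[1:]:
        if b == current:
            count += 1
        else:
            runs.append(f"{count}x{current}")
            current, count = b, 1
    runs.append(f"{count}x{current}")
    return ",".join(runs)

def rle_decode(encoded: str) -> str:
    """Reverse the encoding, e.g. '6x1,3x0' -> '111111000'."""
    if not encoded:
        return ""
    return "".join(
        sym * int(count)
        for count, sym in (run.split("x") for run in encoded.split(","))
    )

print(rle_encode("111111"))   # 6x1
print(rle_decode("6x1,3x0"))  # 111111000
```

This is a lossless scheme: decoding always reproduces the original bits exactly. Note that it only saves space when the data contains long runs; for data that alternates frequently, the encoded form can actually be larger than the input.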