Megabytes (MB) to Bits (bit) Conversion
Megabytes
The megabyte (MB) is a unit of digital information equal to 1,000,000 bytes (10⁶ bytes) in the SI decimal definition, or 1,048,576 bytes (2²⁰ bytes) in the binary convention — a quantity now formally called the mebibyte (MiB). A minute of CD-quality audio requires approximately 10 MB uncompressed; a typical JPEG photograph ranges from 2 to 8 MB; standard-definition video runs roughly 1 GB per hour. The megabyte was the dominant storage unit for personal computers throughout the 1980s and 1990s before the gigabyte became standard.
Bits
The bit (binary digit) is the fundamental unit of information in computing and digital communications, representing a single binary value — either 0 or 1. Formalised by Claude Shannon in his landmark 1948 paper "A Mathematical Theory of Communication," the bit is the atomic unit of information theory: the information content of a fair coin flip is exactly 1 bit. All digital data — text, images, audio, video, and executable code — is ultimately stored and transmitted as sequences of bits.
| Megabytes (MB) | Bits (bit) |
|---|---|
| 0.1 MB | 800,000 bit |
| 1 MB | 8,000,000 bit |
| 2 MB | 16,000,000 bit |
| 3 MB | 24,000,000 bit |
| 5 MB | 40,000,000 bit |
| 10 MB | 80,000,000 bit |
| 20 MB | 160,000,000 bit |
| 30 MB | 240,000,000 bit |
| 50 MB | 400,000,000 bit |
| 100 MB | 800,000,000 bit |
| 1000 MB | 8,000,000,000 bit |
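Every row in the table follows the same decimal rule: bits = MB × 1,000,000 bytes × 8 bits per byte, i.e. bits = MB × 8,000,000. A minimal Python sketch (the function name is illustrative, not from any standard library):

```python
def mb_to_bits(megabytes: float) -> int:
    """Convert decimal megabytes (SI: 1 MB = 10**6 bytes) to bits."""
    BITS_PER_BYTE = 8
    BYTES_PER_MB = 1_000_000
    # round() guards against floating-point error for fractional inputs
    return round(megabytes * BYTES_PER_MB * BITS_PER_BYTE)

# Reproduce a few rows of the table above
for mb in (0.1, 1, 50, 1000):
    print(f"{mb} MB = {mb_to_bits(mb):,} bit")
```

For the binary (MiB) convention, replace `BYTES_PER_MB` with `2**20` (1,048,576).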