# Bits (bit) to Megabytes (MB) Conversion

## Bits
The bit (binary digit) is the fundamental unit of information in computing and digital communications, representing a single binary value — either 0 or 1. Formalised by Claude Shannon in his landmark 1948 paper "A Mathematical Theory of Communication," the bit is the atomic unit of information theory: the information content of a fair coin flip is exactly 1 bit. All digital data — text, images, audio, video, and executable code — is ultimately stored and transmitted as sequences of bits.
## Megabytes
The megabyte (MB) is a unit of digital information equal to 1,000,000 bytes (10⁶ bytes) under the SI decimal definition; the binary value of 1,048,576 bytes (2²⁰) is properly called the mebibyte (MiB) under the IEC standard, though "megabyte" is still often used loosely for it. Since one byte is 8 bits, 1 MB = 8,000,000 bits, and therefore 1 bit = 1.25 × 10⁻⁷ MB. A minute of CD-quality audio requires approximately 10 MB uncompressed; a typical JPEG photograph ranges from 2 to 8 MB; standard-definition video runs roughly 1 GB per hour. The megabyte was the dominant storage unit for personal computers throughout the 1980s and 1990s before the gigabyte became standard.
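The gap between the decimal and binary definitions is easy to quantify. A minimal sketch (the constant names here are illustrative, not from any standard library):

```python
SI_MB = 10**6        # 1 MB  = 1,000,000 bytes (SI decimal definition)
BINARY_MIB = 2**20   # 1 MiB = 1,048,576 bytes (binary convention)

# The binary unit is about 4.9% larger than the decimal one.
ratio = BINARY_MIB / SI_MB
print(SI_MB, BINARY_MIB, ratio)  # 1000000 1048576 1.048576
```

This ~4.9% discrepancy is why a drive advertised in decimal megabytes appears smaller when an operating system reports sizes in binary units.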
| Bits (bit) | Megabytes (MB) |
|---|---|
| 0.1 bit | 1.25E-8 MB |
| 1 bit | 1.25E-7 MB |
| 2 bit | 2.5E-7 MB |
| 3 bit | 3.75E-7 MB |
| 5 bit | 6.25E-7 MB |
| 10 bit | 1.25E-6 MB |
| 20 bit | 2.5E-6 MB |
| 30 bit | 3.75E-6 MB |
| 50 bit | 6.25E-6 MB |
| 100 bit | 1.25E-5 MB |
| 1000 bit | 0.000125 MB |
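The table above follows directly from the decimal definition: divide the bit count by 8,000,000. A minimal sketch (the function name is illustrative):

```python
def bits_to_megabytes(bits: float) -> float:
    """Convert bits to SI megabytes: 1 MB = 1,000,000 bytes = 8,000,000 bits."""
    return bits / 8_000_000

# Reproduce a few rows of the table above.
for b in (1, 10, 1000):
    print(f"{b} bit = {bits_to_megabytes(b):g} MB")
# 1 bit = 1.25e-07 MB
# 10 bit = 1.25e-06 MB
# 1000 bit = 0.000125 MB
```

The inverse conversion is a single multiplication: megabytes × 8,000,000 gives bits.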