Bits (bit) to Megabits (Mbit) Conversion
Bits
The bit (binary digit) is the fundamental unit of information in computing and digital communications, representing a single binary value — either 0 or 1. Formalised by Claude Shannon in his landmark 1948 paper "A Mathematical Theory of Communication," the bit is the atomic unit of information theory: the information content of a fair coin flip is exactly 1 bit. All digital data — text, images, audio, video, and executable code — is ultimately stored and transmitted as sequences of bits.
Megabits
The megabit (Mbit) is a unit of digital information equal to exactly 1,000,000 bits (10⁶ bits), following the SI decimal prefix. It is the standard unit for expressing internet connection speeds: a "100 Mbps" broadband connection transfers 100 million bits per second, equivalent to 12.5 megabytes per second (MB/s). This factor-of-8 difference between megabits and megabytes is a frequent source of consumer confusion when comparing advertised connection speeds against actual file download rates.
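A minimal sketch of both conversions described above (the function names are illustrative, not from any standard library): dividing by 10⁶ converts bits to megabits, and dividing by 8 converts a speed in megabits per second to megabytes per second.

```python
def bits_to_megabits(bits: float) -> float:
    """Convert bits to megabits using the SI definition: 1 Mbit = 10**6 bits."""
    return bits / 1_000_000

def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert a link speed in megabits per second to megabytes
    per second (1 byte = 8 bits)."""
    return mbps / 8

print(bits_to_megabits(1000))  # 0.001
print(mbps_to_mb_per_s(100))   # 12.5
```

So a full-length movie of 4 GB (32,000 Mbit) takes about 320 seconds to download on a 100 Mbps link, not 40 seconds as a bits/bytes mix-up would suggest.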
| Bits (bit) | Megabits (Mbit) |
|---|---|
| 0.1 bit | 0.0000001 Mbit |
| 1 bit | 0.000001 Mbit |
| 2 bit | 0.000002 Mbit |
| 3 bit | 0.000003 Mbit |
| 5 bit | 0.000005 Mbit |
| 10 bit | 0.00001 Mbit |
| 20 bit | 0.00002 Mbit |
| 30 bit | 0.00003 Mbit |
| 50 bit | 0.00005 Mbit |
| 100 bit | 0.0001 Mbit |
| 1000 bit | 0.001 Mbit |