Megabits (Mbit) to Bits (bit) Conversion
Megabits
The megabit (Mbit) is a unit of digital information equal to exactly 1,000,000 bits (10⁶ bits). It is the standard unit for expressing internet connection speeds: a "100 Mbps" broadband connection transfers 100 million bits per second, equivalent to 12.5 megabytes per second (MB/s). This factor-of-8 difference between megabits and megabytes is a frequent source of consumer confusion when comparing marketed connection speeds against actual file download rates.
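The factor-of-8 relationship above can be sketched as a small helper; the function name is illustrative, not from any standard library:

```python
def mbps_to_mb_per_s(mbps: float) -> float:
    """Convert a link speed in megabits per second to megabytes per second.

    1 megabit = 1,000,000 bits and 1 byte = 8 bits, so divide by 8.
    """
    return mbps / 8

# A "100 Mbps" connection moves at most 12.5 MB/s of payload.
print(mbps_to_mb_per_s(100))  # 12.5
```

Real-world download rates land below this ceiling because of protocol overhead, but the division by 8 is the core of the Mbps-to-MB/s comparison.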
Bits
The bit (binary digit) is the fundamental unit of information in computing and digital communications, representing a single binary value — either 0 or 1. Formalised by Claude Shannon in his landmark 1948 paper "A Mathematical Theory of Communication," the bit is the atomic unit of information theory: the information content of a fair coin flip is exactly 1 bit. All digital data — text, images, audio, video, and executable code — is ultimately stored and transmitted as sequences of bits.
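The fair-coin example follows from Shannon's entropy formula, H = −Σ p·log₂(p), measured in bits. A minimal sketch (the function name is an assumption for illustration):

```python
import math

def entropy_bits(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin (two equally likely outcomes) carries exactly 1 bit of information.
print(entropy_bits([0.5, 0.5]))  # 1.0
```

A biased coin yields less than 1 bit, and a certain outcome yields 0 bits, matching the intuition that predictable events carry no information.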
| Megabits (Mbit) | Bits (bit) |
|---|---|
| 0.1 Mbit | 100,000 bit |
| 1 Mbit | 1,000,000 bit |
| 2 Mbit | 2,000,000 bit |
| 3 Mbit | 3,000,000 bit |
| 5 Mbit | 5,000,000 bit |
| 10 Mbit | 10,000,000 bit |
| 20 Mbit | 20,000,000 bit |
| 30 Mbit | 30,000,000 bit |
| 50 Mbit | 50,000,000 bit |
| 100 Mbit | 100,000,000 bit |
| 1000 Mbit | 1,000,000,000 bit |
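The table reduces to a single multiplication by 10⁶. A sketch of the conversion (function name is illustrative):

```python
def mbit_to_bit(mbit: float) -> int:
    """Convert megabits to bits using the SI definition: 1 Mbit = 10**6 bit.

    round() guards against float artifacts for fractional inputs like 0.1.
    """
    return round(mbit * 1_000_000)

# Reproduce a few rows of the conversion table.
for mbit in (0.1, 1, 10, 1000):
    print(f"{mbit} Mbit = {mbit_to_bit(mbit):,} bit")
```

The reverse conversion is the same multiplication inverted: bits divided by 1,000,000 gives megabits.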