Terabytes (TB) to Bits (bit) Conversion
Terabytes
The terabyte (TB) is a unit of digital information equal to 1,000,000,000,000 bytes (10¹² bytes) under the SI decimal definition; the binary value of 1,099,511,627,776 bytes (2⁴⁰ bytes) is now formally named the tebibyte (TiB). Consumer hard drives and SSDs are commonly sold in 1–4 TB capacities. The digitised text collection of the US Library of Congress is often estimated at roughly 20 TB. In data centres, individual server storage arrays routinely reach hundreds of terabytes.
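The gap between the decimal and binary definitions can be checked directly; a minimal sketch of the arithmetic:

```python
DECIMAL_TB = 10**12   # SI terabyte, in bytes
BINARY_TIB = 2**40    # tebibyte, in bytes

# The binary unit is about 10% larger than the decimal one.
print(BINARY_TIB)                         # 1099511627776
print(BINARY_TIB - DECIMAL_TB)            # 99511627776 extra bytes
print(BINARY_TIB / DECIMAL_TB)            # ~1.0995
```

This roughly 10% discrepancy is why a drive marketed as 1 TB reports less than 1 TiB in operating systems that use the binary convention.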
Bits
The bit (binary digit) is the fundamental unit of information in computing and digital communications, representing a single binary value — either 0 or 1. Formalised by Claude Shannon in his landmark 1948 paper "A Mathematical Theory of Communication," the bit is the atomic unit of information theory: the information content of a fair coin flip is exactly 1 bit. All digital data — text, images, audio, video, and executable code — is ultimately stored and transmitted as sequences of bits.
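Shannon's claim about the coin flip follows from the standard self-information formula, -log₂(p) bits for an outcome of probability p; a minimal sketch:

```python
import math

def information_content(p: float) -> float:
    """Self-information of an outcome with probability p, in bits."""
    return -math.log2(p)

print(information_content(0.5))   # fair coin flip: exactly 1.0 bit
print(information_content(1/6))   # one face of a fair die: ~2.585 bits
```

Less probable outcomes carry more information, which is why one roll of a die conveys more bits than one coin flip.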
| Terabytes (TB) | Bits (bit) |
|---|---|
| 0.1 TB | 800000000000 bit |
| 1 TB | 8000000000000 bit |
| 2 TB | 16000000000000 bit |
| 3 TB | 24000000000000 bit |
| 5 TB | 40000000000000 bit |
| 10 TB | 80000000000000 bit |
| 20 TB | 160000000000000 bit |
| 30 TB | 240000000000000 bit |
| 50 TB | 400000000000000 bit |
| 100 TB | 800000000000000 bit |
| 1000 TB | 8000000000000000 bit |
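The table values follow from multiplying by 8 × 10¹² (8 bits per byte, 10¹² bytes per SI terabyte). A minimal conversion sketch, with an optional flag for the binary (tebibyte) convention:

```python
def terabytes_to_bits(tb: float, binary: bool = False) -> int:
    """Convert terabytes to bits.

    Decimal (SI): 1 TB = 10**12 bytes. Binary (tebibyte): 1 TiB = 2**40 bytes.
    Each byte is 8 bits.
    """
    bytes_per_unit = 2**40 if binary else 10**12
    return int(tb * bytes_per_unit * 8)

print(terabytes_to_bits(1))               # 8000000000000
print(terabytes_to_bits(20))              # 160000000000000
print(terabytes_to_bits(1, binary=True))  # 8796093022208
```

Reversing the conversion is simply division by the same factor: bits / (8 × 10¹²) gives decimal terabytes.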