Megabyte

Multiples of bytes

Decimal
Value    Metric
1000     kB   kilobyte
1000²    MB   megabyte
1000³    GB   gigabyte
1000⁴    TB   terabyte
1000⁵    PB   petabyte
1000⁶    EB   exabyte
1000⁷    ZB   zettabyte
1000⁸    YB   yottabyte

Binary
Value    IEC             JEDEC
1024     KiB  kibibyte   KB  kilobyte
1024²    MiB  mebibyte   MB  megabyte
1024³    GiB  gibibyte   GB  gigabyte
1024⁴    TiB  tebibyte
1024⁵    PiB  pebibyte
1024⁶    EiB  exbibyte
1024⁷    ZiB  zebibyte
1024⁸    YiB  yobibyte

The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB. The unit prefix mega is a multiplier of 1,000,000 (10⁶) in the International System of Units (SI). [1] Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.


However, in the computer and information technology fields, several other definitions are used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes (2²⁰ B), a measurement that conveniently expresses the binary multiples inherent in digital computer memory architectures. Most standards bodies, however, have deprecated this usage in favor of a set of binary prefixes, [2] in which this quantity is designated by the unit mebibyte (MiB). Less common is a convention that uses the megabyte to mean 1000×1024 (1,024,000) bytes. [2]
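The difference between these conventions can be made concrete with a short Python sketch (the constant names below are illustrative only, not part of any standard):

    # The three conventions described above for "1 MB", in bytes.
    MEGABYTE_SI = 1000 ** 2        # 1,000,000 bytes (SI decimal definition)
    MEBIBYTE = 1024 ** 2           # 1,048,576 bytes (binary usage, IEC "MiB")
    MEGABYTE_MIXED = 1000 * 1024   # 1,024,000 bytes (mixed, e.g. floppy disks)

    for name, size in (("SI megabyte", MEGABYTE_SI),
                       ("mebibyte", MEBIBYTE),
                       ("mixed megabyte", MEGABYTE_MIXED)):
        print(f"{name}: {size:,} bytes")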

Definitions

The megabyte is commonly used to measure either 1000² bytes or 1024² bytes. The base-1024 interpretation originated as technical jargon for byte multiples that are naturally expressed in powers of 2 but lacked a convenient name. As 1024 (2¹⁰) approximates 1000 (10³), roughly corresponding to the SI prefix kilo-, it was a convenient term to denote the binary multiple. In 1998 the International Electrotechnical Commission (IEC) proposed standards for binary prefixes requiring the use of megabyte to strictly denote 1000² bytes and mebibyte to denote 1024² bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, EU, ISO and NIST. Nevertheless, the term megabyte continues to be widely used with different meanings:

Base 10
1 MB = 1,000,000 bytes (= 1000² B = 10⁶ B) is the definition recommended by the International System of Units (SI) and the International Electrotechnical Commission (IEC). [2] This definition is used in networking contexts and for most storage media, particularly hard drives, flash-based storage, [3] and DVDs, and is also consistent with the other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The file manager of Mac OS X 10.6 (Snow Leopard) is a notable example of this usage in software: since Snow Leopard, file sizes are reported in decimal units. [4]

In this convention, one thousand megabytes (1000 MB) is equal to one gigabyte (1 GB), where 1 GB is one billion bytes.
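The practical effect is most visible when a decimal-rated storage device is reported by software in binary units. A brief sketch with a hypothetical 500 GB drive (the figure is chosen only for illustration):

    # A drive sold as 500 GB (decimal) expressed in binary gibibytes.
    advertised = 500 * 1000 ** 3                 # 500,000,000,000 bytes
    print(f"{advertised / 1024 ** 3:.2f} GiB")   # about 465.66 GiB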

Base 2
1 MB = 1,048,576 bytes (= 1024² B = 2²⁰ B) is the definition used by Microsoft Windows in reference to computer memory, such as RAM. This definition is synonymous with the unambiguous binary prefix mebibyte.

In this convention, one thousand and twenty-four megabytes (1024 MB) is equal to one gigabyte (1 GB), where 1 GB is 1024³ bytes.
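The gap between the decimal and binary interpretations widens with each prefix step, which is one reason the ambiguity matters more for larger units. A short sketch of that growth:

    # Relative difference between the binary and decimal reading of each prefix.
    for n, name in enumerate(("kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"), start=1):
        gap = (1024 ** n - 1000 ** n) / 1000 ** n * 100
        print(f"{name}: {gap:.1f}%")   # 2.4%, 4.9%, 7.4%, 10.0%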

Mixed
1 MB = 1,024,000 bytes (= 1000×1024 B) is the definition used to describe the formatted capacity of the 1.44 MB 3.5-inch HD floppy disk, which actually has a capacity of 1,474,560 bytes. [5]
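The floppy figure can be checked directly; a small sketch of the arithmetic (variable names are illustrative):

    # "1.44 MB" in the mixed convention: MB here means 1000 x 1024 bytes.
    mixed_megabyte = 1000 * 1024                # 1,024,000 bytes
    capacity = round(1.44 * mixed_megabyte)     # 1,474,560 bytes
    print(capacity / 1000 ** 2)                 # ~1.47 in decimal megabytes
    print(capacity / 1024 ** 2)                 # ~1.41 in mebibytes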

Semiconductor memory doubles in size for each address line added to an integrated circuit package, which favors counts that are powers of two. The capacity of a disk drive is the product of the sector size, the number of sectors per track, the number of tracks per side, and the number of disk platters in the drive; changes in any of these factors would not usually double the size. Sector sizes were set as powers of two (most commonly 512 or 4096 bytes) for convenience in processing. It was a natural extension to give the capacity of a disk drive in multiples of the sector size, giving a mix of decimal and binary multiples when expressing total disk capacity.
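As a rough illustration of how such mixed multiples arise, the total capacity is simply the product of the geometry factors; the numbers below are invented for the example and do not describe any particular drive:

    # Hypothetical drive geometry: capacity = sector size x sectors x tracks x sides.
    sector_size = 512          # bytes, a power of two
    sectors_per_track = 63
    tracks_per_side = 16383
    sides = 2
    capacity = sector_size * sectors_per_track * tracks_per_side * sides
    print(f"{capacity:,} bytes "
          f"= {capacity / 1000 ** 3:.2f} GB "
          f"= {capacity / 1024 ** 3:.2f} GiB")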

Examples of use

1.44 MB floppy disks can store 1,474,560 bytes of data. MB in this context means 1,000×1,024 bytes.

Depending on compression methods and file format, a megabyte of data is roughly equivalent to one minute of 128 kbit/s MP3-compressed music, about six seconds of uncompressed CD-quality audio, or a typical book volume in plain text (about 500 pages of 2000 characters each).

The human genome consists of DNA representing 800 MB of data. The parts that differentiate one person from another can be compressed to 4 MB. [6]
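The genome figure can be sanity-checked with back-of-the-envelope arithmetic, assuming roughly 3.1 billion base pairs and 2 bits per base (an approximation not taken from the cited paper):

    # ~3.1e9 base pairs, four possible bases => 2 bits per base.
    bits = 3.1e9 * 2
    print(f"{bits / 8 / 10 ** 6:.0f} MB")   # about 775 MB, consistent with the ~800 MB figure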


Related Research Articles

A binary prefix is a unit prefix for multiples of units in data processing, data transmission, and digital information, notably the bit and the byte, to indicate multiplication by a power of 2.

The gigabyte is a multiple of the unit byte for digital information. The prefix giga means 10⁹ in the International System of Units (SI). Therefore, one gigabyte is one billion bytes. The unit symbol for the gigabyte is GB.

Giga is a unit prefix in the metric system denoting a factor of one billion (10⁹ or 1,000,000,000) in the short scale. It has the symbol G.

The kilobyte is a multiple of the unit byte for digital information.

The kilobit is a multiple of the unit bit for digital information or computer storage. The prefix kilo- (symbol k) is defined in the International System of Units (SI) as a multiplier of 10³ (1 thousand), and therefore 1 kilobit = 10³ bits = 1000 bits.

The mebibyte is a multiple of the unit byte for digital information. The binary prefix mebi means 2²⁰; therefore one mebibyte is equal to 1,048,576 bytes, i.e., 1024 kibibytes. The unit symbol for the mebibyte is MiB.

The gibibyte is a multiple of the unit byte for digital information. The binary prefix gibi means 2³⁰; therefore one gibibyte is equal to 1,073,741,824 bytes = 1024 mebibytes. The unit symbol for the gibibyte is GiB. It is one of the units with binary prefixes defined by the International Electrotechnical Commission (IEC) in 1998.

The kibibyte is a multiple of the unit byte for quantities of digital information. The binary prefix kibi means 2¹⁰, or 1024; therefore, 1 kibibyte is 1024 bytes. The unit symbol for the kibibyte is KiB.

The megabit is a multiple of the unit bit for digital information. The prefix mega (symbol M) is defined in the International System of Units (SI) as a multiplier of 10⁶ (1 million), and therefore 1 megabit = 10⁶ bits = 1,000,000 bits.

The gigabit is a multiple of the unit bit for digital information or computer storage. The prefix giga (symbol G) is defined in the International System of Units (SI) as a multiplier of 10⁹ (1 billion, short scale), and therefore 1 gigabit = 10⁹ bits = 1,000,000,000 bits.

An order of magnitude is a factor of ten. A quantity growing by four orders of magnitude implies it has grown by a factor of 10,000 or 10⁴.

The mebibit is a multiple of the bit, a unit of information, prefixed by the standards-based multiplier "mebi" (symbol Mi), a binary prefix meaning 2²⁰. The unit symbol of the mebibit is Mibit.

A unit prefix is a specifier or mnemonic that is prepended to units of measurement to indicate multiples or fractions of the units. Units of various sizes are commonly formed by the use of such prefixes. The prefixes of the metric system, such as kilo and milli, represent multiplication by powers of ten. In information technology it is common to use binary prefixes, which are based on powers of two. Historically, many prefixes have been used or proposed by various sources, but only a narrow set has been recognised by standards organisations.

File size is a measure of how much data a computer file contains or, alternately, how much storage it consumes. Typically, file size is expressed in units of measurement based on the byte. By convention, file size units use either a metric prefix or a binary prefix.

IEEE 1541-2002 is a standard issued in 2002 by the Institute of Electrical and Electronics Engineers (IEEE) concerning the use of prefixes for binary multiples of units of measurement related to digital electronics and computing.

The zebibyte is a multiple of the unit byte for digital information. It is a member of the set of units with binary prefixes defined by the International Electrotechnical Commission (IEC). Its unit symbol is ZiB.

ISO 80000 or IEC 80000 is an international standard promulgated jointly by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).

In telecommunications, data-transfer rate is the average number of bits (bitrate), characters or symbols (baudrate), or data blocks per unit time passing through a communication link in a data-transmission system. Common data rate units are multiples of bits per second (bit/s) and bytes per second (B/s). For example, the data rates of modern residential high-speed Internet connections are commonly expressed in megabits per second (Mbit/s).
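For example, converting an advertised link rate in megabits per second to bytes per second uses the decimal prefix throughout; a sketch with a hypothetical 100 Mbit/s connection:

    # Data rates use the decimal mega (10^6); 8 bits per byte.
    rate_bit_s = 100 * 10 ** 6
    print(f"{rate_bit_s / 8 / 10 ** 6:.1f} MB/s")   # 12.5 MB/s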

This article presents a timeline of binary prefixes used to name memory units, comparing decimal and binary prefixes for the measurement of information and computer storage.

In computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels. In information theory, units of information are also used to measure the entropy of random variables and information contained in messages.

References

  1. "Archived copy". Archived from the original on June 7, 2007. Retrieved June 1, 2007.CS1 maint: archived copy as title (link)
  2. 1 2 3 "Definitions of the SI units: The binary prefixes". National Institute of Standards and Technology.
  3. SanDisk USB Flash Drive "Note: 1 megabyte (MB) = 1 million bytes; 1 gigabyte (GB) = 1 billion bytes."
  4. "How Mac OS X reports drive capacity". Apple Inc. 2009-08-27. Retrieved 2009-10-16.
  5. Tracing the History of the Computer - History of the Floppy Disk
  6. Christley, S. .; Lu, Y. .; Li, C. .; Xie, X. . (2008). "Human genomes as email attachments". Bioinformatics. 25 (2): 274–275. doi:10.1093/bioinformatics/btn582. PMID   18996942.