Megabyte

Multiple-byte units

Decimal
  Value      Metric
  1000       kB   kilobyte
  1000^2     MB   megabyte
  1000^3     GB   gigabyte
  1000^4     TB   terabyte
  1000^5     PB   petabyte
  1000^6     EB   exabyte
  1000^7     ZB   zettabyte
  1000^8     YB   yottabyte
  1000^9     RB   ronnabyte
  1000^10    QB   quettabyte

Binary
  Value      IEC              Memory
  1024       KiB  kibibyte    KB  kilobyte
  1024^2     MiB  mebibyte    MB  megabyte
  1024^3     GiB  gibibyte    GB  gigabyte
  1024^4     TiB  tebibyte    TB  terabyte
  1024^5     PiB  pebibyte
  1024^6     EiB  exbibyte
  1024^7     ZiB  zebibyte
  1024^8     YiB  yobibyte
  1024^9     -
  1024^10    -

Orders of magnitude of data

The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB. The unit prefix mega is a multiplier of 1,000,000 (10^6) in the International System of Units (SI). [1] Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.

In the computer and information technology fields, other definitions have been used that arose for historical reasons of convenience. A common usage has been to designate one megabyte as 1,048,576 bytes (2^20 B), a quantity that conveniently expresses the binary architecture of digital computer memory. Standards bodies have deprecated this binary usage of the mega- prefix in favor of a new set of binary prefixes, [2] by means of which the quantity 2^20 B is named mebibyte (symbol MiB).

Definitions

The unit megabyte is commonly used for 1000^2 (one million) bytes or 1024^2 bytes. The base-1024 interpretation originated as technical jargon for byte multiples that needed to be expressed in powers of 2 but lacked a convenient name. As 1024 (2^10) approximates 1000 (10^3), roughly corresponding to the SI prefix kilo-, it was a convenient term to denote the binary multiple. In 1999, the International Electrotechnical Commission (IEC) published standards for binary prefixes requiring the use of megabyte to denote 1000^2 bytes, and mebibyte to denote 1024^2 bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, EU, ISO and NIST. Nevertheless, the term megabyte continues to be widely used with different meanings.
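
The closeness of 2^10 to 10^3 also explains why the two conventions drift further apart at each prefix step. The following minimal Python sketch (purely illustrative, not part of any standard) prints the discrepancy for the first few prefixes:

```python
# Compare binary multiples (powers of 1024) with the corresponding
# SI decimal multiples (powers of 1000).
prefixes = ["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"]
for power, name in enumerate(prefixes, start=1):
    binary = 1024 ** power
    decimal = 1000 ** power
    excess = (binary / decimal - 1) * 100
    print(f"{name}: {binary:,} vs {decimal:,} ({excess:.1f}% larger)")
```

At the mega level the binary multiple is already about 4.9% larger than the decimal one, which is why the two readings of "megabyte" differ noticeably in practice.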

Base 10
1 MB = 1,000,000 bytes (= 1000^2 B = 10^6 B) is the definition following the rules of the International System of Units (SI) and the International Electrotechnical Commission (IEC). [2] This definition is used in computer networking contexts and for most storage media, particularly hard drives, flash-based storage, [3] and DVDs, and is also consistent with the other uses of the SI prefix in computing, such as CPU clock speeds or measures of performance. The file manager of Mac OS X 10.6 (Snow Leopard) is a notable example of this usage in software; since that release, file sizes are reported in decimal units. [4]

In this convention, one thousand megabytes (1000 MB) is equal to one gigabyte (1 GB), where 1 GB is one billion bytes.
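
As a concrete illustration of the decimal convention, here is a minimal Python sketch (the 500 GB drive is an arbitrary example, not taken from the article):

```python
MB = 1000 ** 2   # SI megabyte: 1,000,000 bytes
GB = 1000 ** 3   # SI gigabyte: 1,000,000,000 bytes

print(1000 * MB == GB)                        # True: 1000 MB = 1 GB
print(f"A 500 GB drive holds {500 * GB:,} bytes")
```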

Base 2
1 MB = 1,048,576 bytes (= 1024^2 B = 2^20 B) is the definition used by Microsoft Windows in reference to computer memory, such as random-access memory (RAM). This definition is synonymous with the unambiguous binary unit mebibyte. In this convention, one thousand and twenty-four megabytes (1024 MB) is equal to one gigabyte (1 GB), where 1 GB is 1024^3 bytes (i.e., 1 GiB).
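
A short sketch of the binary convention, assuming the Windows-style memory reporting described above; the 8 GB RAM figure is only an illustrative value:

```python
MB_BINARY = 1024 ** 2   # 1,048,576 bytes, the same quantity as 1 MiB
GB_BINARY = 1024 ** 3   # 1,073,741,824 bytes, the same quantity as 1 GiB

print(1024 * MB_BINARY == GB_BINARY)          # True: 1024 MB = 1 GB in this convention
print(f"8 GB of RAM = {8 * GB_BINARY:,} bytes")
```
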
Mixed
1 MB = 1,024,000 bytes (= 1000×1024 B) is the definition used to describe the formatted capacity of the 1.44 MB 3.5-inch HD floppy disk, which actually has a capacity of 1,474,560 bytes. [5]
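
The floppy figure can be checked directly. The sketch below expresses the same raw capacity in all three conventions (the disk itself is the one cited above):

```python
raw_bytes = 1_474_560             # formatted capacity of an HD 3.5-inch floppy

print(raw_bytes / (1000 * 1024))  # 1.44     -> the marketed "1.44 MB" (mixed convention)
print(raw_bytes / 1000 ** 2)      # 1.47456  -> decimal megabytes
print(raw_bytes / 1024 ** 2)      # 1.40625  -> binary megabytes (mebibytes)
```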

Randomly addressable semiconductor memory doubles in size for each address line added to an integrated circuit package, which favors counts that are powers of two. The capacity of a disk drive is the product of the sector size, the number of sectors per track, the number of tracks per side, and the number of disk platters in the drive. Changes in any of these factors would not usually double the size.
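
As a worked example of that product, the sketch below recomputes the HD floppy capacity from its well-known geometry (512-byte sectors, 18 sectors per track, 80 tracks per side, 2 sides; for this single-platter disk the count of recording sides stands in for the platter count):

```python
bytes_per_sector  = 512
sectors_per_track = 18
tracks_per_side   = 80
sides             = 2    # one double-sided platter

capacity = bytes_per_sector * sectors_per_track * tracks_per_side * sides
print(f"{capacity:,} bytes")   # 1,474,560 bytes, i.e. the 1.44 "MB" figure above
```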

Examples of use

1.44 MB floppy disks can store 1,474,560 bytes of data. MB in this context means 1,000×1,024 bytes.

Depending on compression methods and file format, a megabyte can represent widely varying amounts of content. For example:

The novel The Picture of Dorian Gray, by Oscar Wilde, hosted on Project Gutenberg as an uncompressed plain text file, is 0.429 MB. Great Expectations is 0.994 MB, [6] and Moby Dick is 1.192 MB. [7] The human genome consists of DNA representing 800 MB of data. The parts that differentiate one person from another can be compressed to 4 MB. [8]

See also

Related Research Articles

The byte is a unit of digital information that most commonly consists of eight bits. Historically, the byte was the number of bits used to encode a single character of text in a computer, and for this reason it is the smallest addressable unit of memory in many computer architectures. To disambiguate arbitrarily sized bytes from the common 8-bit definition, network protocol documents such as the Internet Protocol refer to an 8-bit byte as an octet. The bits in an octet are usually counted with numbering from 0 to 7 or 7 to 0, depending on the bit endianness.

A binary prefix is a unit prefix that indicates a multiple of a unit of measurement by an integer power of two. The most commonly used binary prefixes are kibi (symbol Ki, meaning 2^10 = 1024), mebi (Mi, 2^20 = 1,048,576), and gibi (Gi, 2^30 = 1,073,741,824). They are most often used in information technology as multipliers of bit and byte, when expressing the capacity of storage devices or the size of computer files.


The gigabyte is a multiple of the unit byte for digital information. The prefix giga means 10^9 in the International System of Units (SI). Therefore, one gigabyte is one billion bytes. The unit symbol for the gigabyte is GB.

Giga- is a unit prefix in the metric system denoting a factor of a short-scale billion or long-scale milliard (10^9 or 1,000,000,000). It has the symbol G.

The kilobyte is a multiple of the unit byte for digital information.

The kilobit is a multiple of the unit bit for digital information or computer storage. The prefix kilo- (symbol k) is defined in the International System of Units (SI) as a multiplier of 10^3 (1 thousand), and therefore 1 kilobit = 1,000 bits.

Mega is a unit prefix in metric systems of units denoting a factor of one million (10^6 or 1,000,000). It has the unit symbol M. It was confirmed for use in the International System of Units (SI) in 1960. Mega comes from Ancient Greek: μέγας, romanized: mégas, lit. 'great'.

The megabit is a multiple of the unit bit for digital information. The prefix mega (symbol M) is defined in the International System of Units (SI) as a multiplier of 10^6 (1 million), and therefore 1 megabit = 1,000,000 bits.

An order of magnitude is usually a factor of ten. Thus, four orders of magnitude is a factor of 10,000 or 10^4.

A unit prefix is a specifier or mnemonic that is prepended to units of measurement to indicate multiples or fractions of the units. Units of various sizes are commonly formed by the use of such prefixes. The prefixes of the metric system, such as kilo and milli, represent multiplication by positive or negative powers of ten. In information technology it is common to use binary prefixes, which are based on powers of two. Historically, many prefixes have been used or proposed by various sources, but only a narrow set has been recognised by standards organisations.

File size is a measure of how much data a computer file contains or, alternately, how much storage it consumes. Typically, file size is expressed in units of measurement based on the byte. By convention, file size units use either a metric prefix or a binary prefix.

IEEE 1541-2002 is a standard issued in 2002 by the Institute of Electrical and Electronics Engineers (IEEE) concerning the use of prefixes for binary multiples of units of measurement related to digital electronics and computing. It was revised and superseded by IEEE 1541-2021, and the 2002 edition is now inactive.

The octet is a unit of digital information in computing and telecommunications that consists of eight bits. The term is often used when the term byte might be ambiguous, as the byte has historically been used for storage units of a variety of sizes.


The Power Macintosh 5500 is a personal computer designed, manufactured, and sold by Apple Computer from February 1997 to March 1998. Like the Power Macintosh 5260 and 5400 that preceded it, the 5500 is an all-in-one design, built around a PowerPC 603ev processor operating at 225, 250 or 275 megahertz (MHz).

ISO 80000 or IEC 80000, Quantities and units, is an international standard describing the International System of Quantities (ISQ). It was developed and promulgated jointly by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC). It serves as a style guide for using physical quantities and units of measurement, formulas involving them, and their corresponding units, in scientific and educational documents for worldwide use. The ISO/IEC 80000 family of standards was completed with the publication of the first edition of Part 1 in November 2009.

The JEDEC memory standards are the specifications for semiconductor memory circuits and similar storage devices promulgated by the Joint Electron Device Engineering Council (JEDEC) Solid State Technology Association, a semiconductor trade and engineering standardization organization.

In telecommunications, data transfer rate is the average number of bits (bitrate), characters or symbols (baudrate), or data blocks per unit time passing through a communication link in a data-transmission system. Common data rate units are multiples of bits per second (bit/s) and bytes per second (B/s). For example, the data rates of modern residential high-speed Internet connections are commonly expressed in megabits per second (Mbit/s).

This timeline of binary prefixes lists events in the history of the evolution, development, and use of units of measure that are germane to the definition of the binary prefixes by the International Electrotechnical Commission (IEC) in 1998, used primarily with units of information such as the bit and the byte.

In digital computing and telecommunications, a unit of information is the capacity of some standard data storage system or communication channel, used to measure the capacities of other systems and channels. In information theory, units of information are also used to measure information contained in messages and the entropy of random variables.

An order of magnitude is generally a factor of ten. A quantity growing by four orders of magnitude implies it has grown by a factor of 10,000 or 10^4. However, because computers are binary, orders of magnitude are sometimes given as powers of two.

References

  1. "SI Prefixes". Bureau international des poids et mesures. Archived from the original on June 7, 2007. Retrieved June 1, 2007.
  2. 1 2 "Definitions of the SI units: The binary prefixes". National Institute of Standards and Technology.
  3. SanDisk USB Flash Drive "Note: 1 megabyte (MB) = 1 million bytes; 1 gigabyte (GB) = 1 billion bytes."
  4. "How Mac OS X reports drive capacity". Apple Inc. 2009-08-27. Retrieved 2009-10-16.
  5. Tracing the History of the Computer - History of the Floppy Disk
  6. Dickens, Charles (July 1, 1998). Great Expectations via Project Gutenberg.
  7. Melville, Herman (July 1, 2001). Moby Dick; Or, The Whale via Project Gutenberg.
  8. Christley, S.; Lu, Y.; Li, C.; Xie, X. (2008). "Human genomes as email attachments". Bioinformatics. 25 (2): 274–275. doi:10.1093/bioinformatics/btn582. PMID 18996942.