In computer science a byte (pronounced "bite") is a unit of measurement of information storage, most often consisting of eight bits. In many computer architectures it is a unit of memory addressing.

Originally, a byte was a small group of bits of a size convenient for data such as a single character from a Western character set. Its size was generally determined by the number of possible characters in the supported character set and was chosen to be a submultiple of the computer's word size; historically, bytes have ranged from five to twelve bits. The popularity of IBM's System/360 architecture starting in the 1960s and the explosion of microcomputers based on 8-bit microprocessors in the 1980s made eight bits by far the most common size for a byte. The term octet is widely used as a more precise synonym where ambiguity is undesirable (for example, in protocol definitions).

There has been considerable confusion about the meanings of SI prefixes used with the word "byte", such as kilo- (k or K) and mega- (M). Since computer memory is organized in powers of 2 rather than powers of 10, the industry has often used these SI prefixes to denote the nearest binary quantities (e.g., "kilobyte" for 1,024 bytes rather than 1,000). Because of this ambiguity, a contract specifying a quantity of bytes must define what the prefixes mean in the context of that contract (i.e., the binary approximations or the actual decimal values).
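The gap between the two interpretations can be made concrete with a short sketch (the prefix factors below follow the SI decimal convention and the IEC binary convention, respectively):

```python
# Decimal (SI) prefixes: powers of 1000.
si_prefixes = {"kB": 1000 ** 1, "MB": 1000 ** 2, "GB": 1000 ** 3}

# Binary (IEC) prefixes: powers of 1024.
iec_prefixes = {"KiB": 1024 ** 1, "MiB": 1024 ** 2, "GiB": 1024 ** 3}

# A drive marketed as "1 GB" (decimal) holds fewer bytes than
# 1 GiB (binary) -- the discrepancy grows with each prefix step.
decimal_gb = si_prefixes["GB"]    # 1_000_000_000 bytes
binary_gib = iec_prefixes["GiB"]  # 1_073_741_824 bytes

shortfall = binary_gib - decimal_gb
print(f"1 GiB exceeds 1 GB by {shortfall} bytes "
      f"({shortfall / decimal_gb:.1%})")
# -> 1 GiB exceeds 1 GB by 73741824 bytes (7.4%)
```

At the kilo level the difference is only 2.4%, but at the giga level it is already 7.4%, which is why the distinction matters in contracts and product specifications.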

A byte is one of the basic integral data types in some programming languages, especially system programming languages.
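As an illustration (using Python, chosen here only as an example language), a byte behaves as a small integer in the range 0..255, and sequences of bytes form a distinct data type:

```python
# In Python, a bytes object is an immutable sequence of 8-bit values;
# each element is an integer from 0 to 255.
b = bytes([72, 105])  # two bytes encoding the ASCII characters 'H' and 'i'
print(b)      # b'Hi'
print(b[0])   # 72 -- indexing yields the byte's integer value

# Values that do not fit in a single byte are rejected:
try:
    bytes([256])
except ValueError:
    print("256 does not fit in one byte")
```

Languages such as C, Java, and Go expose a similar 8-bit integral type directly (e.g., `unsigned char`, `byte`, `uint8`).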

