User talk:Brettlee04

ASCII is an acronym for the American Standard Code for Information Interchange. Pronounced ask-ee, ASCII is a code for representing English characters as numbers, with each character assigned a number from 0 to 127. For example, the ASCII code for uppercase M is 77. Most computers use ASCII codes to represent text, which makes it possible to transfer data from one computer to another.
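To make the numbering concrete, here is a minimal sketch using Python's built-in ord and chr functions, which convert between one-character strings and their code numbers:

```python
# Convert between characters and their ASCII code numbers.
print(ord("M"))            # 77, the ASCII code for uppercase M
print(chr(77))             # 'M'
print(chr(65), chr(97))    # 'A' (65) and 'a' (97): the cases sit 32 codes apart
```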

Text files stored in ASCII format are sometimes called ASCII files. Text editors and word processors are usually capable of storing data in ASCII format, although ASCII format is not always the default storage format. Most data files, particularly if they contain numeric data, are not stored in ASCII format. Executable programs are generally not stored in ASCII format.
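As an illustration of that distinction, one way to test whether a file is plain ASCII is to check that every byte falls in the 7-bit range; a minimal sketch (the file name is hypothetical):

```python
def is_ascii_file(path):
    """Return True if every byte in the file fits in the 7-bit ASCII range."""
    with open(path, "rb") as f:
        return all(byte < 128 for byte in f.read())

# Hypothetical usage: a plain text file usually passes, a compiled executable will not.
print(is_ascii_file("notes.txt"))
```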

The standard ASCII character set uses just 7 bits for each character. There are several larger character sets that use 8 bits, which gives them 128 additional characters. The extra characters are used to represent non-English characters, graphics symbols, and mathematical symbols. Several companies and organizations have proposed extensions for these 128 characters. The DOS operating system uses a superset of ASCII called extended ASCII or high ASCII. A more universal standard is the ISO Latin-1 (ISO/IEC 8859-1) character set, which is used by many operating systems as well as Web browsers.
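Since these 8-bit extensions agree only on the first 128 codes, the same byte can decode to different characters depending on which extension is assumed. A sketch using Python's codecs for ISO Latin-1 and DOS code page 437, one common "extended ASCII":

```python
raw = b"caf\xe9"   # the byte 0xE9 lies above the 7-bit ASCII range

print(raw.decode("latin-1"))   # 'café'  under ISO Latin-1
print(raw.decode("cp437"))     # 'cafΘ'  under DOS code page 437
# raw.decode("ascii") would raise UnicodeDecodeError: 0xE9 is not ASCII
```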

Another set of codes, used on large IBM computers, is EBCDIC. ASCII itself, pronounced /ˈæski/,[1] is a character encoding based on the English alphabet. ASCII codes represent text in computers, communications equipment, and other devices that work with text. Most modern character encodings, which support many more characters than the original did, have a historical basis in ASCII.
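The contrast with EBCDIC is easy to demonstrate with Python's codec for one common EBCDIC variant, code page 037:

```python
# The same letter maps to different code numbers in the two encodings.
print("M".encode("ascii")[0])   # 77  (0x4D) in ASCII
print("M".encode("cp037")[0])   # 212 (0xD4) in EBCDIC code page 037
```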

Historically, ASCII developed from telegraphic codes, and its first commercial use was as a seven-bit teleprinter code promoted by Bell data services. Work on ASCII formally began on October 6, 1960, with the first meeting of the ASA X3.2 subcommittee. The first edition of the standard was published in 1963,[2][3] a major revision appeared in 1967,[4] and the most recent update came in 1986.[5] Compared to earlier telegraph codes, the proposed Bell code and ASCII were both ordered for more convenient sorting (i.e., alphabetization) of lists and added features for devices other than teleprinters. Some ASCII features, including the escape sequence,[6] were due to Robert Bemer.

ASCII includes definitions for 128 characters: 33 are non-printing, mostly obsolete control characters that affect how text is processed, and 94 are printable characters (95 counting the space). The ASCII character encoding,[7] or a compatible extension, is used on nearly all common computers, especially personal computers and workstations. ASCII was developed under the auspices of a committee of the American Standards Association (ASA), called the X3 committee, by its X3.2 (later X3L2) subcommittee, and later by that subcommittee's X3.2.4 working group. The ASA became the United States of America Standards Institute (USASI)[8] and ultimately the American National Standards Institute.
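That breakdown of the 128 codes can be verified by partitioning them into the ranges the standard defines; a minimal sketch:

```python
codes = range(128)
control   = [c for c in codes if c < 32 or c == 127]   # 32 low controls plus DEL
printable = [c for c in codes if 32 < c < 127]         # visible graphic characters
space     = [c for c in codes if c == 32]

print(len(control), len(printable), len(space))        # 33 94 1
```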

The X3.2 subcommittee designed ASCII based on earlier teleprinter encoding systems. Like other character encodings, ASCII specifies a correspondence between digital bit patterns and character symbols (i.e., graphemes and control characters). This allows digital devices to communicate with each other and to process, store, and communicate character-oriented information such as written language. The encodings in use before ASCII included 26 alphabetic characters, 10 numerical digits, and from 11 to 25 special graphic symbols. To include control characters compatible with the Comité Consultatif International Téléphonique et Télégraphique standard, Fieldata, and early EBCDIC, more than 64 codes were required.

The committee debated the possibility of a shift function (like that of the Baudot code), which would allow more than 64 codes to be represented by six bits. In a shifted code, some character codes determine choices between options for the following character codes. This allows compact encoding but is less reliable for data transmission: an error in transmitting the shift code typically makes a long part of the transmission unreadable. The standards committee decided against shifting, and so ASCII required at least a seven-bit code.[9]
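To see why the committee rejected shifting, consider a toy two-page code (an illustration only, not Baudot itself): one reserved code toggles between a letters page and a figures page, so losing a single shift character garbles everything that follows it:

```python
# Toy shifted code: code 0 toggles pages, codes 1..26 index into the current page.
LETTERS = " ABCDEFGHIJKLMNOPQRSTUVWXYZ"    # index 1..26 -> A..Z
FIGURES = " 1234567890-'!&#(),./:;?=+$"    # a hypothetical figures page
SHIFT = 0

def decode(codes):
    page, out = LETTERS, []
    for c in codes:
        if c == SHIFT:
            page = FIGURES if page is LETTERS else LETTERS  # flip pages
        else:
            out.append(page[c])
    return "".join(out)

msg = [16, 1, 25, 0, 2, 5]            # P, A, Y, shift, then figures 2 and 5
print(decode(msg))                    # 'PAY25'
print(decode([16, 1, 25, 2, 5]))      # same message minus the shift: 'PAYBE'
```

A single dropped shift code turns the trailing digits into letters; in a long message, everything after the error is misread.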

The committee considered an eight-bit code, since eight bits would allow two four-bit patterns to efficiently encode two digits with binary-coded decimal. However, this would require all data transmission to send eight bits when seven could suffice. The committee voted to use a seven-bit code to minimize costs associated with data transmission. Since perforated tape at the time could record eight bits in one position, this also allowed for a parity bit for error checking if desired.[10] Machines with octets as the native data type that did not use parity checking typically set the eighth bit to 0.[11]
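A sketch of that scheme: the seven data bits occupy the low positions of the octet, and the eighth bit is set so that the total number of 1 bits is even, letting the receiver detect any single-bit transmission error:

```python
def add_even_parity(code):
    """Pack a 7-bit ASCII code into an octet with an even-parity eighth bit."""
    assert 0 <= code < 128
    parity = bin(code).count("1") % 2   # 1 if the seven data bits have odd weight
    return code | (parity << 7)

byte = add_even_parity(ord("M"))        # 77 = 0b1001101 has four 1 bits
print(bin(byte))                        # 0b1001101: parity bit stays 0
print(bin(byte).count("1") % 2 == 0)    # True: total weight is even
```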
