What is ASCII?

ASCII stands for the American Standard Code for Information Interchange, and is pronounced with a hard "C" sound, as ask-ee. As a standard, it was first adopted in 1963 and quickly became widely used throughout the computer world. It defines a set of characters that a computer can display on screen, along with a number of control characters that have special functions.

Basic ASCII uses seven bits to define each character, which allows for 2^7, or 128, distinct codes. This size was chosen based on the common basic building block of computing, the byte, which consists of eight bits. The eighth bit was often set aside for error checking (parity), leaving seven bits for the character set.
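
As a minimal C sketch of this arithmetic, the following prints the size of a seven-bit code space and the seven-bit pattern of a single character (the choice of 'A' is just an example):

    #include <stdio.h>

    int main(void) {
        /* Seven bits give 2^7 = 128 possible codes, numbered 0 through 127. */
        int code_space = 1 << 7;
        printf("7-bit code space: %d values\n", code_space);

        /* The letter 'A' is ASCII code 65; print its seven bits. */
        unsigned char c = 'A';
        printf("'%c' = %d = ", c, c);
        for (int bit = 6; bit >= 0; bit--)
            putchar(((c >> bit) & 1) ? '1' : '0');
        putchar('\n');   /* prints 1000001 */
        return 0;
    }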

There are 33 codes in ASCII that represent things other than printable characters. The first 32 (0-31) are control codes, ranging from the bell, to the line feed command, to the start of heading. The final code, 127, is the delete character. The remaining codes, 32 through 126, are the printable characters. Codes 48-57 represent the numeric digits, codes 65-90 the capital letters, and codes 97-122 the lower-case letters. The rest are the space, punctuation marks, mathematical symbols, and other symbols such as the pipe and tilde.
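
A small C sketch of these ranges, classifying a handful of sample codes (the function name and sample values are illustrative, not part of any standard):

    #include <stdio.h>

    /* Classify an ASCII code according to the ranges described above. */
    static const char *ascii_class(int code) {
        if (code == 127)               return "control (delete)";
        if (code >= 0 && code <= 31)   return "control";
        if (code >= 48 && code <= 57)  return "digit";
        if (code >= 65 && code <= 90)  return "capital letter";
        if (code >= 97 && code <= 122) return "lower-case letter";
        return "space, punctuation, or other symbol";
    }

    int main(void) {
        int samples[] = { 7, 10, 32, 53, 70, 102, 126, 127 };
        for (int i = 0; i < 8; i++)
            printf("%3d -> %s\n", samples[i], ascii_class(samples[i]));
        return 0;
    }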

ASCII was originally conceived as a simpler character set that used six bits rather than seven. Ultimately it was decided that adding lower-case letters, punctuation, and control characters would greatly enhance its usefulness. Not long after its adoption, there was much discussion about possible replacements and adaptations of the code to incorporate non-English and even non-Roman characters. As early as 1972, an ISO standard, ISO 646, was created in an attempt to allow a greater range of characters. ISO 646 had a number of problems, however, and was largely left by the wayside.

The leading successor to the standard is the Unicode character set. It can map far more characters, over a million code points, by using sequences of bytes rather than a single byte to represent a character. Unicode's first 128 code points are identical to ASCII, however, and in the widely used UTF-8 encoding those characters are still stored as the same single bytes, which preserves backward compatibility.
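
To make the backward compatibility concrete, here is a minimal C sketch, assuming the string literals are interpreted as UTF-8: the bytes of a plain ASCII string match their ASCII codes exactly, while a non-ASCII character such as the euro sign takes a multi-byte sequence.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* Plain ASCII text is byte-for-byte identical in UTF-8. */
        const char *ascii_text = "Hi!";
        for (size_t i = 0; i < strlen(ascii_text); i++)
            printf("'%c' is stored as byte 0x%02X\n",
                   ascii_text[i], (unsigned char)ascii_text[i]);

        /* A non-ASCII character needs a multi-byte sequence in UTF-8;
           the euro sign (U+20AC) is encoded as the three bytes E2 82 AC. */
        const char *euro = "\xE2\x82\xAC";
        printf("euro sign uses %zu bytes:", strlen(euro));
        for (size_t i = 0; i < strlen(euro); i++)
            printf(" 0x%02X", (unsigned char)euro[i]);
        putchar('\n');
        return 0;
    }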

The standard is also sometimes discussed in reference to ASCII art, the practice of using the basic character set to create visual approximations of images.