The American Standard Code for Information Interchange (ASCII), standardized as ANSI X3.4, and the Extended Binary Coded Decimal Interchange Code (EBCDIC) are character encodings. Although ASCII has a number of variants, all of them are essentially the same, and the encoding is often praised by computer scientists for its simplicity and adaptability. EBCDIC is considered an anachronism in the computer world, as it was designed around the now-obsolete punched card. ASCII, although also developed in the 1960s, was designed for general data interchange and remains well suited to modern computing.
ASCII descends from telegraph codes such as the Baudot code, a 19th-century alternative to Morse code, while EBCDIC grew out of the binary-coded decimal codes used on IBM's earlier punched-card equipment; the two encodings were designed for different purposes and to different ends. ASCII is essentially a seven-bit code that leaves the eighth, most significant bit (MSB) free for parity checking, although most contemporary computer systems instead use the code points above 127 for extended character sets. Whereas ASCII is common across many different computer systems, EBCDIC is a character set particular to IBM mainframes.
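The parity scheme described above can be sketched as follows; the function names here are illustrative, not from any particular library:

```python
# Even-parity framing of 7-bit ASCII, as used when the eighth
# (most significant) bit served as an error check.

def add_even_parity(byte7: int) -> int:
    """Set the MSB so the 8-bit result has an even number of 1 bits."""
    assert 0 <= byte7 < 128, "plain ASCII is a 7-bit code"
    ones = bin(byte7).count("1")
    return byte7 | 0x80 if ones % 2 else byte7

def check_even_parity(byte8: int) -> bool:
    """True if the received byte has even parity (no single-bit error)."""
    return bin(byte8).count("1") % 2 == 0

framed = add_even_parity(ord("A"))   # 'A' = 0x41 has two 1 bits: MSB stays 0
assert check_even_parity(framed)
```

A single flipped bit in transmission makes the parity odd, which the receiver can detect (though not correct).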
EBCDIC uses all eight available bits and therefore forgoes parity checking, but it has a greater range of control characters. However, the advantages of this character encoding are limited to that fuller set of control characters and to EBCDIC's suitability for punched cards. It also includes the American cent character (¢), which ASCII omits, although it usually leaves out these other characters: [ ] { } ^ ~ and ¦.
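A minimal comparison of the two layouts can be made with Python's standard codecs, which ship IBM code page 037 ("cp037"), one common EBCDIC variant; note how letters, digits, and the cent sign land at entirely different positions:

```python
# Compare code points in ASCII and one EBCDIC variant (cp037).
# Other EBCDIC code pages assign some characters differently.

for ch in ("A", "Z", "0", "¢"):
    ebcdic = ch.encode("cp037")[0]
    ascii_note = hex(ord(ch)) if ord(ch) < 128 else "absent"
    print(f"{ch!r}: ASCII {ascii_note}, EBCDIC cp037 {hex(ebcdic)}")
```

For example, "A" is 0x41 in ASCII but 0xC1 in cp037, and "¢" sits at 0x4A in cp037 while having no ASCII code point at all.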
Some of the characters found in ASCII but missing from EBCDIC fall within the range used by UUencoding, with the consequence that Internet mail attachments passing through EBCDIC systems are often corrupted. Moreover, there are many variants of EBCDIC, some of which are incompatible with each other. This problem is exacerbated because documentation for this character set is hard to obtain from IBM, in contrast to ASCII, which is well documented and widely available.
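Both problems can be illustrated with Python's standard library: `binascii.b2a_uu` produces uuencoded output confined to ASCII 32–95, a range that includes characters such as "[", and the cp037 and cp500 codecs are two EBCDIC variants that disagree on where that character lives:

```python
import binascii

# uuencoded data uses only ASCII characters 32-95 (plus newlines), a
# range that includes "[" and "^", characters whose EBCDIC position
# varies by code page or is absent entirely.
encoded = binascii.b2a_uu(b"hello, world")
assert all(32 <= b <= 95 or b == 10 for b in encoded)

# The same "[" occupies different code points in two EBCDIC variants,
# so text translated with the wrong table comes out corrupted:
print("[".encode("cp037"), "[".encode("cp500"))
```

A gateway that converts mail using one variant's table while the data was produced under another silently substitutes characters, which is exactly the corruption described above.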
There are pieces of software that allow users to convert between the two sets. In any case, it seems only a matter of time before ASCII's status as the de facto encoding becomes de jure among computer users.
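Such a conversion can be sketched with Python's built-in codecs; cp037 is one EBCDIC variant, and real mainframe data may use a different code page:

```python
# Round-trip between EBCDIC and ASCII using Python's standard codecs.

ebcdic_bytes = "HELLO, WORLD".encode("cp037")              # EBCDIC byte values
ascii_bytes = ebcdic_bytes.decode("cp037").encode("ascii") # translate to ASCII

assert ascii_bytes == b"HELLO, WORLD"
assert ebcdic_bytes != ascii_bytes   # the underlying byte values differ throughout
```

Decoding with the correct EBCDIC code page and re-encoding as ASCII is all a converter needs for the characters the two sets share; characters present in only one set are where such tools must make lossy substitutions.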