As we know, computers understand only binary numbers, which are represented by two electronic states: high voltage and low voltage.
These binary values are organized into standard codes that represent data in forms useful to users.
Some of the most well-known codes are mentioned below.
Absolute Binary Code
In absolute binary code, a 0 is placed in the most significant bit if the number is positive, and a 1 is placed there if the number is negative. This bit is called the sign bit.
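This sign-bit scheme (often called sign-magnitude representation) can be sketched in a few lines of Python; the function name and bit width here are illustrative, not part of any standard library:

```python
def sign_magnitude(value, bits=8):
    """Encode an integer in sign-magnitude form: the most significant
    bit is the sign bit (0 = positive, 1 = negative) and the remaining
    bits hold the absolute value."""
    magnitude = format(abs(value), f'0{bits - 1}b')
    if len(magnitude) > bits - 1:
        raise ValueError("value does not fit in the given bit width")
    sign = '1' if value < 0 else '0'
    return sign + magnitude

print(sign_magnitude(5))    # 00000101
print(sign_magnitude(-5))   # 10000101
```

Note that +5 and -5 differ only in the leading sign bit; the magnitude bits are identical.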
Binary Coded Decimal (BCD) code
BCD is a way to express each decimal digit with a binary code. In BCD, each decimal digit is represented by a 4-bit binary number. With four bits we can represent sixteen values (0000 to 1111), but only the first ten (0000 to 1001) are used in BCD. The remaining six combinations, 1010 to 1111, are invalid in BCD.
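A minimal Python sketch of this digit-by-digit encoding (the helper function is hypothetical, written just for illustration):

```python
def to_bcd(number):
    """Encode a non-negative decimal number in BCD: each decimal
    digit becomes its own 4-bit group (0000 through 1001)."""
    return ' '.join(format(int(digit), '04b') for digit in str(number))

print(to_bcd(259))  # 0010 0101 1001
```

Each group is the plain 4-bit binary value of one digit, which is why combinations above 1001 never appear in valid BCD.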
ASCII (American Standard Code for Information Interchange)
The American Standard Code for Information Interchange, or ASCII, was created in 1963 by the "American Standards Association" Committee, or "ASA". The agency changed its name in 1969 to the "American National Standards Institute", or "ANSI", as it has been known since.
In the initial phase, only capital letters and numbers were included, but in 1967 lowercase letters and some control characters were added, forming what is known as US-ASCII.
The character codes range from 0 through 127, covering everything needed to write in the English language.
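Python's built-in `ord()` and `chr()` functions expose these character codes directly, as this short sketch shows:

```python
# ord() returns the numeric code of a character; chr() goes the other way.
print(ord('A'))   # 65
print(chr(97))    # a

# Every ASCII character fits in 7 bits (codes 0 through 127).
for ch in "Hi!":
    print(ch, ord(ch), format(ord(ch), '07b'))
```

For example, 'A' is code 65, which is 1000001 in 7-bit binary.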
What is UNICODE?
Unicode is a universal character encoding standard that assigns a code to every character and symbol in every language in the world.
Because no single legacy encoding supports all languages, Unicode is the encoding standard that ensures you can retrieve or combine data using any combination of languages.
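Unicode assigns each character a unique code point, conventionally written U+XXXX; a brief Python sketch (the sample characters are arbitrary choices for illustration):

```python
# Each character has one code point, regardless of language or script.
for ch in "A\u00e9\u20b9":  # Latin 'A', accented 'é', rupee sign '₹'
    print(ch, f"U+{ord(ch):04X}")

# The same code points can be stored as bytes via an encoding such as UTF-8.
print("\u00e9".encode("utf-8"))  # b'\xc3\xa9'
```

The distinction matters: Unicode defines the code points, while encodings like UTF-8 define how those code points become bytes.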