Binary Language of Computers: Definition & Overview

Binary Numbers Definition

A computer system stores and processes information in binary only: two possible values, 0 and 1. The computer translates between this binary information and the information users actually work with, such as decimal numbers, text, photos, sound, and video. Binary information is also known as machine language because it represents the most fundamental level of information stored in a computer system. Binary information can be stored magnetically, using two different polarities to represent 0s and 1s, while an optical disc such as a CD-ROM or DVD stores binary information in the form of pits and lands.
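
To make this translation a little more concrete, here is a minimal sketch in Python (the language is our choice; the article itself names none) showing how a decimal number and a short piece of text can be converted to and from strings of binary digits using standard built-ins:

    # Decimal number -> binary digits and back.
    number = 42
    binary = format(number, "b")         # '101010'
    back = int(binary, 2)                # 42
    print(number, "->", binary, "->", back)

    # Text -> one 8-bit pattern per character (ASCII encoding assumed).
    text = "Hi"
    bits = [format(b, "08b") for b in text.encode("ascii")]
    print(text, "->", bits)              # ['01001000', '01101001']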

Binary Notation

In binary notation, each digit is a bit that can hold one of two values, 0 or 1; to represent more than two values, multiple bits are combined. A combination of two bits can represent 4 different values, and a combination of three bits can represent 8 different values. In general, n bits can represent 2^n different values. Representing large numbers therefore requires more bits, which is why modern computers use 32-bit and 64-bit architectures. A 32-bit value can take on 2^32 = 4,294,967,296 different values, corresponding to the decimal numbers 0 through 4,294,967,295.
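
The 2^n rule is easy to check directly. A small Python sketch (again, our choice of language, not something prescribed by the article) prints the number of distinct values for a few common bit widths:

    # Number of distinct values representable with n bits is 2 ** n.
    for n in (1, 2, 3, 8, 16, 32, 64):
        values = 2 ** n
        print(f"{n:2d} bits -> {values:,} values (0 to {values - 1:,})")

For n = 32 this prints 4,294,967,296 values, ranging from 0 to 4,294,967,295.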

Binary Coding

The same scheme used to represent numbers can also be used to represent text; all that is needed is a coding scheme, similar to the example discussed earlier. The first question is how many characters must be represented. There are 26 lowercase and 26 uppercase letters, or 52 characters in total, and we also need characters for punctuation, numeric digits, and special symbols, plus the characters of many different languages. Collections of such characters are known as standard character sets, and several have been developed over the years, among them ASCII and Unicode. ASCII, the American Standard Code for Information Interchange, is a binary coding system that developed from telegraphic codes adapted to represent text in binary. The original version of ASCII uses 7 bits and represents 128 different characters; extended versions use 8 bits, and other encodings use 16 or 32 bits per character. Unicode, which came after ASCII and remains in wide use, rests on the same underlying principle but contains more than 110,000 characters, covering most written languages.
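
The relationship between ASCII code points and Unicode encodings can be seen in a few lines of Python (a rough sketch; the specific characters shown here are illustrative examples, not part of the original article):

    # ASCII: each character maps to a 7-bit code point (0-127).
    for ch in ("A", "a", "?", "7"):
        print(ch, "->", ord(ch), "->", format(ord(ch), "07b"))

    # Unicode assigns code points to far more characters; UTF-8 stores
    # each one as 1 to 4 bytes while staying compatible with 7-bit ASCII.
    for ch in ("A", "é", "中"):
        print(ch, "U+%04X" % ord(ch), "->", ch.encode("utf-8").hex(" "))

Running this shows that 'A' keeps the same single byte in both ASCII and UTF-8, while characters outside the ASCII range occupy two or three bytes.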
