Understanding the Computer Language
We all know the most common definition of a computer.
A computer is an electronic machine that processes data into information.
This is true, but have you ever wondered how exactly a computer processes this data to give you that information? Or, put more simply, how do computers understand and respond to our requests?
An average educated person understands the English language at least to some extent, irrespective of where they are from, because English is often described as a universal language. Computers also have their own universal language, which is how they communicate: the binary system, or base-2 numeral system. This means computers can only talk in 1’s and 0’s.
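To see what base 2 looks like in practice, here is a minimal Python sketch (the number 13 is just a sample value) that converts a decimal number into its binary form and back:

```python
# Convert a decimal number to its binary (base-2) representation.
number = 13
binary = bin(number)      # '0b1101' (the '0b' prefix marks a binary value)
print(binary)             # 0b1101

# Convert the binary string back to a decimal number.
# int() takes the string of 1's and 0's plus the base to read it in.
print(int("1101", 2))     # 13
```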
The same way we use letters to create words, sentences, and stories is the same way computers use binary to do all of these, only instead of a, b, c, d, e… computers use only 0 and 1 to create words and give them meaning. In computer terms, a bit is a single digit in the binary system, i.e. either 0 or 1, and we group bits into sets of 8. A group of 8 bits is called a byte. To recap: binary is made up of 0’s and 1’s, a bit is one digit in the binary system (either 0 or 1), and a byte is 8 of those digits.
For example, 01000010 is a byte because it is made up of 8 bits. In computer talk, that byte is the letter B.
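You can check this yourself. Here is a small Python sketch (using that same byte, 01000010) that turns a string of 8 bits into the character it represents, and back again:

```python
# A byte: 8 bits written as a string of 0's and 1's.
byte = "01000010"

# int(byte, 2) reads the bits as a base-2 number (66 here),
# and chr() looks up the character with that code.
print(chr(int(byte, 2)))          # B

# Going the other way: ord() gives a character's numeric code,
# and format(..., '08b') writes it out as 8 bits.
print(format(ord("B"), "08b"))    # 01000010
```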
Our computer reads every request we make in 1’s and 0’s. Character encoding makes it easier to understand the computer language. Character encoding is like a dictionary: it tells the computer which binary value should represent which character (i.e. a, b, -, .). ASCII is the oldest character encoding standard; it represents punctuation, the English alphabet, and digits.
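As an illustration, here is a short Python sketch (the word “Hi” is just a sample input) that encodes a piece of text with ASCII and shows the bits behind each character:

```python
text = "Hi"

# Encode the text using the ASCII standard: each character becomes one byte.
encoded = text.encode("ascii")

# Show each character next to its numeric ASCII code and its 8 bits.
for character, code in zip(text, encoded):
    print(character, code, format(code, "08b"))

# Output:
# H 72 01001000
# i 105 01101001
```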
To understand character encoding better, check https://medium.com/jspoint/introduction-to-character-encoding-3b9735f265a6
With just combinations of 0’s and 1’s grouped into bytes, we are able to represent everything you see on any gadget, from the movies you watch on your laptop to the article you are reading right now.