Sunday 6 November 2011

The basis of computers: languages, speed and memory

Computers ultimately run machine code: instructions written in the binary number system of 1s and 0s. Technically, a programmer could write a program directly in machine code, but it would take a tremendous understanding of the inner workings of a PC and an absurd amount of time. That is why programmers use programming languages instead.
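To see the gap between what a programmer writes and what the machine eventually runs, here is a minimal sketch in Python (used for all the examples in this post purely as illustration). The standard dis module shows the lower-level instructions a one-line function is compiled into; actual machine code sits one level further down still, as raw binary.

    import dis

    def add(a, b):
        # One readable high-level statement...
        return a + b

    # ...compiles into several low-level instructions, each of which
    # is ultimately stored and executed as binary data.
    dis.dis(add)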


The number 2 has been the basis of computers since the very beginning. Not, of course, in the Arabic numeral form in which we write it, but as the underlying basis for binary - the language that computers speak. Geeks even have jokes in binary: "There are 10 types of people in this world: those who understand binary and those who don't."


If you got it, chances are you're a nerd, or a computer programmer. If you didn't, well - in binary - the number 2 is written as 10.
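You can check this for yourself: Python's built-in bin() function prints the binary form of any decimal number.

    for n in [1, 2, 3, 10]:
        print(n, "->", bin(n))   # bin(2) gives '0b10': the number 2 is 10 in binary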


Transistors - the basis of all machine computing - carry out calculations by switching themselves on or off, an approach Claude Shannon envisaged for electrical switching circuits in 1937. ('On' is the condition when an electric current flows through the transistor; 'off' is when there is no current.)


Understandably, computers needed a language in which everything could be denoted using these 2 states. Enter binary, where everything on God's green earth can be translated mathematically into zeroes and ones, or 'off' and 'on'. In fact, the clock speed of processors - measured in megahertz and gigahertz - is nothing but a measure of how fast these chips can switch between the two states.
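For instance, even ordinary text reduces to strings of on/off states. A small Python sketch, using the standard character codes:

    for ch in "Hi":
        # ord() gives the character's numeric code; format it as 8 binary digits.
        print(ch, "->", format(ord(ch), "08b"))   # H -> 01001000, i -> 01101001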


The binary system of calculation has existed for centuries. Indian scholar Pingala (circa 5th-2nd centuries BC) was the first to use it. Much, much later, Francis Bacon, the man of ideas, dabbled in it. Gottfried Leibniz, the German mathematician known for his work in the field of calculators, refined it. Briton George Boole picked it up and gave birth to Boolean algebra, which gave rise to Boolean logic.


In fact, binary is so deeply ingrained in computers that Bill Gates once remarked: "I don't think there's anything unique about human intelligence. All the neurons in the brain that make up perceptions and emotions operate in a binary fashion." For engineers and geeks, 2 as the base number became the guiding spirit. This led to the bit, the kilobit (1,024 or 2^10 bits), the megabit (2^20 bits), the gigabit (2^30 bits), so on and so forth.
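Those powers of 2 are easy to verify with a short Python loop (using the binary convention above, where kilo means 1,024 rather than 1,000):

    for name, power in [("kilobit", 10), ("megabit", 20), ("gigabit", 30)]:
        print("1", name, "=", f"2^{power} = {2 ** power:,} bits")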


Moore's Math

In 1965, Gordon E. Moore, one of the co-founders of Intel, predicted that for the following decade the number of transistors in a processor would keep doubling at a steady pace - a pace he later pegged at every two years. It had little scientific basis; it was an empirical observation, not a law of nature. But such was the industry's obsession with "doubling" everything that, ever since, Moore's Law has continued to define the semiconductor industry.
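To get a feel for how quickly that doubling compounds, here is a toy Python projection; the starting figure of roughly 2,300 transistors (the Intel 4004 of 1971) is used purely for illustration.

    transistors = 2_300        # approx. Intel 4004, 1971 (illustrative figure)
    for year in range(1971, 1992, 2):
        print(year, f"{transistors:,} transistors")
        transistors *= 2       # Moore's Law: double every two years

Ten doublings later, the toy model lands in the millions of transistors - roughly where real processors of the early 1990s actually were.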


Moreover, as engineers became obsessed with powers of 2, RAM grew from 2MB to 4MB, from 4MB to 8MB, and from 8MB to 16MB... to now reach an average of 4GB. Storage took the same course: from 16GB to 32GB, 32GB to 64GB...


If computers are affordable nowadays and powerful enough to render extremely complex 3D game scenes in real time, the credit goes to how closely the computing industry has been able to follow Moore's Law. Theoretically, every time the transistor count in processors has doubled, computers have become twice as powerful at half the cost per unit of computing.


Two is a beautiful number. It is the symbol of balance and harmony. In the case of computers and the digital world, the number has proved particularly significant. It has been the number of logic and progress. It has been the base on which mankind has built its virtual castles.
