Kerala Plus One Computer Science Notes Chapter 1 The Discipline of Computing
The concept of computing has evolved from the abacus to the supercomputers of today. This chapter traces the evolution of computing devices, provides an overview of the different generations of computers and outlines the evolution of programming languages.
Computing Milestones and Machine Evolution
In ancient times people used stones for counting. Different counting methods are:
Counting and Evolution of the Positional Number System:
Each number has a weight. In order to count items, such as animals, sticks were used, each stick representing one animal or object.
- The Egyptian number system (3000 BC) used 10 as a radix (base) and was written from right to left.
- The Sumerian/Babylonian number system used 60 as its number base, known as the sexagesimal system, and was written from left to right.
- The Chinese number system (2500 BC) used the numbers from 1 to 9, represented using small bamboo rods.
- Greek/Ionian number system (500 BC) was a decimal number system.
- The Romans used 7 letters (I, V, X, L, C, D and M) of the alphabet for representing numbers.
- The Mayans used a number system with base 20.
- The Hindu-Arabic numeral system originated in India, around 1500 years ago. It was a positional decimal numeral system and had a symbol for zero.
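The power of a positional system is that a digit's value depends on its place: each position carries a weight equal to a power of the base. A minimal sketch (digit symbols shown as plain integers for illustration, covering both base 10 and the Mayan base 20):

```python
def positional_value(digits, base):
    """Interpret a list of digits (most significant first) in the given base."""
    value = 0
    for d in digits:
        # each step shifts the accumulated value one place left in that base
        value = value * base + d
    return value

print(positional_value([2, 0, 7], 10))  # 2*100 + 0*10 + 7 = 207
print(positional_value([1, 0], 20))     # 1*20 + 0 = 20 (base 20, as the Mayans used)
```

The same rule works for any base, which is why the Hindu-Arabic decimal system, with its symbol for zero as a place holder, proved so versatile.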
Evolution of the Computing Machine
During the period from 3000 BC – 1450 AD, human beings started communicating and sharing information with the aid of simple drawings and later through writings.
1. Abacus (3000 BC): It was developed by the Mesopotamians. The word 'abacus' means calculating board, and it is also known as a counting frame. An abacus consists of beads on movable rods divided into two parts. The abacus may be considered the first computer for basic arithmetical calculations. It works on the basis of the place value system.
2. Napier’s bones (1617 AD): John Napier devised a set of numbering rods, known as Napier’s bones, with which a multiplication problem could be performed easily.
There are 10 bones corresponding to the digits 0-9 and a special eleventh bone that is used to represent the multiplier. John Napier also invented logarithms, in 1614.
3. Pascaline (1642): Blaise Pascal developed a computing machine that was capable of adding and subtracting two numbers directly and could multiply and divide by repetition. The machine was operated by dialling a series of wheels, gears and cylinders. He called it the ‘Pascaline’. Initially, the Pascaline was set to 0 for all six digits.
4. Leibniz’s calculator (1673):
Leibniz designed a calculating machine called the Step Reckoner. The Step Reckoner expanded on Pascal’s ideas and extended the capabilities to perform multiplication and division as well.
5. Jacquard’s loom (1801):
It was invented by Joseph Marie Jacquard to simplify the process of manufacturing textiles with complex patterns. The loom was controlled by punched cards, each row of holes corresponding to one row of the design. It also allowed patterns to be stored on cards and reused to create the same product. This ability to store information triggered the computer revolution. The punched card concept was later adopted by Charles Babbage to control his Analytical Engine, and later still by Herman Hollerith.
6. Difference engine:
The first step towards the creation of computers was made by Charles Babbage. He started working on a difference engine that could perform and print results automatically. In 1822, Babbage invented the Difference Engine to compile mathematical tables.
7. Analytical engine:
It is the real predecessor of the modern day computer. The Analytical Engine marks the development from arithmetic calculation to general-purpose computation. The Engine had a ‘Store’ (memory) where numbers and intermediate results could be stored, and a separate ‘Mill’ (processor) where arithmetic processing could be performed. Its input/output devices were in the form of punched cards containing instructions. These instructions were written by Babbage’s assistant, Augusta Ada King, the first programmer in the world. Charles Babbage’s great inventions, the Difference Engine and the Analytical Engine, earned him the title ‘Father of the Computer’.
8. Hollerith’s machine (1887):
Herman Hollerith fabricated the first electromechanical punched card tabulator, which used punched cards for input, output and instructions. The cards had holes punched in particular patterns, with a special meaning for each kind of data. Hollerith’s greatest breakthrough was his use of electricity to read, count and sort punched cards whose holes represented data. In 1896 he founded the Tabulating Machine Company, which became the International Business Machines (IBM) Corporation in 1924.
9. Mark – I (1944):
Howard Aiken constructed a large automatic electromechanical computer. Aiken’s machine, the Mark I, realised many of the ideas of Babbage’s Analytical Engine; it handled numbers of up to 23 decimal digits and could perform all four arithmetic operations. It could also be preprogrammed to handle logarithms and trigonometric functions. For input and output, it used paper-tape readers, card readers, a card punch and typewriters.
Generations of Computers
Based on the major technological development at each stage, the evolution of computers is divided into five generations.
1. First Generation Computers (1940-56):
The first generation computers were built using vacuum tubes. This generation implemented the stored program concept. The first general purpose programmable electronic computer, the Electronic Numerical Integrator and Calculator (ENIAC), built by J. Presper Eckert and John W. Mauchly, belongs to this generation. Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory to hold both the stored program and the data. Eckert and Mauchly later developed the first commercially successful computer, the Universal Automatic Computer (UNIVAC).
Von Neumann architecture:
Von Neumann architecture consists of a central processing unit (CPU) containing arithmetic logic unit (ALU) and control unit (CU), input-output unit and a memory for storing data and instructions. This model implements the ‘Stored Program Concept’ in which the data and the instructions are stored in the memory.
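The stored program concept can be sketched as a toy fetch-execute loop in which instructions and data share a single memory. The four-instruction set (LOAD/ADD/STORE/HALT) below is invented purely for illustration:

```python
# One memory holds both instructions (addresses 0-3) and data (addresses 7-9).
memory = [
    ("LOAD", 7),   # address 0: load memory[7] into the accumulator
    ("ADD", 8),    # address 1: add memory[8] to the accumulator
    ("STORE", 9),  # address 2: store the accumulator at memory[9]
    ("HALT", 0),   # address 3: stop
    0, 0, 0,       # addresses 4-6: unused
    5,             # address 7: data
    3,             # address 8: data
    0,             # address 9: the result goes here
]

pc = 0             # program counter kept by the control unit
accumulator = 0    # a register inside the ALU
while True:
    op, addr = memory[pc]   # fetch the next instruction
    pc += 1
    if op == "LOAD":        # decode and execute
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print(memory[9])  # 8
```

Because the program is itself data in memory, it can be replaced or even modified without rewiring the machine, which is the essential advance of this architecture.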
2. Second Generation Computers (1956-63):
In second generation computers, vacuum tubes were replaced by transistors. This allowed computers to become smaller, more powerful and faster. They also became less expensive, required less electricity and emitted less heat, and their manufacturing cost was lower. The concept of programming languages was developed. This generation used magnetic core memory for primary storage and magnetic disk memory for secondary storage. During the second generation, high level programming languages such as FORTRAN and COBOL were introduced. IBM 1401 and IBM 1620 are computers of this generation.
3. Third Generation Computers (1964-71):
Third generation computers were smaller in size due to the use of integrated circuits (ICs). ICs drastically reduced the size and increased the speed and efficiency of computers. Keyboards and monitors were introduced. The high level language BASIC was developed during this period. Computers of this generation include the IBM 360 and IBM 370.
4. Fourth Generation Computers (1971 onwards):
The computers that we use today belong to this generation. They use microprocessors built with Large Scale Integration (LSI) and are called microcomputers. These computers are smaller in size and have faster accessing and processing speeds. The IBM PC is an example of a computer of this generation.
5. Fifth Generation Computers (future):
Fifth generation computers are based on Artificial Intelligence (AI). The two most common AI programming languages are LISP and Prolog. The fifth generation computing also aims at developing computing machines that respond to natural language input and are capable of learning and self-organisation.
Evolution of Programming Languages: A programming language is an artificial language designed to communicate instructions to a computer. Programming languages can be used to create programs that control the behaviour of a machine and/or to express algorithms.
These are divided into:
a. Machine language:
The first programming language developed for use in computers was called machine language. Machine language consisted of strings of the binary digits 0 and 1. This language had many drawbacks like difficulty in finding and rectifying programming errors and its machine dependency. The programmer also needed to have a good knowledge of the computer architecture.
b. Assembly language:
To make programming easier, a new language was developed whose instructions consist of English-like words instead of 0s and 1s. This language was called assembly language. It is specific to a given machine, and programs written in it are not transferable from one machine to another.
c. High level languages:
These are machine independent and use simple English-like words and statements. They allow people with less knowledge of computer architecture to develop programs that are easy to understand.
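As a rough illustration of these levels, CPython's standard `dis` module can show the lower-level instructions (bytecode, not true machine language) that a single high-level statement compiles to:

```python
import dis

def add(a, b):
    return a + b   # one high-level statement...

# ...expands into several lower-level load/add/return instructions
dis.dis(add)
```

The exact instruction names vary between Python versions, but the point stands: each high-level statement hides several machine-level steps, which is why high level languages are easier for people to read and write.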
An effective tool for planning a computer program is an algorithm. An algorithm provides a step by step solution for a given problem. These steps can then be converted to machine instructions using a programming language.
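For example, the steps of an algorithm to find the largest of a list of numbers map directly onto program statements (a sketch in Python):

```python
# Algorithm:
#   Step 1: take the first number as the current largest.
#   Step 2: examine each remaining number in turn.
#   Step 3: if a number is bigger, it becomes the current largest.
#   Step 4: after all numbers are checked, output the largest.

def largest(numbers):
    current = numbers[0]        # Step 1
    for n in numbers[1:]:       # Step 2
        if n > current:         # Step 3
            current = n
    return current              # Step 4

print(largest([12, 45, 7, 30]))  # 45
```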
Theory of Computing:
The theory of computation is the branch that deals with how efficiently problems can be solved based on computation models and related algorithms. In order to perform a rigorous study of computation, computer scientists work with a mathematical abstraction of computers called a model of computation.
Alan Turing made significant contributions to the development of computer science by presenting the concepts of algorithm and computation with the help of his invention, the Turing machine, which is considered a theoretical model of a general purpose computer. Considering these contributions, he is regarded as the father of modern computer science as well as of artificial intelligence.
The Turing machine is a model of a computer proposed by Alan Turing. It is a theoretical computing device that prints symbols on a paper tape in a manner that emulates a person following a series of logical instructions. A Turing machine consists of an infinitely long tape, which acts like the memory of a computer.
The action of a Turing machine is determined by:
- The current state of the machine.
- The symbol in the cell currently being scanned by the head.
- A table of transition rules, which serve as the ‘program’ for the machine.
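The three ingredients listed above can be captured in a short simulator. The particular states and rules below are invented for illustration; this machine flips every bit on its tape and then halts:

```python
def run(tape, rules, state="start"):
    tape = dict(enumerate(tape))   # sparse tape; unwritten cells are blank
    head = 0
    while state != "halt":
        symbol = tape.get(head, "_")                 # "_" marks a blank cell
        # the transition table maps (state, symbol) to an action
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return [tape[i] for i in sorted(tape) if tape[i] != "_"]

# (current state, scanned symbol) -> (symbol to write, head move, next state)
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run(["1", "0", "1"], rules))  # ['0', '1', '0']
```

Despite its simplicity, this model can express any computation a modern computer can perform, which is why it serves as the standard theoretical model of a general purpose computer.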