Chapter 1 Introduction
The Turing Model
The idea of a universal computational device was first described by Alan Turing in 1936. He proposed that all computation could be performed by a special kind of machine, now called a Turing machine.
• Data Processors: A computer is defined as a data processor. The computer acts as a black
box that accepts input data, processes the data and creates output data.
• Problem with this model: it is too general. It does not specify the type of processing, or whether more than one kind of processing can be done (that is, the model does not say how many operations a computer can perform).
• Programmable data processor: a better model for a general-purpose computer. Besides input data, this model also includes a program, a set of instructions that tells the computer what to do with the data.
• In this model, the output data depends on two factors: the input data and the program. Using the same input data, we can generate different outputs by changing the program; using the same program, we can also generate different outputs by changing the input data (see the sketch after this list).
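A minimal Python sketch of this two-factor dependence (the functions double and total are illustrative examples, not from the text): two different "programs" applied to the same input produce different outputs, and the same program applied to different inputs does too.

def double(data):           # one "program"
    return [x * 2 for x in data]

def total(data):            # a different "program"
    return sum(data)

data = [1, 2, 3]
print(double(data))         # [2, 4, 6] -- same input, first program
print(total(data))          # 6         -- same input, second program
print(total([4, 5]))        # 9         -- same program, different input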
The Von Neumann Model
John von Neumann proposed in 1944-1945 that, since program and data are logically the same, programs should also be stored in the memory of a computer.
• This model has four subsystems: memory, arithmetic logic unit, control unit, and
input/output subsystem.
• The memory is a storage area. Programs and data are stored here during processing.
• Arithmetic Logic unit (ALU) is where calculation and logical operations occur.
• The Control Unit controls the operations of the memory, ALU, and input/output subsystem.
• Input/Output: the input subsystem accepts input data and the program from outside the computer, while the output subsystem sends the results of processing to the outside world. This subsystem also includes secondary storage devices such as disk or tape.
• The stored program concept: the von Neumann model states that the program must be stored in memory. This is totally different from the architecture of early computers, in which only data was stored in memory: programs were implemented by manipulating sets of switches or by changing the wiring.
• Sequential execution of instructions: a program in the von Neumann model is made of a finite number of instructions. The control unit fetches one instruction from memory, decodes it, then executes it. Instructions are executed sequentially (see the sketch below).
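A minimal Python sketch of the fetch-decode-execute cycle, assuming a toy instruction set (the opcodes LOAD, ADD, STORE, PRINT, HALT and the single-accumulator design are invented for illustration). Note that the program and its data share the same memory, as the stored program concept requires.

# Toy von Neumann machine: program and data live in one memory.
memory = [
    ("LOAD", 7),     # address 0: acc = memory[7]
    ("ADD", 8),      # address 1: acc = acc + memory[8]
    ("STORE", 9),    # address 2: memory[9] = acc
    ("PRINT", 9),    # address 3: print memory[9]
    ("HALT", None),  # address 4: stop
    None, None,      # addresses 5-6: unused
    40, 2, 0,        # addresses 7-9: data stored alongside the program
]

pc, acc = 0, 0                    # program counter and accumulator
while True:
    opcode, operand = memory[pc]  # fetch one instruction
    pc += 1                       # sequential execution: next address
    if opcode == "LOAD":          # decode and execute
        acc = memory[operand]
    elif opcode == "ADD":
        acc = acc + memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "PRINT":
        print(memory[operand])    # prints 42
    elif opcode == "HALT":
        break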
Computer Components
• Computer hardware is discussed in Chapter 5.
• Data: the von Neumann model does not define how data must be stored in a computer. Since a computer is an electronic device, the best way to store data is as an electrical signal. Data is discussed in more detail in Chapters 3 and 4.
• Data organization is discussed in Chapters 11-14.
• Algorithms are discussed in Chapter 8.
• Languages are discussed in Chapter 9.
• Software engineering is discussed in Chapter 10.
• Operating systems are discussed in Chapter 7.
History
Mechanical Machines: Pre-1930
• 17th century: Blaise Pascal invented the Pascaline, a mechanical calculator for addition and subtraction operations. In the 20th century, Niklaus Wirth invented a programming language and named it Pascal in his honour.
• Late 17th century: the German Gottfried Leibnitz invented a more sophisticated mechanical calculator that could do multiplication and division as well. It was called Leibnitz' Wheel.
• The first machine that used the idea of storage and programming was the Jacquard loom, invented by Joseph-Marie Jacquard at the beginning of the 19th century. The loom used punched cards to control the raising of the warp threads.
• 1823: Charles Babbage invented the Difference Engine, which could do more than simple arithmetic operations: it could solve polynomial equations too. He later invented a machine called the Analytical Engine that parallels a modern computer.
• 1890: Herman Hollerith designed and built a programmable machine that could automatically read, tally, and sort data stored on punched cards.
Electronic Computers: 1930 – 1950
• The first special-purpose computer that encoded information electrically was invented by John V. Atanasoff and Clifford Berry in 1939. It was called the Atanasoff Berry Computer (ABC).
• At about the same time, the German Konrad Zuse designed a general-purpose machine called the Z1.
• In the 1930s, the US Navy and IBM sponsored a project at Harvard University, under the direction of Howard Aiken, to build a huge computer called the Mark I. This computer used both electrical and mechanical components.
• In England, a computer called Colossus was built at Bletchley Park, where Alan Turing worked as a codebreaker, to break encrypted German wartime messages.
• The first general-purpose, totally electronic computer, made by John Mauchly and J. Presper Eckert, was called ENIAC (Electronic Numerical Integrator and Calculator). Completed in 1946, it used 18,000 vacuum tubes, was 100 feet long and 10 feet high, and weighed 30 tons.
Computer Generations: 1950 – Now
• First: 1950 – 1959: characterised by the emergence of commercial computers, which were used only by professionals and were locked in rooms, with access limited to the operator or computer specialist.
• Second: 1959 – 1965: used transistors instead of vacuum tubes, which reduced the size and cost of computers and made them affordable to small and medium-sized corporations. FORTRAN and COBOL were invented and made programming easier.
• Third: 1965 – 1975: the invention of the integrated circuit reduced the cost and size of computers even further. Software packages became available.
• Fourth: 1975 – 1985: saw the appearance of microcomputers; the first desktop computer, the Altair 8800, became available in 1975. This generation also saw the emergence of computer networks.
• Fifth: 1985 – present: has witnessed the appearance of laptop and palmtop computers, improvements in secondary storage media, the use of multimedia, and virtual reality.
Definitions
• Algorithm - An ordered set of unambiguous steps that produces a result and terminates in a
finite time.
• Arithmetic logic unit (ALU) - Where calculation & logical operations take place.
• Computer languages – Any of the syntactical languages used to write programs for computers, such as machine language, assembly language, C, COBOL, and FORTRAN.
• Control unit - Controls the operations of the memory, ALU, and input / output sub-system.
• Data processor – An entity that inputs data, processes it, and outputs the result.
• Digital divide – A social issue that divides people in society into two groups: those who are
electronically connected to the rest of society, and those who are not.
• Input data – User information that is submitted to a computer to run a program.
• Instruction – A command that tells a computer what to do.
• Integrated circuit – Transistors, wiring, and other components on a single chip.
• Memory - Where programs and data are stored.
• Operating system – The software that controls the computing environment and provides an
interface to the user.
• Output data – The results of running a computer program.
• Program – A set of instructions.
• Software engineering - The design & writing of computer programs, following strict rules &
principles.
• Turing machine – A computer model with three components: a tape, a controller, and a read/write head, that can implement statements in a computer language.
• Turing model – A computer model based on Alan Turing's theoretical definition of a computer.
• von Neumann model – A computer model consisting of memory, arithmetic logic unit, control unit, and input/output subsystems, upon which the modern computer is based.
Chapter 3 Data Storage
Data Types
Data today comes in many different forms, including numbers, text, audio, images, and video.
• Data inside a computer is transformed into a uniform representation when it is stored and translated back into its original form when retrieved. This universal representation is called a bit pattern.
• A bit is the smallest unit of data that can be stored in a computer; it has the value 0 or 1. Think of it as a switch that is either on (representing 1) or off (representing 0), although the opposite convention can also be used.
• A bit pattern is a sequence of bits; a bit pattern with a length of 8 bits is called a byte. Bit patterns can represent many different forms of data, such as text, audio, and so on (see the sketch after this list).
• In order to occupy less space, data can be compressed before being stored.
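As a short illustration of the idea that one bit pattern can represent different types of data depending on how it is interpreted, here is a Python sketch (the sample pattern is arbitrary; the text interpretation assumes ASCII encoding):

pattern = "01000001"      # one byte: a bit pattern of length 8

value = int(pattern, 2)   # interpreted as an unsigned integer: 65
char = chr(value)         # interpreted as ASCII text: 'A'
print(value, char)        # prints: 65 A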
Storing Numbers
A number is changed to the binary system before being stored in the computer's memory. Two questions remain: how to store the sign of the number, and how to show the decimal point.
• Fixed-point representation (the position of the decimal point is assumed but not stored; used for integers).
- Unsigned representation (an integer that can never be negative).
With n bits, the maximum unsigned integer is 2^n − 1 (for example, with n = 8 the maximum is 255).
An input device stores an unsigned integer by:
1. Changing the integer to binary.
2. If the number of bits is less than n, adding 0s to the left of the binary integer so that there is a total of n bits.
If the integer is larger than the maximum unsigned integer (2^n − 1), it cannot be stored in n bits: this condition is called overflow (see the sketch below).
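A minimal Python sketch of this storage procedure (the function name store_unsigned is illustrative):

def store_unsigned(value, n):
    # An n-bit location holds 0 through 2**n - 1; anything else overflows.
    if value < 0 or value > 2**n - 1:
        raise OverflowError(f"{value} does not fit in {n} unsigned bits")
    bits = bin(value)[2:]   # step 1: change the integer to binary
    return bits.zfill(n)    # step 2: add 0s to the left to reach n bits

print(store_unsigned(7, 8))     # 00000111
print(store_unsigned(255, 8))   # 11111111
# store_unsigned(256, 8) raises OverflowError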
Applications of unsigned integers:
• Counting: when we count, we don't need negative numbers.