Evolution of Microprocessor

1.a  Evolution of Microprocessor

Microprocessors are categorized into five generations: first, second, third, fourth, and fifth. Their characteristics are described below:

First generation

The microprocessors introduced between 1971 and 1972 are referred to as first-generation systems. First-generation microprocessors processed their instructions serially: they fetched an instruction, decoded it, and then executed it. When an instruction was completed, the microprocessor updated the instruction pointer and fetched the next instruction, performing this sequential drill for each instruction in turn.
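
As a rough illustration of that serial fetch-decode-execute cycle, here is a minimal Python sketch of a hypothetical accumulator machine (the three-instruction program and its opcodes are invented for illustration, not taken from any real first-generation instruction set):

    # Toy model of strictly serial instruction processing: each instruction is
    # fetched, decoded, and executed to completion before the next one begins.
    program = [("LOAD", 5), ("ADD", 3), ("STORE", 0)]   # hypothetical program
    acc = 0              # accumulator register
    data = [0] * 16      # hypothetical data memory
    ip = 0               # instruction pointer

    while ip < len(program):
        instruction = program[ip]        # fetch
        opcode, operand = instruction    # decode
        if opcode == "LOAD":             # execute
            acc = operand
        elif opcode == "ADD":
            acc += operand
        elif opcode == "STORE":
            data[operand] = acc
        ip += 1                          # update the instruction pointer, then repeat

    print(acc, data[0])                  # prints: 8 8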

Second generation

By the late 1970s, enough transistors were available on the IC to usher in the second generation of microprocessor sophistication: 16-bit arithmetic and pipelined instruction processing. 
Motorola’s MC68000 microprocessor, introduced in 1979, is an example. Another example is Intel’s 8080. This generation is defined by overlapped fetch, decode, and execute steps (Computer 1996). As the first instruction is processed in the execution unit, the second instruction is decoded and the third instruction is fetched. 
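
To make the overlap concrete, here is a small Python sketch of a hypothetical three-stage pipeline (the instruction names and the single-cycle stages are simplifying assumptions, not a model of the MC68000):

    # Toy three-stage pipeline: instruction i is fetched in cycle i, decoded in
    # cycle i+1, and executed in cycle i+2, so in any one cycle three different
    # instructions can occupy the fetch, decode, and execute stages at once.
    program = ["I1", "I2", "I3", "I4", "I5"]
    stages = ["fetch", "decode", "execute"]
    total_cycles = len(program) + len(stages) - 1   # 7 cycles instead of 15 serial steps

    for cycle in range(total_cycles):
        busy = []
        for s, stage in enumerate(stages):
            i = cycle - s                # index of the instruction occupying this stage
            if 0 <= i < len(program):
                busy.append(f"{stage}:{program[i]}")
        print(f"cycle {cycle + 1}: " + "  ".join(busy))
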
The distinction between the first- and second-generation devices was primarily the use of newer semiconductor technology to fabricate the chips. This new technology resulted in a five-fold increase in instruction execution speed and higher chip densities.

Third generation

The third generation, introduced in 1978, was represented by Intel’s 8086 and the Zilog Z8000, which were 16-bit processors with minicomputer-like performance. The third generation came about as IC transistor counts approached 250,000.
Motorola’s MC68020, for example, incorporated an on-chip cache for the first time, and the depth of the pipeline increased to five or more stages. This generation of microprocessors differed from the previous ones in that all major workstation manufacturers began developing their own RISC-based microprocessor architectures (Computer, 1996).
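
As a rough idea of what an on-chip cache buys, here is a Python sketch of a tiny direct-mapped cache (the line count, the backing memory, and the address layout are all invented for illustration):

    # Tiny direct-mapped cache: an address maps to exactly one of 8 lines.
    # A hit is served from the cache; a miss goes to (slower) memory and
    # fills the line so the next access to that address hits.
    LINES = 8
    cache = [None] * LINES                             # each entry: (tag, value) or None
    memory = {addr: addr * 10 for addr in range(64)}   # made-up backing memory

    def read(addr):
        line, tag = addr % LINES, addr // LINES
        entry = cache[line]
        if entry is not None and entry[0] == tag:
            return entry[1], "hit"
        value = memory[addr]          # miss: fetch from main memory
        cache[line] = (tag, value)    # fill the cache line
        return value, "miss"

    print(read(3))   # (30, 'miss')
    print(read(3))   # (30, 'hit')  -- the second access is served on-chip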

Fourth generation

As the workstation companies converted from commercial microprocessors to in-house designs, microprocessors entered their fourth generation with designs surpassing a million transistors. Leading-edge microprocessors such as Intel’s 80960CA and Motorola’s 88100 could issue and retire more than one instruction per clock cycle.
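
The following Python sketch illustrates the idea of issuing more than one instruction per clock (the two-wide issue rule and the register-dependence check are deliberately crude assumptions, far simpler than any real superscalar design):

    # Toy two-way superscalar issue: each cycle, try to issue two adjacent
    # instructions, pairing them only when the second one neither reads nor
    # writes the first one's destination register (a crude dependence check).
    program = [
        ("ADD", "r1", "r2", "r3"),   # r1 = r2 + r3
        ("ADD", "r4", "r5", "r6"),   # independent of the first, so they can pair
        ("SUB", "r7", "r1", "r4"),   # uses r1 and r4 from the previous cycle
        ("ADD", "r8", "r2", "r2"),   # independent of the SUB, so they can pair
    ]

    cycle, i = 0, 0
    while i < len(program):
        issued = [program[i]]
        if i + 1 < len(program):
            first, second = program[i], program[i + 1]
            independent = first[1] != second[1] and first[1] not in second[2:]
            if independent:
                issued.append(second)
        cycle += 1
        print(f"cycle {cycle}: " + " | ".join(f"{op} {dst}" for op, dst, *_ in issued))
        i += len(issued)

    # Four instructions retire in two cycles: more than one per clock.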

Fifth generation

Microprocessors in their fifth generation employed decoupled superscalar processing, and their designs soon surpassed 10 million transistors. In this generation, PCs are a low-margin, high-volume business dominated by a single microprocessor.




1.b Classification of Microprocessor:



A microprocessor is a chip containing control and logic circuits that is capable of making arithmetic and logical decisions based on input data and producing the corresponding arithmetic or logical output. The word ‘processor’ is derived from the word ‘process’, which means to carry out systematic operations on data. The computer we are using to write this page of the manuscript uses a microprocessor to do its work. The microprocessor is the heart of any computer, whether it is a desktop machine, a server, or a laptop. The microprocessor we are using might be a Pentium, a K6, a PowerPC, a SPARC, or any of the many other brands and types of microprocessors, but they all do approximately the same thing in approximately the same way. No logic-enabled device can do anything without one. The microprocessor not only forms the very basis of computers but also of many other devices such as cell phones, satellites, and other handheld devices. Microprocessors are also present in modern cars in the form of microcontrollers.

A microprocessor is classified by the word size of the data it operates on. For example, if the ALU can perform operations on 4 bits of data at a time, the microprocessor is called a 4-bit microprocessor.
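
A short Python sketch of the word-size idea follows; the 4-bit add below is a simplified illustration of how a 4-bit ALU keeps every result within its word, not a model of any particular chip:

    # With a 4-bit word, every result is kept to 4 bits (values 0..15), so an
    # addition that overflows wraps around and raises a carry flag instead.
    WORD_BITS = 4
    MASK = (1 << WORD_BITS) - 1          # 0b1111 for a 4-bit processor

    def alu_add(a, b):
        """Add two operands the way a 4-bit ALU would."""
        total = (a & MASK) + (b & MASK)
        result = total & MASK            # keep only the low 4 bits
        carry = total > MASK             # overflow out of the 4-bit word
        return result, carry

    print(alu_add(9, 5))   # (14, False)
    print(alu_add(9, 8))   # (1, True)  -- 17 wraps around to 1 with the carry set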


4-bit Microprocessors
The first microprocessor was introduced in 1971 by Intel Corp. It was named the Intel 4004 and was a 4-bit processor: a complete processor on a single chip. It could perform simple arithmetic and logic operations such as addition, subtraction, Boolean AND, and Boolean OR. It had a control unit capable of performing control functions such as fetching an instruction from memory, decoding it, and generating the control pulses to execute it. It was able to operate on 4 bits of data at a time. This first microprocessor was quite a success in industry, and soon other microprocessors were introduced. Intel introduced the 4040, an enhanced version of the 4004. Other 4-bit processors include Rockwell International’s PPS-4 and Toshiba’s T3472.
8-bit Microprocessors
The first 8-bit microprocessor, which could perform arithmetic and logic operations on 8-bit words, was introduced in 1973, again by Intel. This was the Intel 8008, which was later followed by an improved version, the Intel 8080. Other 8-bit processors include the Zilog Z80 and the Motorola M6800.
16-bit Microprocessors
The 8-bit processors were followed by 16-bit processors, such as the Intel 8086 and 80286.
32-bit Microprocessors
32-bit microprocessors were introduced by several companies, but the most popular one was the Intel 80386.
Pentium Series
Instead of an 80586, Intel came out with a new processor named the Pentium. Its performance is close to that of RISC processors. The Pentium was followed by the Pentium Pro CPU, which allows multiple CPUs in a single system in order to achieve multiprocessing. The MMX extension was added to the Pentium Pro, and the result was the Pentium II. The low-cost version of the Pentium II is the Celeron.
The Pentium III provided high-performance floating-point operations for certain types of computations by using SIMD extensions to the instruction set. These new instructions make the Pentium III faster than high-end RISC CPUs on such computations.
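
To show what SIMD means in isolation (this is only the concept, not the actual SSE instruction set of the Pentium III), the single operation below is applied to a packed group of four floating-point values at once:

    # SIMD (single instruction, multiple data): one operation is applied to a
    # whole packed group of operands in one step, rather than once per element.
    def simd_add4(a, b):
        """Add two packed groups of four floats, element by element."""
        return tuple(x + y for x, y in zip(a, b))

    xs = (1.0, 2.0, 3.0, 4.0)
    ys = (0.5, 0.5, 0.5, 0.5)
    print(simd_add4(xs, ys))   # (1.5, 2.5, 3.5, 4.5) -- four additions, one "instruction"
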
Interestingly, the Pentium IV could not execute code faster than the Pentium III when running at the same clock frequency, so the Pentium IV had to achieve its speedup by running at a much higher clock frequency.
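
The trade-off is simple arithmetic: execution time depends on both the work done per clock and the clock frequency, so a design that does less per clock must clock higher to come out ahead. The numbers in the Python sketch below are invented for illustration, not measured figures for any real Pentium:

    # time = instruction_count / (instructions_per_clock * clock_frequency)
    instructions = 1_000_000_000

    ipc_a, freq_a = 1.2, 1.0e9     # hypothetical design A: more work per clock
    ipc_b, freq_b = 0.9, 1.6e9     # hypothetical design B: less per clock, higher clock

    time_a = instructions / (ipc_a * freq_a)
    time_b = instructions / (ipc_b * freq_b)
    print(f"A: {time_a:.3f} s   B: {time_b:.3f} s")   # B's higher clock more than compensates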

