Architecture of Computers


A Brief History of Computer Architecture

Computer architecture is the science and art of selecting and interconnecting hardware components to create computers that meet functional, performance, and cost goals. It refers to those attributes of the system that are visible to a programmer and have a direct impact on the execution of a program. A computer architect coordinates many levels of abstraction and translates business and technology drivers into efficient systems for computing tasks.

Computer architecture concerns machine organization, interfaces, applications, technology, and measurement & simulation. It includes:

  • Instruction set
  • Data formats
  • Principles of operation (a textual or formal description of every operation)
  • Features (organization of programmable storage, registers used, interrupt mechanism, etc.)

In short, it is the combination of the instruction set architecture, the machine organization, and the underlying hardware.

A Brief History of Computer Architecture:

First Generation (1940-1950) – Vacuum Tube

  • ENIAC – 1945: designed by Mauchly & Eckert and built for the US Army to calculate trajectories for ballistic shells during WWII; it used 18,000 vacuum tubes and 1,500 relays and was programmed by manually setting switches
  • UNIVAC – 1950: the first commercial computer
  • John von Neumann architecture: Goldstine and von Neumann took the idea of ENIAC and developed the concept of storing a program in memory. Known as the “von Neumann” architecture, it has been the basis for virtually every machine designed since.
  • Electron-emitting devices
  • Data and programs are stored in a single read-write memory
  • Memory contents are addressable by location, regardless of the content itself
  • Machine language/assembly language
  • Sequential execution
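The stored-program bullets above can be sketched as a minimal fetch-decode-execute loop. This is a toy illustration of the von Neumann model, not any real machine's instruction set; the three opcodes are invented for the example:

```python
# Toy von Neumann machine: program and data share one memory,
# addressed by location; execution is sequential via a program counter.
# The opcodes (LOAD, ADD, HALT) are invented for this sketch.

def run(memory):
    acc, pc = 0, 0                 # accumulator and program counter
    while True:
        op, arg = memory[pc]       # fetch the instruction stored at pc
        pc += 1                    # sequential execution by default
        if op == "LOAD":
            acc = memory[arg]      # data comes from the same memory
        elif op == "ADD":
            acc += memory[arg]
        elif op == "HALT":
            return acc

# Locations 0-2 hold the program; locations 3-4 hold the data.
program_and_data = {0: ("LOAD", 3), 1: ("ADD", 4), 2: ("HALT", 0),
                    3: 40, 4: 2}
print(run(program_and_data))       # 42
```

Because instructions live in the same addressable memory as data, a program can in principle read or modify itself, which is exactly what distinguishes this model from ENIAC's manually set switches.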

Second Generation (1950-1964) – Transistors

  • William Shockley, John Bardeen, and Walter Brattain invented the transistor, which reduced the size of computers and improved reliability
  • First operating systems, which handled one program at a time
  • On-off switches controlled by electricity
  • High level languages
  • Floating point arithmetic

Third Generation (1964-1974) – Integrated Circuits (IC)

  • Integrated circuits combined thousands of transistors, placing an entire circuit on one computer chip
  • Semiconductor memory
  • Multiple computer models with different performance characteristics
  • Smaller computers that did not need a specialized room

Fourth Generation (1974-present) – Very Large-Scale Integration (VLSI)/Ultra Large-Scale Integration (ULSI)

  • Combines millions of transistors
  • Single-chip processor and the single-board computer emerged
  • Creation of the Personal Computer (PC)
  • Widespread use of data communications
  • Artificial intelligence: Functions & logic predicates
  • Object-Oriented programming: Objects & operations on objects
  • Massively parallel machines

Evolution of Instruction Sets

The Instruction Set Architecture (ISA) is the abstract interface between the hardware and the lowest-level software.

  • 1950: Single Accumulator: EDSAC
  • 1953: Accumulator plus Index Registers: Manchester Mark I, IBM 700 series
  • Separation of programming model from implementation:
    • 1963: High-level-language based: B5000
    • 1964: Concept of a family: IBM 360
  • General-Purpose Register Machines:
    • 1963-1976: Load/Store Architecture: CDC 6600, Cray-1
    • 1977-1980: CISC – Complex Instruction Set Computer: VAX, Intel 432
    • 1987: RISC – Reduced Instruction Set Computer: MIPS, SPARC, HP-PA, IBM RS/6000
    A typical RISC has:
    • Simple instructions, no complex addressing modes
    • Constant-length instructions, 32-bit fixed format
    • Large register file
    • Hardwired control unit, no need for microprogramming
    • Just about the opposite of CISC in every respect
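The fixed 32-bit format listed above is what keeps RISC decoding simple: every field sits at a known bit position, so the decoder is just shifts and masks. A sketch using the MIPS R-type field layout as the example:

```python
# Decoding a fixed-length 32-bit RISC instruction word.
# Field positions follow the MIPS R-type format:
#   opcode[31:26] rs[25:21] rt[20:16] rd[15:11] shamt[10:6] funct[5:0]

def decode_r_type(word):
    return {
        "opcode": (word >> 26) & 0x3F,
        "rs":     (word >> 21) & 0x1F,
        "rt":     (word >> 16) & 0x1F,
        "rd":     (word >> 11) & 0x1F,
        "shamt":  (word >> 6)  & 0x1F,
        "funct":  word         & 0x3F,
    }

# Encode "add $8, $9, $10" (opcode 0, funct 0x20) and decode it back.
word = (0 << 26) | (9 << 21) | (10 << 16) | (8 << 11) | (0 << 6) | 0x20
fields = decode_r_type(word)
print(fields["rs"], fields["rt"], fields["rd"])   # 9 10 8
```

Contrast this with a CISC machine such as the VAX, where instructions are variable-length and the decoder must parse operand specifiers one byte at a time before it even knows where the instruction ends.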

Evolution or Revolution? Major advances in computer architecture are typically associated with landmark instruction set designs. The definition of computer architecture itself has gone through big changes. The following were the main concerns of computer architecture in different eras:

  • 1930-1950: Computer arithmetic
    • Microprogramming
    • Pipelining
    • Cache
    • Timeshared multiprocessor
  • 1960: Operating system support, especially memory management
    • Virtual memory
  • 1970-1980: Instruction Set Design, especially for compilers; Vector processing and shared memory multiprocessors
    • RISC
  • 1990s: Design of CPU, memory system, I/O system, multi-processors, networks
    • CC-UMA multiprocessor
    • CC-NUMA multiprocessor
    • Non-CC-NUMA multiprocessor
    • Message-passing multiprocessor
  • 2000s: Special-purpose architectures, functional reconfigurability, special considerations for low-power/mobile processing, chip multiprocessors, TLP, memory systems
    • Massive SIMD
    • Parallel processing multiprocessor

Under a rapidly changing set of forces, computer technology keeps changing dramatically. For example: processor clock rate improves at about 20% a year and logic capacity at about 30% a year; memory speed improves at about 10% a year, memory capacity at about 60% a year, and cost per bit at about 25% a year; finally, disk capacity increases at about 60% a year. Is there a limit? Is there a virtuous circle? If so, where are we heading?
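Those annual improvement rates compound, and a quick way to feel their effect is to compute doubling times. A small sketch, using the rates quoted above:

```python
import math

# A quantity improving at a fixed annual rate r doubles after t years:
#   (1 + r)**t = 2  =>  t = ln 2 / ln(1 + r)
def doubling_time(annual_rate):
    return math.log(2) / math.log(1 + annual_rate)

rates = {
    "processor clock rate": 0.20,
    "logic capacity":       0.30,
    "memory speed":         0.10,
    "memory capacity":      0.60,
    "disk capacity":        0.60,
}
for name, r in rates.items():
    print(f"{name}: doubles every {doubling_time(r):.1f} years")
```

At 60% a year, capacity doubles roughly every 1.5 years, while memory speed at 10% a year takes over 7 years to double, which is precisely the widening processor-memory gap that drove the cache and memory-system work listed in the eras above.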
