Saturday, 20 December 2014

Computer History

The history of computers spans hardware, architecture, and their influence on software.


  • Computer hardware comprises all the physical parts of a computer, as distinguished from the data it holds or operates on, and from the software that provides the instructions for the hardware to accomplish tasks. The boundary between hardware and software becomes somewhat blurry with firmware, which is software "built into" the hardware. Firmware is an area of computer science and computer engineering rarely encountered by the general user.
  • In computer engineering, computer architecture is the conceptual design and fundamental operational structure of a computer system. It is a blueprint and functional description of the requirements of a piece of hardware (processing speed and system interconnection), focusing mainly on how the CPU works and how data and addresses are moved to and from cache memory, RAM, ROM, hard disks, and so on. Examples of computer architectures include the von Neumann architecture, CISC, RISC, and Blue Gene. Computer architecture can also be defined as the science and art of interconnecting hardware components to create a computer that meets functional, performance, and cost targets.
  • Software is a general term for digitally formatted and stored data, including computer programs, documentation, and other information the computer can read and write; in other words, the intangible part of a computer system. The term contrasts with computer hardware. Some kinds of software are:
  1. Application software, such as word processors, spreadsheets, media players, and office suites like OpenOffice.org.
  2. Operating systems, for example Linux.
  3. Software development tools, such as compilers for high-level programming languages like Pascal and for low-level languages like assembly.
  4. Device drivers, which connect peripheral hardware to the computer; a common example is the driver for the barcode scanners used with database applications in supermarkets and schools.
  5. Firmware, such as that installed in digital watches and remote controls.
  6. Free ("libre") software and open-source software.
  7. Freeware.
  8. Shareware/trialware.
  9. Malicious software (malware).

Understanding Computers
Computers are tools used to process data according to formulated commands. The word "computer" was originally used for people whose job was to perform arithmetic calculations, with or without mechanical aids, but its meaning was later transferred to the machines themselves. Originally, information processing was almost exclusively related to arithmetic problems, but modern computers are used for many tasks unrelated to mathematics.

Broadly, a computer can be defined as an electronic device consisting of several components that cooperate with one another to produce information based on programs and data. These components include the monitor, CPU, keyboard, mouse, and printer (as a complement). Without a printer a computer can still do its job as a data processor, but its output is limited to what is visible on the monitor and cannot be produced in printed form (on paper).

Under such a broad definition, tools like slide rules, mechanical calculators from the abacus onward, and all contemporary electronic computers qualify. Terms better suited to this broad sense of "computer" are "information processor" or "information processing system."

Nowadays, computers have become ever more sophisticated. But computers were not always as small, sophisticated, and light as they are now. The history of computers is usually divided into five generations.


Computer generation

The first generation:
With the onset of the Second World War, the countries involved sought to develop computers to exploit their strategic potential. This increased funding for computer development and accelerated progress in computer engineering. In 1941, Konrad Zuse, a German engineer, built a computer, the Z3, to design airplanes and missiles.
The Allies also made advances in computer development. In 1943, the British completed a secret code-breaking computer called Colossus to decode German secret messages. Colossus had little influence on the development of the computer industry, for two reasons. First, Colossus was not a general-purpose computer; it was designed only to decode secret messages. Second, the machine's existence was kept secret until decades after the war ended.
Work done in America at that time produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing an electronic calculator for the US Navy. It was about half the length of a football field and contained some 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I, was an electromechanical relay computer: it used electromagnetic signals to move mechanical components. The machine was slow (taking 3-5 seconds per calculation) and inflexible (the sequence of calculations could not be changed), but it could perform basic arithmetic as well as more complex equations.

Another computer development of the period was the Electronic Numerical Integrator and Computer (ENIAC), built through a cooperation between the US government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes, 70,000 resistors, and 5 million soldered joints, it was a very large machine that consumed 160 kW of power.

Designed by John Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC was a general-purpose computer that worked 1,000 times faster than the Mark I.
In the mid-1940s, John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that would still be used in computer engineering 40 years later. In 1945, von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC), with a memory that holds both programs and data. This technique allows a computer to stop at some point and later resume its job. The key to the von Neumann architecture is the central processing unit (CPU), which allows all computer functions to be coordinated through a single source. In 1951, the UNIVAC I (Universal Automatic Computer I), made by Remington Rand, became the first commercial computer to use the von Neumann architecture.
Both the US Census Bureau and General Electric owned UNIVACs. One of UNIVAC's impressive achievements was its success in predicting Dwight D. Eisenhower's victory in the 1952 presidential election.
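The stored-program idea at the heart of the von Neumann design can be sketched as a small fetch-decode-execute loop in which instructions and data share one memory. The opcodes and memory layout below are invented for illustration; they do not correspond to any real machine.

```python
# A toy stored-program machine in the spirit of the von Neumann design.
# Instructions and data live in the same memory; the opcodes are invented.

def run(memory):
    """Fetch-decode-execute loop: the program counter walks shared memory."""
    acc = 0  # accumulator register
    pc = 0   # program counter
    while True:
        op, arg = memory[pc]       # fetch the next instruction
        pc += 1
        if op == "LOAD":           # decode and execute
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Program occupies cells 0-3; data lives in cells 4-6 of the same memory.
memory = [
    ("LOAD", 4),   # acc = memory[4]
    ("ADD", 5),    # acc += memory[5]
    ("STORE", 6),  # memory[6] = acc
    ("HALT", 0),
    2, 3, 0,       # data: 2, 3, and a result cell
]
run(memory)
print(memory[6])  # 5
```

Because the program is just data in memory, the machine could in principle modify or replace it and resume work at any point, which is what distinguished this design from fixed-wiring machines like ENIAC.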

First-generation computers were characterized by operating instructions made specifically for a particular task. Each computer had a different binary-coded program called a "machine language", which made the computers difficult to program and limited their speed. Other features of first-generation computers were the use of vacuum tubes (which made the computers very large) and magnetic drums for data storage.

The second generation:
In 1948, the invention of the transistor greatly influenced the development of the computer. The transistor replaced the vacuum tube in televisions, radios, and computers. As a result, the size of electronic machines shrank drastically.

Transistors began to be used in computers in 1956. Together with early advances in magnetic-core memory, transistors made second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to take advantage of this new technology were supercomputers: IBM built a supercomputer named Stretch, and Sperry-Rand built one named LARC. These computers, developed for atomic-energy laboratories, could handle large amounts of data, a capability needed by atomic researchers. The machines were very expensive and tended to be too complex for business computing needs, which limited their popularity; only two LARCs were ever installed and used, one at the Lawrence Radiation Labs in Livermore, California, and the other at the US Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language, which uses abbreviated mnemonics in place of binary code.
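The idea of replacing binary code with mnemonics can be illustrated with a toy assembler. The three-instruction "machine", its opcodes, and its 8-bit word format are all invented here purely to show the translation step.

```python
# A toy assembler: mnemonics stand in for raw binary opcodes.
# The instruction set and encoding are invented for illustration.

OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(lines):
    """Translate 'MNEMONIC operand' lines into 8-bit machine words:
    high nibble = opcode, low nibble = operand address."""
    words = []
    for line in lines:
        mnemonic, operand = line.split()
        words.append((OPCODES[mnemonic] << 4) | int(operand))
    return words

program = ["LOAD 4", "ADD 5", "STORE 6"]
print([f"{w:08b}" for w in assemble(program)])
# ['00010100', '00100101', '00110110']
```

A programmer of the era would write the mnemonic form on the left; the assembler mechanically produced the binary words the machine actually executed, sparing humans from memorizing raw bit patterns.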

In the early 1960s, second-generation computers began to appear successfully in business, in universities, and in government. These were fully transistorized computers, and they contained components we associate with computers today: printers, disk storage, memory, operating systems, and stored programs.

One important example of this period was the IBM 1401, which was widely accepted in industry. By 1965, almost all large businesses were using second-generation computers to process financial information.

The stored program and the programming languages that came with it gave computers flexibility, and this flexibility increased performance at a reasonable price for business use. With this concept, a computer could print customer invoices and, minutes later, design products or calculate paychecks. Several programming languages appeared at this time. COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) came into common use. These languages replaced complicated machine code with words, sentences, and mathematical formulas that humans understand more easily, making it practical for a person to program a computer. A wide variety of new careers emerged (programmer, systems analyst, and computer-systems expert), and the software industry also began to develop during this generation.

The third generation:
Although the transistor surpassed the vacuum tube in many respects, transistors generated considerable heat, which could damage a computer's internal parts. Quartz rock eliminated this problem. Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz sand. Scientists later managed to fit more and more components onto a single chip, called a semiconductor; as a result, computers became smaller as components were squeezed onto the chip. Another third-generation development was the use of operating systems, which allowed a machine to run many different programs at once, with a central program monitoring and coordinating the computer's memory.

The fourth generation:
After the IC, the direction of development became obvious: shrink the size of circuits and electrical components. Large-Scale Integration (LSI) could fit hundreds of components on a chip. By the 1980s, Very-Large-Scale Integration (VLSI) packed thousands of components onto a single chip.

Ultra-Large-Scale Integration (ULSI) increased that number into the millions. The ability to fit so many components on a chip half the size of a coin drove down the price and size of computers, and also increased their power, efficiency, and reliability. The Intel 4004 chip, made in 1971, brought an advance in IC design by putting all the components of a computer (the central processing unit, memory, and input/output control) on one very small chip. Previously, ICs had been made for a specific task; now a single microprocessor could be manufactured and then programmed to meet any requirement. Before long, everyday devices such as microwave ovens, televisions, and cars with electronic fuel injection (EFI) were equipped with microprocessors.

Such developments allowed ordinary people to use computers; computers were no longer the preserve of big companies or government agencies. In the mid-1970s, computer assemblers offered their products to the general public. These computers, called minicomputers, were sold with software packages that laypeople could easily use. The most popular software of the time was word processors and spreadsheets. In the early 1980s, video games such as the Atari 2600 sparked consumer interest in more sophisticated, programmable home computers.

In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million units in 1982; ten years later, 65 million PCs were in use. Computers continued the trend toward smaller sizes, from desktop computers to computers that fit in a bag (laptops), and even computers that can be held in the hand (palmtops).

The IBM PC competed with the Apple Macintosh, introduced in 1984. The Macintosh became famous for popularizing graphical computing, while its rival still used text-based interfaces. The Macintosh also popularized the mouse.

Today we know the line of IBM-compatible CPUs: the IBM PC/486, Pentium, Pentium II, Pentium III, and Pentium IV (a series of CPUs made by Intel). We also know the AMD K6, Athlon, and others. All of these belong to the class of fourth-generation computers.

As computer use spread in the workplace, new ways to exploit their potential were developed. As small computers grew more powerful, they could be connected together in networks to share memory, software, and information, and to communicate with one another. A computer network allows individual computers to cooperate electronically to complete a processing task. Using direct cabling (a Local Area Network, or LAN) or telephone lines, such networks can become very large.

The fifth generation:
Defining the fifth generation of computers is quite difficult because this stage is still very young. The classic imaginative example of a fifth-generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL displays all the functions desired of a fifth-generation computer: with artificial intelligence (AI), it could reason well enough to hold conversations with humans, use visual input, and learn from its own experience.

Although the realization of HAL 9000 may still be far off, many of its functions have already been achieved. Some computers can accept verbal instructions and imitate human reasoning. Translating foreign languages has also become possible. Such a facility looks deceptively simple, but it turned out to be much more complicated than expected once programmers realized that human understanding relies heavily on context and meaning rather than on directly translating words.

Many advances in computer design and technology are making fifth-generation computers increasingly feasible. Two engineering advances stand out. The first is parallel processing, which will replace the von Neumann model of a single CPU with a system able to coordinate many CPUs working in unison. The second is superconductor technology, which allows electricity to flow without resistance, accelerating the speed of information.
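The parallel-processing idea, many workers coordinated on one task instead of a single sequential processor, can be sketched in a few lines. This is a simplified illustration using Python's standard thread pool, not a description of any actual fifth-generation machine.

```python
# A minimal sketch of parallel processing: one task is split across
# several workers, and a coordinator combines their partial results.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Work assigned to one 'CPU': sum its own slice of the data."""
    return sum(chunk)

data = list(range(1, 1001))
# Divide the data into four slices, one per worker.
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]

# Four workers run in unison; their partial sums are combined at the end.
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(partial_sum, chunks))

print(total)  # 500500
```

The coordination step, splitting the work, running the pieces concurrently, and merging the results, is exactly what distinguishes this model from a single von Neumann CPU stepping through the whole list alone.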

Japan is the country best known for pursuing the fifth generation of computers; it established the ICOT (Institute for New Computer Technology) to realize it. Many reports state that the project failed, but other accounts suggest that success in the fifth-generation project would bring a new paradigm to the world of computing.

Source: Wikipedia
Source: http://yan-mil.blogspot.com/2014/11/sejarah-komputer.html