Definition

computer

A computer is a device that accepts information (in the form of digitized data) and manipulates it for some result based on a program, software, or sequence of instructions that tells it how the data is to be processed.

Complex computers include the means for storing data (including the program, which is also a form of data) for some necessary duration. A program may be invariable and built into the computer's hardware (as logic circuitry, as it is in microprocessors), or different programs may be provided to the computer (loaded into its storage and then started by an administrator or user). Today's computers have both kinds of programming.

Major types of computers

Analog computer - represents data by measurable quantities
Desktop computer - a personal computer that fits on a desk and is often used for business or gaming
Digital computer - operates with numbers expressed as digits
Hybrid computer - combines features of both analog and digital computers
Laptop (notebook) - an easily transported computer that is smaller than a briefcase
Mainframe (big iron) computer - a centralized computer used for large-scale computing
Microcomputer - generally referred to as a PC (personal computer); uses a microprocessor on a single integrated semiconductor chip
Minicomputer - an antiquated term for a computer that is smaller than a mainframe and larger than a microcomputer
Netbook - a smaller and less powerful version of a laptop
Personal computer (PC) - a digital computer designed to be used by one person at a time
Smartphone - a cellular telephone designed with an integrated computer
Supercomputer - a high-performance computer that operates at extremely high speeds
Tablet computer (tablet PC) - a wireless personal computer with a touch screen
Workstation - equipment designed for a single user to complete a specialized technical/scientific task

History of the modern computer

Most histories of the modern computer begin with the Analytical Engine envisioned by Charles Babbage, together with the mathematical ideas of George Boole, the mathematician who first stated the principles of logic inherent in today's digital computer. Babbage's collaborator, Ada Lovelace, is said to have introduced the ideas of program loops and subroutines and is sometimes considered the first programmer. Apart from mechanical calculators, the first really usable computers began with the vacuum tube, accelerated with the invention of the transistor, which then became embedded in large numbers in integrated circuits, ultimately making possible the relatively low-cost personal computer.

Modern computers inherently follow the stored-program concept laid out by John von Neumann in 1945. Essentially, the computer reads the program one instruction at a time, performs the operation, and then reads the next instruction.
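For illustration only, here is a minimal sketch of that fetch-and-execute cycle in Python, using an invented four-instruction machine (the instruction names, memory layout and accumulator register are assumptions made up for the example, not any real architecture):

# Toy stored-program machine: program and data share one memory,
# and the processor repeatedly fetches and executes instructions.
# The instruction set (LOAD/ADD/STORE/HALT) is invented for illustration.

memory = [
    ("LOAD", 100),   # put the value at address 100 into the accumulator
    ("ADD", 101),    # add the value at address 101 to the accumulator
    ("STORE", 102),  # write the accumulator back to address 102
    ("HALT", None),  # stop the machine
] + [0] * 96 + [2, 3, 0]   # addresses 100, 101 and 102 hold the data

pc = 0            # program counter: address of the next instruction
accumulator = 0   # single working register

while True:
    opcode, operand = memory[pc]   # fetch the next instruction
    pc += 1                        # advance the program counter
    if opcode == "LOAD":           # decode and execute it
        accumulator = memory[operand]
    elif opcode == "ADD":
        accumulator += memory[operand]
    elif opcode == "STORE":
        memory[operand] = accumulator
    elif opcode == "HALT":
        break

print(memory[102])   # prints 5 (2 + 3)

Note that the program and the data it operates on sit in the same memory, which is the essence of the stored-program idea.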

From the 1940s to the present, the advancement of computers is commonly divided into five generations. While the year span for each generation varies depending on the reference source, the most widely recognized generational timeline is below.

1940 to 1956

First-generation computers were room-sized machines that used vacuum tubes for circuitry and magnetic drums for limited internal storage. These machines used punched cards for data input and a binary machine code (language). Examples of first-generation computers include the ABC (Atanasoff-Berry Computer), Colossus, the IBM 650 and the EDVAC (Electronic Discrete Variable Automatic Computer).

1956 to 1963

Second-generation computers replaced vacuum tubes with transistors, used magnetic tape for increased storage capacity, were programmed in BAL (Basic Assembly Language) and continued to use punched cards for input. Transistors drew less power and generated less heat than vacuum tubes. Examples of second-generation computers include the IBM 7090, the IBM 7094, the IBM 1400 series, and later transistorized UNIVAC (Universal Automatic Computer) models.

1964 to 1971

Third-generation computers used ICs (integrated circuits), which pack multiple transistors onto a single chip, and MOS (metal oxide semiconductor) memory. Smaller, cheaper and faster than their predecessors, these computers used keyboards for input, monitors for output, and employed programming languages such as FORTRAN (Formula Translation) and COBOL (Common Business Oriented Language). Examples of third-generation computers include the IBM System/360 and System/370 series.

1972 to 2010

Fourth-generation computers used microprocessors built with VLSI (very large scale integration) of integrated circuits, RAM (random access memory), ROM (read-only memory), and high-level programming languages including C and C++. The creation and expansion of the World Wide Web and cloud computing (the ability to deliver hosted services over the Internet) significantly enhanced computing capabilities during this period. Examples of fourth-generation computers include Apple's Macintosh and IBM's PC.

2010 and beyond

Fifth-generation computers are based on AI (artificial intelligence) and use large-scale integrated chips and more than one CPU (processor). Fifth-generation computers respond to natural language input, solve highly complex problems, make decisions through logical (human-like) reasoning, and draw on quantum computing and nanotechnology (molecular manufacturing). Fifth-generation computers and programs allow multiple programs (and computers) to work on the same problem at the same time in parallel.
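As a rough sketch of that kind of parallelism (a hypothetical Python example, not a description of any particular fifth-generation system), the snippet below splits one problem across four worker processes that run at the same time:

# Split one problem (summing a large range of numbers) into chunks
# and let several worker processes handle the chunks in parallel.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    start, end = bounds
    return sum(range(start, end))

if __name__ == "__main__":
    # Divide 0..40,000,000 into four equal chunks, one per worker.
    chunks = [(i * 10_000_000, (i + 1) * 10_000_000) for i in range(4)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)   # same answer as sum(range(40_000_000)), computed in parallel

Each worker computes a partial result independently and the results are combined at the end; distributing the chunks to separate computers over a network follows the same pattern.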

The advent of the Internet, cloud computing, and high-bandwidth data transmission enables programs and data to be distributed over a network quickly and efficiently, while application programs and software make computers the tools of choice for such things as word processing, databases, spreadsheets, presentations, ERP (enterprise resource planning), simulations, education, CMS (content management systems), gaming and engineering.
