
mainframe (big iron)

What is a mainframe computer?

A mainframe, also known as big iron, is a high-performance computer used for large-scale, compute-intensive tasks that require greater availability and security than smaller-scale machines can provide. Historically, mainframes have been associated with centralized rather than distributed computing, but that distinction has blurred as smaller types of computers have become more powerful and mainframes have become increasingly multipurpose.

The IBM System z provides a high level of data privacy, security and resiliency.

The original mainframes were housed in room-sized metal frames, and an installation could occupy 2,000 to 10,000 square feet. Newer mainframes are about the size of a large refrigerator, so they fit more easily in the modern data center. Smaller-scale IBM mainframes serve distributed users and act as smaller servers in a computing network.

The mainframe is sometimes referred to as a dinosaur, not only because of its size but also because of predictions, going back many years, of its extinction. In the early 1990s, experts predicted the demise of the mainframe by the end of that decade. However, in February 2008, IBM released the z10 mainframe, running the z/OS, z/VM, z/VSE and z/TPF mainframe operating systems (OSes) as well as Linux. Today, companies, government agencies and other organizations continue to use mainframes for back-office transactions and data processing as well as web-based applications.

A brief history of the mainframe

IBM is credited with building the original mainframe computer, the Harvard Mark I, which was designed by Howard Aiken. It ran its first programs in 1944.

Commercial mainframes hit the market in the 1950s, starting with Remington Rand's UNIVAC, which was introduced in 1951. Through the 1970s and 1980s, IBM remained a leader in the mainframe market, with large organizations as its main customers. However, many vendors sold mainframe-class machines, including Amdahl, Burroughs, Control Data, Data General, Digital Equipment, Fujitsu, Hewlett-Packard, Hitachi, Honeywell, NCR, RCA, Scientific Data Systems, Siemens and Sperry Univac.

Plug-compatible mainframes (PCMs) that competed with IBM mainframes emerged during those decades. They were cheaper yet more powerful than comparable IBM products. Vendors such as Amdahl, Fujitsu and Hitachi offered computer systems that had their own central processing units (CPUs) and could use the IBM System/370 instruction set. That meant apps configured to run on IBM mainframes could also run on PCM systems.

Plug-compatible peripheral systems, such as memory and disk drives, were less expensive than IBM equipment yet could still support IBM environments. Vendors of these peripherals included Amdahl, IPL Systems, Memorex, Storage Technology and Telex. Considering that one megabyte of memory for an IBM System/370 mainframe in the late 1970s and early 1980s could cost $100,000, competing with IBM presented many opportunities for plug-compatible manufacturers.

Most of those competing firms no longer exist or have merged into other companies. Today, IBM still owns much of the mainframe market.

How are mainframes used today?

Mainframes aren't as ubiquitous today as they were in the past, but they still play a significant role in several industries. They handle large-scale, data-intensive workloads and process huge amounts of data quickly. They are often used for high-volume transaction processing, batch processing, data warehousing and analytics. Modern mainframes can run multiple OSes simultaneously and support cloud computing and virtual environments.
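To make the batch-processing idea concrete, here is a minimal Python sketch of the read-validate-post loop such workloads run, operating on an in-memory list for simplicity. The record layout, field names and validation rule are hypothetical; production mainframe batch jobs are typically written in languages such as COBOL and scheduled with JCL rather than Python.

# A minimal, hypothetical sketch of batch transaction processing: validate
# each record, post the good ones to per-account balances, set aside the rest.
from decimal import Decimal, InvalidOperation

def post_batch(transactions):
    balances, rejected = {}, []
    for txn in transactions:
        try:
            amount = Decimal(txn["amount"])  # exact decimal math, as in financial work
        except (KeyError, InvalidOperation):
            rejected.append(txn)             # malformed record goes to the reject pile
            continue
        account = txn.get("account", "UNKNOWN")
        balances[account] = balances.get(account, Decimal("0")) + amount
    return balances, rejected

if __name__ == "__main__":
    day_of_transactions = [
        {"account": "A100", "amount": "250.00"},
        {"account": "A100", "amount": "-75.25"},
        {"account": "B200", "amount": "not-a-number"},  # will be rejected
    ]
    posted, rejected = post_batch(day_of_transactions)
    print("posted:", posted)           # {'A100': Decimal('174.75')}
    print("rejected:", len(rejected))  # 1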

Among the industries where mainframes continue to have a significant role are the following:

  • Banking and financial companies. These use mainframes to process large volumes of transactions and to handle high-frequency trading in the financial markets.
  • Healthcare providers. They depend on mainframes to provide the security, dependability and scalability needed to manage patient data and its storage.
  • Government agencies. These include the military and the Internal Revenue Service. They rely on mainframes to handle large databases and data processing tasks.
  • Transportation providers. They use these machines to manage traffic control, scheduling and reservation systems.
  • Retailers. Large online retailers in particular use mainframes to track sales and inventory data.
The Perlmutter supercomputer, built around Nvidia GPUs, is used in astrophysics and climate science research.

Supercomputers vs. mainframe systems

In the 1960s, the term supercomputer came into use to describe the fastest, most powerful computers. Control Data's 6600, designed by Seymour Cray, was the first machine to earn that label. Supercomputers were designed for highly compute-intensive processing tasks, such as simulating nuclear weapons and forecasting weather.

These systems used 64-bit word sizes, as opposed to the 16-bit and 32-bit structures common in mainframes. Supercomputers used multiple CPUs operating in parallel to achieve ultrahigh-speed processing, making them significantly faster than personal computers, servers and mainframes.
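As a rough illustration of what word size means in practice, the short Python lines below show how the range of values an unsigned word can hold grows with its width; the point is the comparison itself, not any specific machine's arithmetic.

# Rough illustration: the unsigned range of a word grows exponentially with its width.
for bits in (16, 32, 64):
    print(f"{bits}-bit word: 0 to {2**bits - 1:,}")
# 16-bit word: 0 to 65,535
# 32-bit word: 0 to 4,294,967,295
# 64-bit word: 0 to 18,446,744,073,709,551,615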

