

A brief history of the evolution and growth of IT

The history of information technology began long before the modern-day computer was ever invented; the developments that led to IT as it's known today go back millennia.

But the term information technology is a relatively recent development. The phrase first appeared in a 1958 Harvard Business Review article titled "Management in the 1980s," which predicted its future effects:

"Over the last decade a new technology has begun to take hold in American business, one so new that its significance is still difficult to evaluate ... The new technology does not yet have a single established name. We shall call it information technology."

Information technology has evolved and changed ever since. This article will explore that history and the meaning of IT.

What is IT today?

Information technology is no longer just about installing hardware or software, solving computer issues or controlling who can access a particular system. Today's IT professionals are in high demand, and beyond those core tasks, they:

  • create policies to ensure that IT systems run effectively and are aligned with an organization's strategic goals;
  • maintain networks and devices for maximum uptime;
  • automate processes to improve business efficiency;
  • research, implement and manage new technologies to accommodate changing business needs; and
  • maintain service levels, security and connectivity to ensure business continuity and longevity.

In fact, today's hyper-connected data economy would collapse without information technology.

The slow evolution of computers and computing technology

Before the modern-day computer ever existed, there were precursors that helped people achieve complex tasks.

The abacus is the earliest known calculating tool, in use since around 2400 B.C.E. and still used in parts of the world today. An abacus consists of rows of movable beads on rods that represent numbers.

But it wasn't until the 1800s that the idea of programmable devices really took hold, with the development of the Jacquard loom, which enabled looms to produce fabrics with intricate woven patterns. The system used punched cards fed into the loom to control the weaving. Computers well into the 20th century used the same punched card method to issue machine instructions, until electronic methods eventually replaced it.

In the 1820s, English mechanical engineer Charles Babbage -- known as the father of the computer -- invented the Difference Engine to aid in navigational calculations. It is regarded as the first mechanical computing device.

Then in the 1830s, he released plans for his Analytical Engine, which would have operated on a punched card system. Babbage's pupil, Ada Lovelace, expanded on these plans, taking them beyond simple mathematical calculation and designing a series of operational instructions for the machine -- now known as a computer program. The Analytical Engine would have been the world's first general-purpose computer, but it was never completed, and the instructions were never executed.

Many of the data processing and execution capabilities of modern IT, such as conditional branches (if statements) and loops, are derived from the early work of Jacquard, Babbage and Lovelace.
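To make those terms concrete, here is a minimal sketch in Python of the two control structures that trace back to that early work. The loop bound and the even/odd test are arbitrary examples, not anything from Lovelace's notes:

```python
# A loop repeats the same instructions; a conditional branch chooses
# between paths at runtime -- the constructs Lovelace's instructions
# for the Analytical Engine anticipated.
for n in range(1, 6):      # loop: run the body for n = 1..5
    if n % 2 == 0:         # conditional branch (an "if statement")
        print(n, "is even")
    else:
        print(n, "is odd")
```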

Herman Hollerith, an American inventor and statistician, also used punched cards to feed data into his census-tabulating machine in the 1890s, an important precursor of the modern electronic computer. Hollerith's machine recorded statistics by automatically reading and sorting cards numerically encoded by perforation position. In 1896, Hollerith started the Tabulating Machine Company to manufacture these machines; in 1911 it merged into the Computing-Tabulating-Recording Company, which was renamed International Business Machines Corp. (IBM) in 1924.

German engineer Konrad Zuse invented the Z2, one of the world's earliest electromechanical relay computers, in 1940; its operating speed was unimaginably slow by today's standards. Later in the 1940s came the Colossus computers, developed during World War II by British codebreakers to decipher intercepted teleprinter messages encrypted by German Lorenz cipher machines, whose traffic was code-named "Tunny." Earlier in the war, British mathematician Alan Turing had designed the Bombe, an electromechanical machine that helped decrypt messages from the German Enigma machine.

Turing -- immortalized by the Turing Test -- first conceptualized the modern computer in his 1936 paper "On Computable Numbers." In it, Turing proposed that programmed instructions could be stored in a machine's memory and executed, a concept that forms the very basis of modern computing technology.

By 1951, British electrical engineering company Ferranti Ltd. produced the Ferranti Mark 1, the world's first commercial general-purpose digital computer. This machine was based on the Manchester Mark 1, developed at Victoria University of Manchester. 

The IT revolution picks up pace

J. Lyons and Co. released the LEO I computer in 1951 and ran its first business application that same year. MIT's Whirlwind -- also released in 1951 -- was one of the first digital computers capable of operating in real time. In 1956, it also became the first computer that enabled users to input commands with a keyboard.

As computers evolved, so too did what eventually led to the field of IT. From the 1960s onward, the development of the following devices set the stage for an IT revolution:

  • screens
  • text editors
  • the mouse
  • hard drives
  • fiber optics
  • integrated circuits
  • programming languages such as FORTRAN and COBOL

Today's IT sector is no longer the exclusive domain of mathematicians. It employs professionals with a variety of backgrounds and skill sets, such as network engineers, programmers, business analysts, project managers and cybersecurity analysts.


The information revolution and the invention of the internet

In the 1940s, '50s and '60s, computing was dominated by governments, defense establishments and universities. However, it also spilled over into the corporate world with the development of office applications such as spreadsheets and word processing software, creating a need for specialists who could design, build, adapt and maintain the hardware and software that support business processes.

As various computer languages were created, experts in those languages appeared: Oracle and SAP programmers to build and run enterprise databases and applications, and C programmers to write and update networking software. These specialists were in high demand -- a trend that continues to this day, especially in areas such as cybersecurity, AI and compliance.

The invention of email in the 1970s revolutionized IT and communications. Email began as an experiment to see if two computers could exchange a message, but it evolved into a fast and easy way for humans to stay in touch. The term "email" itself was not coined until later, but many of its early standards, including the use of @, are still in use today.

Many IT technologies owe their existence to the internet and the world wide web. ARPANET, a U.S. government-funded network conceptualized as an "intergalactic computer network" by MIT scientists in the 1960s, is considered the precursor of the modern internet. ARPANET grew from just four connected computers into an interconnected network of networks, and it eventually led to the development of Transmission Control Protocol (TCP) and Internet Protocol (IP), which enabled distant computers on different networks to communicate with each other. Packet switching -- breaking data into small blocks that are routed independently across the network and reassembled at the destination -- turned machine-to-machine communication from a possibility into a practical reality.
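As a rough illustration of what TCP provides on top of IP's packet delivery, here is a minimal sketch using Python's standard socket library. The loopback address, port number and message are invented for the example and have nothing to do with ARPANET itself:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # loopback address, arbitrary example port

# Server side: bind and listen first so the client cannot race ahead.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

def echo_once():
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))  # echo the received bytes back

threading.Thread(target=echo_once, daemon=True).start()

# Client side: TCP presents a reliable, ordered byte stream, even though
# IP underneath delivers the data as independently routed packets.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello over TCP/IP")
    print(cli.recv(1024))  # prints b'hello over TCP/IP'
srv.close()
```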

In 1991, Tim Berners-Lee introduced the World Wide Web, a web of linked information retrievable by anyone over the internet. In 1996, the Nokia 9000 Communicator became the world's first internet-enabled mobile device. By this time, the world's first search engine, the first laptop computer and the first domain search engine were already available. In the late '90s, search engine giant Google was established.

The early 2000s saw the development of WordPress, an open source web content management system, which enabled people to move from being passive web consumers to active participants posting their own content.

IT continues to expand

Since the invention of the world wide web, the IT realm has quickly expanded. Today, IT encompasses tablets, smartphones, voice-activated technology, nanometer computer chips, quantum computers and more.

Cloud computing, whose roots reach back to the 1960s, is now an inseparable part of many organizations' IT strategies. In the 1960s and '70s, the concept of time-sharing -- letting multiple users share the same computing resources simultaneously -- was developed, and by 1994 the cloud metaphor was being used to describe virtual services and machines that behave like real computer systems.
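As a loose illustration of the time-sharing idea, this Python sketch simulates a round-robin scheduler in which each user's job receives a fixed slice of a shared processor in turn. The user names, work units and slice size are invented for the example:

```python
from collections import deque

QUANTUM = 2  # time slice each job gets per turn (arbitrary units)
jobs = deque([("alice", 5), ("bob", 3), ("carol", 4)])  # (user, work left)

# Round-robin: run each job for at most one quantum, then move on,
# so every user sees the machine respond, as in 1960s time-sharing.
while jobs:
    user, remaining = jobs.popleft()
    worked = min(QUANTUM, remaining)
    print(f"{user} runs for {worked} unit(s)")
    if remaining > worked:
        jobs.append((user, remaining - worked))  # back of the queue
```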

But it wasn't until 2006 and the creation of Amazon Web Services (AWS) that cloud computing really took off. AWS and its top competitors -- Google Cloud Platform, Microsoft Azure and Alibaba Cloud -- now hold the largest slice of the cloud computing market. The top three providers -- AWS, Google Cloud and Microsoft Azure -- accounted for 58% of total cloud spending in the first quarter of 2021.


Over the past decade, other technological advancements have also influenced the world of IT, including developments in:

  • social media
  • internet of things
  • artificial intelligence
  • computer vision
  • machine learning
  • robotic process automation
  • big data
  • mobile computing -- in both devices and communications technologies such as 4G and 5G

Connectivity between systems and networks is also on the rise. By 2030, there will be an estimated 500 billion devices connected to the internet, according to a Cisco report.
