Special Report

How computer chips evolved to power information technology and artificial intelligence

From valves and transistors to integrated circuits and microprocessors: the journey of chips




The world runs on chips. Also known as microchips, silicon chips, or computer chips, these slivers of silicon are the brains of computers. Their ability to process and store data makes them integral to smartphones, laptops, vehicles, supercomputers, and even nuclear missiles. They drive today's technology and are a critical component in the race to refine and upgrade artificial intelligence.

More than seven decades ago, resistors, capacitors, inductors, transformers and vacuum tubes (thermionic valves, or simply valves) worked together to keep electronic devices buzzing. Tiny transistors replaced the bulky valves, and the race for miniaturisation began. Integrated circuits (ICs) gave it a turbo boost, paving the way for microchips.

The invention of the transistor was a pivotal moment in electronics: far smaller than a valve, it also consumed far less electricity. That set the stage for the next leap, the integrated circuit, which packed several individual components into a single unit and ushered in a new era in electronics.

What's a computer chip?
A computer chip is a tiny wafer of semiconductor material embedded with integrated circuitry. It comprises the processing and memory units of the modern digital computer.
Today's chips are the size of a fingernail but packed with billions of transistors capable of executing billions of instructions per second. The most advanced chips on the market are now manufactured on 3-nanometre processes. A nanometre is one billionth of a metre, or a millionth of a millimetre.

The origins of chips

The integrated circuit was the forerunner of the modern computer chip. Jack Kilby of Texas Instruments created the first integrated circuit in 1958. A year later, Robert Noyce, co-founder of Fairchild Semiconductor and later Intel, built his own version by placing the full circuit on a single silicon chip, a design that made it possible to mass-produce devices with integrated circuits and paved the way for the digital age.


Computer chips are arguably the most important invention of the past half-century. Without them, there would be no advanced computers and no internet to keep the global population interconnected.

The first computer chips had only one transistor, but today's chips can hold billions. Modern chips can have up to 100 layers, stacked atop one another with nanometre precision.

The first microprocessor

As microchip technology evolved, several companies, including Texas Instruments, Fairchild Semiconductor and Intel, refined and upgraded integrated circuits to drive innovation and competition in the industry.

Persistent efforts at miniaturisation led to the emergence of microprocessors. Work on the first microprocessor began in 1969, when Busicom Corp, a Japanese company, asked Intel to design a set of seven chips for a new calculator. Intel proposed a single CPU (central processing unit) chip instead of multiple ICs, and the first microprocessor was born.

How a microprocessor works
The microprocessor executes each instruction in a repeating sequence: fetch, decode and execute.
1. Instructions are stored sequentially in the computer's memory.
2. The microprocessor fetches each instruction from memory, decodes it and executes it, repeating the cycle until it reaches a STOP instruction.
3. It sends the result in binary form to the output port.
4. Registers store temporary data between these steps, while the ALU (Arithmetic and Logic Unit) performs the computing functions.
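
To make the cycle concrete, here is a minimal sketch in Python of a toy processor that loops through fetch, decode and execute until it reaches a STOP instruction, keeps temporary data in a single register and writes its result to an output port in binary. The four-instruction set (LOAD, ADD, OUT, STOP) and all names are invented purely for illustration and do not correspond to any real chip.

```python
# A purely illustrative fetch-decode-execute loop for a made-up,
# four-instruction toy processor. Nothing here matches a real chip.

def run(memory):
    accumulator = 0       # a register holding temporary data between steps
    program_counter = 0   # address of the next instruction in memory
    output_port = []      # results are written here in binary form

    while True:
        # Fetch: read the next instruction (and its operand) from memory.
        opcode, operand = memory[program_counter]
        program_counter += 1

        # Decode and execute: the ALU-style work happens in these branches.
        if opcode == "LOAD":
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "OUT":
            output_port.append(format(accumulator, "08b"))  # 8-bit binary
        elif opcode == "STOP":
            break

    return output_port

# Example program stored sequentially in "memory": load 2, add 3, output, stop.
program = [("LOAD", 2), ("ADD", 3), ("OUT", None), ("STOP", None)]
print(run(program))  # ['00000101'] -> the number 5 in binary
```

Real microprocessors run the same kind of cycle billions of times per second, with the fetching, decoding and arithmetic carried out by transistors rather than lines of code.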

The Intel 4004, launched in 1971, was the world's first microprocessor, ushering in the modern computing era. More advanced microprocessors, including the Intel 8080, were instrumental in the development of early personal computers. The 8080 was followed by the 8086, which laid the foundation for the x86 architecture still used in personal computers today.

Smaller and faster for complex devices

The personal computing revolution became a reality as microchips grew smaller and more powerful, with billions of transistors in a single unit. Chips rank among the major inventions of the 20th century because they made complex, multifunctional devices possible. Their versatility and capability have made them integral to countless applications, managing everything from communication and entertainment to critical infrastructure and transportation.

A world without chips is unimaginable. Chips shape the world, powering artificial intelligence and the race to new frontiers in innovation, imagination and technology.
