Wednesday 5 May 2021

History of Computer

Imagine a world without computers. A world where humanity's knowledge is no longer at your fingertips.

A world where a tool that you use every day just no longer exists. A world where you wouldn't be watching this video right here, right now. Computers have penetrated nearly every

facet of our lives. But how did they become so ubiquitous? This is the history of the

computer.

Today, the word computer refers to the devices

that we interact with to work, connect and play. However, it historically described machines

that were used in performing calculations with numbers. As part of this video, we will study the evolution of the earliest devices used for computations and how they became

the computers that we depend on today.

The abacus was a computational tool used for

hundreds of years and is generally considered to be the first calculator. The exact origin

of the device is still unknown but the Sumerian abacus appeared as early as

2700–2300 BCE in Mesopotamia. It has been mentioned in numerous civilizations throughout history,

including in Ancient Egypt, Persia, Greece, China, Rome and India.

Another famous calculator from the past was the astrolabe, which was used to measure the

elevation of celestial bodies in the sky. The earliest known reference to one was from

around the 2nd century BCE in the Hellenistic civilization. In addition to its value to

astronomers, the astrolabe became indispensable for sailors since it allowed them to determine

their local latitude on long voyages.

One defining quality of modern computers that

separates them from simple calculators is the fact that they can be programmed. This

allows them to automatically perform certain tasks without continual human input. In the

19th century, Charles Babbage conceptualized the first programmable mechanical computer.

His design utilized punch cards to input instructions that the machine would carry out. Unfortunately,

it proved too complex to produce economically and the project was cancelled after the British

government withdrew its funding.

The early 20th century saw analog computers

develop further as they were put to work to solve complex mathematical problems. The differential

analyzer is the most famous example of this and was built at MIT by Vannevar Bush in the

1920s. Bush later became involved in the Manhattan Project to produce nuclear weapons, and his ideas even

inspired the invention of the World Wide Web nearly 50 years before its creation.

World War II led to a great leap in computer technology as nations tried to gain the upper

hand over their adversaries. Computers were primarily built to calculate firing tables

to improve artillery accuracy and to break enemy code to gain valuable intelligence.

The first large-scale digital computer was built by Howard Aiken in 1944 at Harvard University;

it was one of the first machines that used electrical switches to store numbers. When

the switch was off, it stored a zero; when it was on, it stored a one. Modern computers

follow this same binary principle. This time period also saw the rise of vacuum tubes,

which offered much faster performance than traditional relay switches.
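To make the binary idea concrete, here is a minimal Python sketch of how a row of on/off switches can stand for an ordinary number (the switch pattern and values are just an illustration, not taken from any particular machine):

# A row of four switches, most significant bit first: on, on, off, on.
switches = [1, 1, 0, 1]

# Read the switches as a binary number: 1101 in base 2 is 13 in base 10.
value = 0
for bit in switches:
    value = value * 2 + bit
print(value)      # prints 13

# Going the other way, bin() shows the switch pattern for any whole number.
print(bin(13))    # prints 0b1101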

The most famous vacuum tube computer and one considered to be the predecessor of modern

machines was the ENIAC, invented by John Mauchly and J. Presper Eckert. It was the first fully

electronic and general-purpose digital computer.

Despite vacuum tubes offering advantages over electromechanical switches, they had their

own drawbacks. They consumed enormous quantities of power, were unreliable and needed large

amounts of space. In 1947, three scientists at Bell Labs discovered that semiconductors

could be used to amplify electrical signals more effectively. This led to the creation

of the transistor, which paved the way for modern computing. Transistors were much smaller

than vacuum tubes, used no power unless in operation and were extremely reliable. William

Shockley, one of the inventors of the transistor, continued refining it and founded a company

in Palo Alto, California. This would foreshadow Silicon Valley's development into the

global hub of computing over the next few decades. 

In the late 1950s, two teams independently built the integrated circuit, a collection

of transistors and other components that could be manufactured on a large scale. This was

a major breakthrough that led to computers shrinking throughout the 1960s. In 1971, the

general-purpose microprocessor was invented and was the first example of a computer existing

on a single chip.

The miniaturization of microchips allowed

Intel to release a processor known as the 8080 in 1974. This was used by hobbyists to

build home computers. One such hobbyist was Steve Wozniak, who partnered with his friend

Steve Jobs to found a company named Apple and begin selling home computers. Although

the first iteration didn't sell well, their second machine was sold as the Apple

II and gained popularity among home users, schools and small businesses due to its ease

of use. IBM, then the market leader in computers, responded in 1981 with their first

personal computer, based on the Intel 8088 processor.

The main problem with early computers was that

they all used different hardware, and programs written for one machine would not work with

others. In 1976, Gary Kildall created an intermediary between a machine's software and hardware;

this became the first widely used operating system for personal computers. IBM was eager to use it in their PCs;

however, after Kildall refused to sell to them, they turned to a young programmer named

Bill Gates at a company named Microsoft. After convincing IBM to let Microsoft own the rights

to its operating system, Gates developed MS-DOS, which he licensed to IBM and eventually other

PC manufacturers. This led Microsoft to become the titan it is today. 

At Apple, Steve Jobs was determined to make computers easier to use. He was inspired by

research that Xerox had conducted in the 1970s, which included computers with a desktop-like

screen, mouse and graphical user interface. Jobs borrowed these ideas and eventually launched

the Macintosh, which hurt IBM's position in the industry. These features were eventually

implemented by Microsoft in Windows, which led to a copyright lawsuit in the late 1980s.

Microsoft eventually prevailed and Windows became the dominant operating system for home

personal computers, where it remains to this day. 

The 1980s and beyond have seen computers find numerous new applications. They appeared in

watches, cars, cellphones, airplanes. They became portable and ever-present. Today, computers

are everywhere. And yet, the future looks even more promising. Quantum computers could

signal a paradigm shift, allowing humanity to tackle complex problems that today's machines

cannot solve. A move away from silicon may reignite the pace of transistor development.

Computers will be crucial for us in reaching out into space and exploring

the stars. They may have humble beginnings, but no matter what challenges humanity faces,

the descendants of that abacus from Mesopotamia will always be alongside us.

Thanks for watching and I hope you enjoyed the video. Feel free to drop a like or leave

a comment down below and make suggestions for any future videos. I'll be trying

to get back into making these. So, thanks again and I will see everyone next time!


