The history of computing
Hi, and thanks for tuning in to Singularity Prosperity.
This blog is the first in a multi-part series discussing computing. In it, we'll be discussing the evolution of computing, more specifically the evolution of the technologies that brought about the modern computing era.
The purpose of this blog is so we can appreciate how fast technology is evolving and the people who have brought us to this point.
Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts.
The computer is no different: its bits and pieces, both hardware and software, have come together over many centuries, with many people and groups each adding a small contribution.
We start as early as 3000 BC with the Chinese abacus. How is the abacus related to computing, you ask? The abacus was one of the first machines humans ever created to be used for counting and calculating. Fast forward to 1642,
and the abacus evolves into the first mechanical adding machine, built by mathematician and scientist Blaise Pascal. This first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerging, with mathematicians fearing the loss of their jobs due to progress. Also in the 1600s, from the 1660s to the early 1700s, we meet Gottfried Leibniz.
He was a pioneer in many fields, most notably known for his contributions to mathematics, and is considered by many the first computer scientist. Inspired by Pascal, he created his own calculating machine, able to perform all four arithmetic operations.
He was also the first to lay down the concepts of binary arithmetic, the way all technology communicates nowadays, and he even envisioned a machine that used binary arithmetic. From birth,
we are taught how to do arithmetic in base 10, and for most people that's all they're concerned with: the numbers zero through nine. However, there are an infinite number of ways to represent information, such as octal (base 8), hexadecimal (base 16), which is used to represent colors, or base 256, which is used for encoding; the list can go on. Binary is base 2, represented by the digits zero and one; we'll explore later in this blog why binary is essential for modern computing.

Back on topic: progressing to the 1800s, we're met with Charles Babbage. Babbage is known as the father of the computer for the design of his mechanical calculating engines.
In 1820, Babbage noticed that many computations consisted of operations that were regularly repeated and theorized that these operations could be done automatically.
This led to his first design, the Difference Engine: it would have a fixed instruction set, be fully automatic through the use of steam power, and print its results into a table. In 1830, Babbage stopped work on his Difference Engine to pursue a second idea, the Analytical Engine. Elaborating on the Difference Engine, this machine would be able to execute operations in non-numeric orders through the addition of conditional control, store memory, and read instructions from punch cards, essentially making it a programmable mechanical computer.
Unfortunately, due to lack of funding, his designs never became reality, but if they had, they would have sped up the invention of the computer by nearly 100 years. Also worth mentioning is Ada Lovelace, who worked very closely with Babbage.
She is considered the world's first programmer and came up with an algorithm that would calculate Bernoulli numbers, designed to work with Babbage's machine. She also outlined many fundamentals of programming, such as data analysis, looping, and memory addressing. Ten years prior to the turn of the century, with inspiration from Babbage, American inventor Herman Hollerith designed one of the first successful electromechanical machines, referred to as a census tabulator.
This machine would read US Census data from punch cards, up to 65 at a time, and tally up the results. Hollerith's tabulator became so successful that he went on to found his own firm to market the device; this company eventually became IBM. To briefly explain how punch cards work: once a card was fed into the machine, electrical connections were attempted through it; where the holes in the card are determines which connections are completed, and thus your input.
To input data onto a punch card, you could use a keypunch machine, aka the first iteration of a keyboard. The 1800s were a period where the theory of computing began to evolve and machines started to be used for calculations.
But the 1900s are where we begin to see the pieces of this nearly 5,000-year puzzle coming together, especially between 1930 and 1950. In 1936, Alan Turing proposed the concept of a universal machine, later dubbed the Turing machine, capable of computing anything that is computable. Up to this point, machines were only able to do the specific tasks their hardware was designed for; the concept of the modern computer is largely based on Turing's ideas. Also starting in 1936, German engineer Konrad Zuse invented the world's first programmable computer.
This device read instructions from punched tape and was the first computer to use Boolean logic and binary to make decisions, through the use of relays. For reference, Boolean logic is simply logic that results in either a true or false output, corresponding to binary one or zero.
We'll be diving deeper into Boolean logic later in this blog. Zuse would later use punch cards to encode information in binary, essentially making them the first data storage and memory devices. In 1942, with the Z4 computer, Zuse also released the world's first commercial computer.
For these reasons, many consider Zuse the inventor of the modern-day computer. In 1937, Howard Aiken, with his colleagues at Harvard and in collaboration with IBM, began work on the Harvard Mark I calculating machine, a programmable calculator inspired by Babbage's Analytical Engine. This machine was composed of nearly 1 million parts, had over 500 miles of wiring, and weighed nearly five tons. The Mark I had 60 sets of 24 switches for manual data entry and could store 72 numbers, each 23 decimal digits long. It could do three additions or subtractions in a second; a multiplication took six seconds, a division took 15.3 seconds, and a logarithm or trigonometric function took about one minute.
As a funny side note, one of the primary programmers of the Mark I, Grace Hopper, discovered the first computer bug: a dead moth blocking one of the reading holes of the machine. Hopper is also credited with coining the word "debugging."
The vacuum tube era marks the beginning of modern computing: vacuum tubes were the first technology that was fully digital and, unlike the relays used in previous computers, less power-hungry, faster, and more reliable. Begun in 1937 and completed in 1942, the first digital computer was built by John Atanasoff and his graduate student Clifford Berry, and was dubbed the ABC. Unlike previously built computers, such as those built by Zuse, the ABC was purely digital: it used vacuum tubes and employed binary math and Boolean logic to solve up to 29 equations at a time.

In 1943, the Colossus was built in collaboration with Alan Turing to assist in breaking German crypto codes (not to be confused with Turing's Bombe, which actually broke Enigma). This computer was fully digital as well, but unlike the ABC it was fully programmable, making it the first fully programmable digital computer.

Completing construction in 1946, the Electronic Numerical Integrator and Computer, aka the ENIAC, was composed of nearly 18,000 vacuum tubes and was large enough to fill an entire room. The ENIAC is considered the first successful high-speed electronic digital computer. It was somewhat programmable, but like Aiken's Mark I, it was a pain to rewire every time the instruction set had to be changed. The ENIAC essentially took the concepts from Atanasoff's ABC and elaborated on them at a much larger scale.
Meanwhile, while the ENIAC was under construction, in 1945 mathematician John von Neumann contributed a new understanding of how computers should be organized and built. Further elaborating on Turing's theories and bringing clarity to the ideas of computer memory and addressing, he elaborated on conditional addressing, or subroutines, something Babbage had envisioned for his Analytical Engine nearly 100 years earlier, as well as the idea that the instructions, or program, running on a computer could be modified in the same way as data and encoded in binary. Von Neumann assisted in the design of the ENIAC's successor, the Electronic Discrete Variable Automatic Computer, aka the EDVAC, which was completed in 1950 and was the first stored-program computer, able to execute over 1,000 instructions per second. Von Neumann is also credited with being the father of computer virology with his design of a self-reproducing computer program.
The EDVAC contained essentially all those things which the modern computer has, although in somewhat primitive form. It had the stored-program concept as its major feature, and that, in fact, is the thing which makes the modern computer revolution possible.

At this point, you can see that computing had officially evolved into its own field: from mechanical to electromechanical relays that took milliseconds, to digital vacuum tubes that took only microseconds; from binary as a way to encode information with punch cards, to binary used with Boolean logic and represented by physical technologies like relays and vacuum tubes, to binary finally being used to store instructions and programs; from the abacus as a way to count, to Pascal's mechanical calculator. The theories of Leibniz, Alan Turing, and John von Neumann; the vision of Babbage and the intellect of Lovelace; George Boole's contributions to Boolean logic; the progression from a programmable calculator to a stored-program, fully digital computer; and countless other inventions, individuals, and groups, each step a further accumulation of knowledge. While the title of "inventor of the computer" may be given to an individual or group, it was really a joint contribution over 5,000 years, and more so between 1800 and 1950.
Vacuum tubes were a huge improvement over relays, but they still didn't make economic sense at a large scale.
For example, of the ENIAC's 18,000 tubes, roughly 50 would burn out per day, and an around-the-clock team of technicians was needed to replace them. Vacuum tubes were also the reason why computers took up entire rooms, weighed multiple tons, and consumed enough energy to power a small town. In 1947, the first transistor was invented at Bell Labs, and by 1954 the first transistorized digital computer was invented: the TRADIC. It was composed of 800 transistors, took up just 0.05 cubic meters of space compared to the ENIAC's 28, drew only 100 watts of power, and could perform 1 million operations per second.
Also during this era, we begin to see major introductions on both the hardware and software sides of computing. On the hardware side, the first memory device, the random-access magnetic core store, was introduced in 1951 by Jay Forrester; in other words, the beginnings of what is now known as RAM. The first hard drive was introduced by IBM in 1957: it weighed one ton, could store five megabytes, and cost approximately $27,000 per month in today's money.
The software side is where a lot of major innovations and breakthroughs began to come, because computer hardware and architectures were becoming more standardized instead of everyone working on different variations of a computing machine. Assembly, first introduced in 1949, was the first programming language, and it really started taking off in this era of computing. Assembly is a way to communicate with the machine in pseudo-English instead of machine language.
The first truly widely used programming language was Fortran, invented by John Backus at IBM in 1954. Assembly is a low-level language and Fortran is a high-level language. In low-level languages, while you aren't writing instructions in machine code directly, a very deep understanding of computer architecture and its instructions is still required to execute a desired program, which means a limited number of people have the skills, and the process is very error-prone. Also, in the early to mid-1950s, compiling code back to machine code was still an expensive and time-consuming process. This all changed with Grace Hopper and her development of the first computer compiler. Hopper, if you remember from earlier, also found the first computer bug. Her compiler allowed the programming of computers to become more affordable and nearly instantaneous, instead of the time-consuming process of writing code in assembly and then manually converting it back to machine code. As a side note, Hopper also assisted with the invention of another early programming language, COBOL.
This era marks the beginning of the modern computing era and is where the exponential trend of computing performance really began. While transistors were a major improvement over vacuum tubes, they still had to be individually soldered together.
As a result, the more complex computers became, the more complicated and numerous the connections between transistors, increasing the likelihood of faulty wiring. In 1958, this all changed with Jack Kilby of Texas Instruments and his invention of the integrated circuit.
The integrated circuit was a way to pack many transistors onto a single chip instead of wiring transistors together individually. Packing the transistors together also significantly reduced the power and heat consumption of computers and made them far more economically feasible to design and buy. Integrated circuits sparked a hardware revolution, and beyond computers, assisted in the development of various other electronic devices thanks to miniaturization, such as the mouse, invented by Douglas Engelbart in 1964.
He also demonstrated the first graphical user interface. As a side note, computer speed, performance, memory, and storage also began to iteratively increase as ICs could pack more transistors into smaller surface areas.
This is demonstrated by the invention of the floppy disk in 1971 by IBM and, in the same year, DRAM by Intel, to list a few. Along with hardware, further advances in software were made as well, with an explosion of programming languages and the introduction of some of the most common languages today: BASIC in 1964 and C in the early 1970s.
As you can see throughout this blog, computing since the 1900s has evolved at an increasingly fast rate. In 1965, this led Gordon Moore, one of the founders of Intel, to make one of the greatest predictions in human history: that computing power would double roughly every two years at low cost, and that computers would eventually be so small that they could be embedded into homes, cars, and what he referred to as personal portable communications equipment, aka mobile phones.
We now refer to this as Moore's Law. Here are some charts to further illustrate how fast computing was evolving and what Moore based his predictions on. In Moore's own words: "One of my colleagues called this Moore's Law. Rather than just being something that chronicles the progress of the industry, it kind of became something that drove the progress of the industry. A tremendous amount of engineering and commitment has been required to make that happen, but much to my surprise, the industry has been able to keep up with the projection."
At this point, the blog comes to its conclusion, and I'd like to thank you for taking the time to read it. If you enjoyed it, please leave a comment, and if you want me to elaborate on any of the topics discussed or have any topic suggestions, please leave them in the comments below. You've been reading Singularity Prosperity, and I'll see you again soon.