The digital age began with the introduction of the personal computer and the technologies that followed it, which made it possible to transfer information quickly and freely. Even though digital transformation has taken humanity by storm in recent years, its roots go back to the 1940s.
It is easy to misunderstand “the power of the new.” People thought the beauty of mobile phones was being able to make a phone call anywhere, when in hindsight it was about getting a personal gateway to the internet. They believed the wonder of the MP3 format was the storage space it saved, when in fact it was about streaming every song ever made.
Digitalization has changed how humans govern, educate, travel, manage their health, bank, shop, work, and enjoy life. After decades of development, digital technology has become extraordinarily useful, and it will continue to transform the world at an accelerating pace in the years to come.
Advancements of the Technology Before 1995
Humans need to start preparing for a new era of innovation, in which different technologies will rise to the fore. Looking at earlier technologies can help you understand what is happening now.
Every technology follows a similar path of invention, engineering, and transformation. For instance, Michael Faraday invented the electric motor in 1821 and the dynamo in 1831. Edison opened his first power plant about 50 years later, and it took roughly another 40 years before electricity had a measurable influence on productivity.
Following is a technology timeline before 1995.
The Three Laws of Robotics
In May 1941, Isaac Asimov published the short science fiction story Liar!, in which he introduced the Three Laws of Robotics.
The First Law states that a robot may not injure a human being or, through inaction, allow a human being to come to harm.
The Second Law states that a robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
Lastly, the Third Law states that a robot must protect its own existence, as long as such protection does not conflict with the First or Second Law.
This is the first time that the term “robotics” was used.
The Z3 Computer
The Z3 Computer was created in 1941 by Konrad Zuse, a German engineer. The computer had 22-bit word length, performed floating-point binary arithmetic, and utilized 2,300 relays. However, during a bombing raid in Berlin in 1943, the Z3 computer was destroyed. In the 1960s, Zuse supervised a reconstruction of the Z3, which is now on display at the Deutsches Museum in Munich.
The Atanasoff-Berry Computer
In 1942, Professor John Vincent Atanasoff received funds to build a full-scale machine at the Iowa State College after he successfully demonstrated a prototype of the device in 1939. Atanasoff, alongside a graduate student named Clifford Berry, designed and built the machine.
The Atanasoff-Berry Computer, or ABC, looked nothing like today’s computers: it weighed 750 pounds and was the size of a large desk. But the ABC was the first to use several innovations that are part of the computers we use today: parallel processing, regenerative memory, the separation of memory and computing functions, binary arithmetic, a modular design, and more.
In 1943, an Austrian engineer named Curt Herzstark made history by designing the smallest four-function, all-mechanical calculator ever built. Herzstark worked at his family’s business until he was arrested by the Nazis in 1943. He refined his pre-war calculator design while imprisoned in a concentration camp, and after World War II ended, he put the calculator, known as the Curta, into production.
Moore School of Electrical Engineering
The Moore School of Electrical Engineering at the University of Pennsylvania hosted a landmark series of public lectures on electronic computing in the summer of 1946. Among the lecturers were mathematicians such as Douglas Hartree, George Stibitz, and Derrick Lehmer, and early computer designers like John Mauchly, J. Presper Eckert, Howard Aiken, and John von Neumann.
Students included future computing pioneers like Jay Forrester, David Rees, Claude Shannon, and Maurice Wilkes.
J. Presper Eckert and John Mauchly built the ENIAC computing system at the Moore School of Electrical Engineering, completing it in 1946. Thanks to its electronic technology, ENIAC was over 1,000 times faster than any previous computer. It weighed 30 tons, used around 18,000 vacuum tubes, occupied more than 1,000 square feet, and was programmed through panel-to-panel switches and wiring.
The Invention of The Transistor
John Bardeen, Walter Brattain, and William Shockley of Bell Laboratories invented the transistor in 1947. They discovered how to make an electric switch out of solid materials, without the need for a vacuum.
A Mathematical Theory of Communication
In 1948, the American engineer and mathematician Dr. Claude Shannon wrote A Mathematical Theory of Communication. In the article, Shannon laid out the basis for understanding the limits of communication between machines and people. A Mathematical Theory of Communication earned him the nickname “The father of modern digital communications and information theory.”
First Computer Program to Run on a Computer
Geoff Toothill, Tom Kilburn, and Frederic Williams, researchers at the University of Manchester, developed the SSEM (Small-Scale Experimental Machine), also known as the Manchester Baby. The SSEM was built to test new memory technology developed by Kilburn and Williams – soon known as the Williams-Kilburn Tube – the first random-access digital storage device. Kilburn wrote the first program, consisting of seventeen instructions, which ran on the machine in 1948.
The First Commercial Computer
The first commercial computer to attract extensive public attention was the UNIVAC I. The machine was often referred to as the “IBM Univac,” although Remington Rand manufactured it. Remington Rand delivered its first computer to the US Census Bureau in 1951. Major customers of the UNIVAC computers were the US military and insurance companies.
The UNIVAC I weighed 29,000 pounds and used 5,200 vacuum tubes. It was created by John Mauchly and J. Presper Eckert, the designers of the earlier ENIAC computer. Remington Rand sold 46 UNIVAC I machines at more than $1 million each.
One of The Earliest Computer Games
In 1952, a Ph.D. candidate at Cambridge University named Alexander Douglas designed one of the earliest computer games. OXO was a version of Tic-Tac-Toe where players could decide if they wanted to make the first move or the machine would start. The game was played on Cambridge’s EDSAC computer, and players used a rotary telephone dial to enter their moves.
The mathematician Grace Hopper was part of the team that developed the UNIVAC I computer. In 1952, she completed A-0, widely considered the first compiler and a crucial step toward modern programming languages.
IBM’s First Computer
In 1952, IBM released its first computer, IBM 701, designed for research and scientific work. Its internal memory contained 8,192 words of magnetic drum memory and 2,048 36-bit words of electrostatic memory. The machine was rented for $15,000 a month.
The Illiac Suite
The Illiac Suite, created in 1956 using the Illiac I computer at the University of Illinois, was one of the first pieces of music composed with an electronic computer.
RAMAC, the first commercial disk drive, was introduced by IBM in 1956. RAMAC stored 5 MB, used 50 hefty aluminum disks, weighed a ton, and occupied the space of two refrigerators.
In 1957, a robust scientific language using English-like statements was developed. The computer language, called FORTRAN, was created by an IBM team led by John Backus. FORTRAN proved it could generate efficient code, and it became the most widely used language for technical and scientific computing. It is still in use today.
First Digital Scan
Russell Kirsch was part of the team that ran the Standards Eastern Automatic Computer (SEAC) at the National Bureau of Standards. In 1957, Kirsch and his team developed a digital image scanner. The first digital scan was a five-by-five centimeter black-and-white picture of his young son, Walden. Because of its importance in the development of digital photography, Kirsch’s scanned image was named one of the “100 Photographs That Changed the World” by Life magazine.
In 1959, Douglas Engelbart, Ted Nelson, and several other visionaries proposed computerizing the concept of the cross-reference, laying the groundwork for the clickable link we use on the web today. Nelson later named this link the “hyperlink.”
The First Online Communities
In 1961, people could use terminals to log in over phone lines and share a single computer. Even though the computers couldn’t connect to one another, they were the first common multi-user systems.
In 1964, Edson de Castro and Gordon Bell, two young engineers at DEC (Digital Equipment Corporation), developed a small, general-purpose computer. A later version of that machine became the PDP-8, the first commercially successful minicomputer. The PDP-8 sold for $18,000.
The Beginnings of Computer Dating
Project TACT (Technical Automated Compatibility Testing) was created in 1964. Although students at Harvard had already created dating programs, Project TACT was a dating service that matched single residents of New York’s Upper East Side. Residents filled out a questionnaire, which was transferred to punch cards and then fed into an IBM 1401 computer.
The Basis of Modern Travel Sites
SABRE was created in 1964. The SABRE reservation system was set up for American Airlines and used phone lines to deliver data on any flight in less than three seconds. SABRE linked 2,000 terminals in 65 cities to a pair of IBM 7090 computers.
In 1964, John Kemeny and Thomas Kurtz created BASIC (Beginner’s All-purpose Symbolic Instruction Code) for their students at Dartmouth College. BASIC is an easy-to-learn programming language that was used in schools and early personal computers all over the world.
CDC 6600 Supercomputer
In 1964, the Control Data Corporation (CDC) 6600 was introduced. The supercomputer performed up to 3 million instructions per second, and it was three times faster than its closest competitor, the IBM 7030 supercomputer. To unburden the workload from the central processor, this computer used ten small computers (peripheral processing units).
The Orm was an air-powered robotic arm developed at Stanford University in 1965. It had 28 rubber bladders that were crammed between seven metal disks, and it moved by inflating those rubber bladders. Since movements could not be repeated accurately, the design was abandoned.
In 1966, Joseph Weizenbaum completed ELIZA, an early natural language processing program. DOCTOR, its most famous script, simulated a psychotherapist responding to a user’s statements. DOCTOR worked from predetermined phrases and substituted keywords from the user’s input in order to mimic human conversation. Some users were convinced that they were talking to another human being.
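The keyword-substitution idea behind ELIZA is simple enough to sketch in a few lines. The following is a minimal illustration, not Weizenbaum’s actual DOCTOR script: the patterns, pronoun swaps, and response templates here are invented for demonstration.

```python
import re

# A minimal ELIZA-style responder (illustrative only, not Weizenbaum's
# DOCTOR script): match a keyword pattern in the user's statement, swap
# first-person words for second-person ones, and slot the user's own
# words into a canned response template.
PRONOUN_SWAPS = {"i": "you", "my": "your", "am": "are", "me": "you"}

RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
    (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
    (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
]

def reflect(fragment: str) -> str:
    """Swap pronouns so the user's words read back naturally."""
    return " ".join(PRONOUN_SWAPS.get(w.lower(), w) for w in fragment.split())

def respond(statement: str) -> str:
    """Return a templated reply for the first matching keyword rule."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please go on."  # fallback when no keyword matches

print(respond("I feel trapped by my job"))
# → Why do you feel trapped by your job?
```

The program has no model of meaning at all; the echo of the user’s own words is what made conversations with DOCTOR feel convincingly human.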
The Brown Box
In 1966, Ralph Baer designed the Brown Box, which allowed users to play several different games on a standard television set without software, a microprocessor, or a computer. The Brown Box even had a light-gun accessory for shooting games.
The Debut of Star Trek
Star Trek, one of the most popular TV series of all time, debuted in 1966. The series speculated on technologies like machine-supported medical diagnosis, human-computer interaction, handheld computing and communications, and voice recognition. The technologies that were seen in the series impacted generations of technologists, writers, and filmmakers.
Apollo Guidance Computer
The Apollo Guidance Computer (AGC) was designed by engineers and scientists at MIT’s Instrumentation Laboratory during the 1960s. With the AGC, the Apollo spacecraft’s computing hardware shrank from the size of seven refrigerators to a single unit weighing only 70 lbs. The AGC first flew on Apollo 7 in 1968, and a year later it guided Apollo 11 to the Moon. The AGC was one of the earliest computers to use integrated circuits, along with core memory and read-only core rope memory.
SHRDLU is a natural language understanding program developed by Terry Winograd between 1968 and 1970 while working on his Ph.D. thesis at MIT. Unlike earlier programs such as ELIZA, which had no real understanding of English, SHRDLU combined deductive reasoning, semantics, and syntax to respond appropriately to commands.
The Mother of All Demos
Douglas Engelbart and his team at SRI revealed their experimental “oN-Line System” at a San Francisco computing conference in 1968. The 90-minute presentation demonstrated almost all the fundamental elements of modern personal computing: graphics, hypertext, windows, navigation and command input, the computer mouse, video conferencing, dynamic file linking, word processing, collaborative real-time editing, and revision control.
The Stanford Arm
The Stanford Arm was created in 1969 by Victor Scheinman, a mechanical engineering student working in the Stanford Artificial Intelligence Lab (SAIL). It was the first successful computer-controlled, electrically powered robot arm, and it led directly to commercial production.
The First Robot to Embody Artificial Intelligence
In 1970, Shakey the robot was created. Controlled by AI, the robot roamed the halls of SRI, using information about its environment to plan a route. Shakey collected data with a laser range finder and a TV camera and transmitted it to DEC PDP-15 and PDP-10 computers, which sent commands back to the robot over a radio link. Shakey moved at a speed of about 2 meters per hour.
The First Microprocessor
In 1971, Stanley Mazor, Federico Faggin, Masatoshi Shima, and Marcian Hoff invented the world’s first microprocessor – the Intel 4004. The microprocessor used cutting-edge silicon-gate technology, and it marked the start of Intel’s rise to dominance in the global processor industry.
The Invention of The Floppy Disk
The first memory disk, better known as the floppy disk, was introduced by IBM in 1971. It was an 8-inch flexible plastic disk that held 100 KB of data. The disk’s nickname, “floppy,” came from its flexibility.
The C Programming Language
In 1972, the C programming language was released by Dennis Ritchie and his team. The C programming language is still widely used today.
Release of Pong
Al Alcorn, a young engineer, was hired by California entrepreneur Nolan Bushnell to design a car-driving game in 1972. However, the idea was too ambitious for the time, so instead, Alcorn designed a version of Ping Pong. Pong launched the modern video game era and revolutionized the arcade industry.
Development of the Ethernet
In 1973, Robert Metcalfe, a member of the research staff at Xerox PARC, developed Ethernet for connecting multiple computers and other hardware. Ethernet was inspired by ALOHAnet, which Metcalfe had studied as part of his Ph.D. dissertation.
Altair 8800 appears in Popular Electronics
In 1975, the Altair 8800 was featured in the January issue of Popular Electronics, described as the “world’s first minicomputer kit to rival commercial models.” Bill Gates and Paul Allen offered to write software for the Altair in the BASIC language. The same year, after their program ran successfully on the first attempt, the two childhood friends formed their own company, Microsoft.
Completion of Apple-1
On April Fools’ Day in 1976, Steve Wozniak and his friend Steve Jobs released the Apple-1. According to Stanford University, the Apple-1 was the first personal computer built on a single circuit board.
Release of Star Wars
In 1977, another sci-fi classic was released – Star Wars. Star Wars has influenced and inspired many futuristic technologies, some of which now exist while others are under development.
The First Widely Successful Video Game System
In 1977, Atari released the Video Computer System, later known as the Atari 2600. The VCS was designed to connect to a home television set and was built around an 8-bit MOS Technology 6507 microprocessor. More than twenty million units were eventually sold, making the Atari 2600 the first widely successful video game system.
MCA and Philips introduced the LaserDisc in 1978, and it offered better video and audio quality than its competitors. The LaserDisc was a direct predecessor of the CD and DVD, and the movie Jaws was the first LaserDisc title sold in North America.
The Oldest Virtual World
MUD1, or Multi-User Dungeon, went online in 1979. Two students at the University of Essex, Roy Trubshaw and Richard Bartle, wrote a program that allowed many people to play against each other online. It was the first multiplayer online game played over a network, and it became a hit among students as a means of socializing.
VisiCalc was developed in 1979 by programmer Bob Frankston and Harvard MBA candidate Dan Bricklin. VisiCalc, short for “visible calculator,” was the first spreadsheet program for personal computers, and it made business forecasting much simpler.
In the 1980s, Jaron Lanier popularized the term “virtual reality.” In 1985, he founded VPL Research, whose name stood for “Virtual Programming Languages.” It was one of the first companies to develop and sell products related to virtual reality.
The DataSuit was developed by VPL Research circa 1989. It was a full-body suit with sensors for measuring the movement of the trunk, legs, and arms.
The Predecessor to the World Wide Web
Tim Berners-Lee wrote a program called Enquire in 1980. The program made it possible to create typed links between different information nodes, both within a single file and across files. A few years later, Berners-Lee would invent the World Wide Web, partly based on Enquire.
IBM introduces its Personal Computer
IBM Model 5150, or the first IBM PC, was released in 1981. The IBM PC used Microsoft’s MS-DOS operating system and was based on a 4.77 MHz Intel 8088 microprocessor. It was the first PC that gained widespread adoption by the industry, and it was widely cloned.
The First Mass-Produced Portable Computer
In 1981, the Osborne 1 was introduced. The portable computer weighed 24 pounds and had 64 KB of memory, a 5-inch display, and two 5.25-inch floppy disk drives.
The first commercial Compact Disc was released in 1982. The disc contained a recording of Claudio Arrau performing Chopin waltzes.
The Lisa Computer
In 1983, Apple introduced the Lisa, the first commercial personal computer with a graphical user interface (GUI). This was an important milestone: soon, the Apple Macintosh and Microsoft Windows would adopt the graphical user interface.
The Road to Point Reyes
The Road to Point Reyes was produced by Lucasfilm in 1983. The single image, which took a month to render, is one of the most significant still images in computer graphics.
Release of Word
The first version of Word (then called Multi-Tool Word) was launched in 1983. Microsoft distributed 450,000 disks containing a demo version of Word in the November issue of PC World magazine, giving readers a chance to try the program.
Launch of The Macintosh
In 1984, Apple launched the Macintosh, the first successful mouse-driven computer with a graphical user interface. The price of the computer was $2,500.
In 1984, Fujio Masuoka invented flash memory while working for Toshiba. Flash memory quickly gained a following in the computer memory industry because it could be erased and reprogrammed many times.
Publication of the C++ Programming Language
A computer programming book called The C++ Programming Language was published in 1985. The book was written by the language’s creator, Bjarne Stroustrup.
The Nintendo Entertainment System
In 1985, Nintendo released the Nintendo Entertainment System (NES) in North America, a redesigned version of its Japanese Famicom (Family Computer) console.
The First Dot-Com Domain
A milestone in Internet history was marked when a small Massachusetts company called Symbolics Computer Company registered the first dot-com domain name in 1985. The domain name was Symbolics.com.
The Morris Worm
The first worm that had a significant effect on real-world computer systems was released in 1988. The worm was sent by 23-year-old Robert T. Morris, the son of a computer expert for the NSA. The worm caused major problems for about 6,000 hosts, and the problems lasted for days. Morris was the first person to be convicted under the “Computer Fraud and Abuse Act,” and the event publicized the importance of network security.
The Game Boy Handheld Game Console
In 1989, Nintendo released the Game Boy, its 8-bit handheld game console. The puzzle game Tetris helped the console rise to popularity, and more than one hundred million Game Boys were sold over nearly twenty years.
World Wide Web Is Born
Tim Berners-Lee invented the World Wide Web while he was working at CERN in 1989. The web was initially developed to meet the demand for automated information-sharing between scientists in institutes and universities worldwide.
Release of Photoshop
Photoshop was created by the brothers Thomas and John Knoll in the late 1980s. Adobe Systems licensed it in 1988 and released it to the public in 1990. The first version included digital color editing and retouching tools.
In 1991, Thomas A. DeFanti, Daniel J. Sandin, and Carolina Cruz-Neira of the Electronic Visualization Laboratory created the CAVE (Cave Automatic Virtual Environment). It was the first cubic immersive room, allowing people to see their own bodies in relation to others in the room.
In 1992, Louis Rosenberg created the virtual fixtures system using a full upper-body exoskeleton, which enabled a physically realistic mixed reality in 3D. The system produced the first true augmented reality experience enabling touch, sound, and sight.
JPEG Standard Introduced
In 1992, the Joint Photographic Experts Group (JPEG) published a set of rules for compressing digital images that became the JPEG (or .jpg) format. JPEG is the format most commonly used by digital cameras.
SanDisk introduced CompactFlash in 1994, and it quickly became the preferred memory storage option for many customers. Even though it was a bit larger than other memory card formats, its high capacity made it a popular choice.
Digital transformation has been one of the most notable and persistent large-scale trends shaping the modern era. Playing a game, watching a film, listening to music, making a payment, buying a product, booking a flight, or ordering a cab: humans can now do all of these things remotely.
In the future, technological innovation will lead to long-term gains in productivity and efficiency.
The possibilities created by billions of humans connected by mobile devices, with extraordinary access to knowledge, storage capacity, and processing power, are unlimited. These possibilities will be multiplied by emerging advances in fields like 3-D printing, AI, robotics, nanotechnology, quantum computing, and energy storage.