Hi and welcome to another awesome video by 365 Data Science, where we take interesting ideas related to data, technology, business, and careers, and turn them into unique and intriguing content for your enjoyment.
So, without delay, let’s get started.
Nowadays, it feels like every week another ground-breaking invention or idea is revealed.
But how did we get here?
In this video we’re going to look at the evolution of technology from the very beginning.
We’ll go through some of the most influential devices and concepts that have led us to the
techno-fuelled world we live in today.
Our time travel journey starts pretty much when we did, before selfies, trolling, open-world
gaming, and cheesy infographic videos, way back in 35,000 BC with the first recorded
example of counting.
It all began with a simple tally on a bone – the Lebombo bone.
Talk about upcycling!
More conveniently, papyrus paper was invented around 3,000 BC for record-keeping, and the abacus around 2,300 BC for sums.
These had evolutions of their own over the next couple of millennia, reaching the form we know today by around 100 CE. This was also the time Hindu-Arabic numerals were developed – not the first written numerals, but ones that allow for complex mathematics.
Jumping to the second millennium, we have the development of the torquetum – a complex analogue computer used to measure astronomical coordinates and the basis for all modern astronomical instruments. The torquetum was the first device used for the observation of Halley’s Comet.
Equally terrifying and amazing, in 1206 came one of the first examples of the concept of automation with Al-Jazari’s programmable automata, which helped fuel the ideas of mechanical humans and artificial intelligence.
Recently, evidence was discovered showing that the first mechanical calculator was conceived
in 1502 by none other than Leonardo da Vinci – who apparently invented everything.
However, nothing came of it, and after a few manual calculation systems, the first mechanical calculator was realised as Pascal’s calculator in 1642.
Shortly after, we got the binary system.
But it wasn’t until the 1800s that technology was advanced enough to take advantage of it.
Starting with the Jacquard Loom, which used punch cards to automatically weave designs,
then leading to Babbage’s concept of the difference engine and later the analytical
engine.
Unfortunately, these last two projects weren’t realised due to a lack of funding, but that didn’t stop the world’s first computer programmer, Ada Lovelace, from seeing how they could work and writing the first ever algorithm.
Then we literally captured lightning in a bottle and harnessed electricity; the census was performed with a tabulating machine for the first time; the triode was invented – essential for the development of television and radio; we came up with the idea of a thinking machine – the first ever robot, in the sci-fi film “Metropolis”; and the Turing machine was conceptualised.
All this played a huge part in the creation of the first ever electronic computer in 1939 – the Atanasoff-Berry Computer.
Work on the Atanasoff-Berry was discontinued due to World War II, but other computers built for the war effort, like the Colossus and the Zuse Z3, were developed.
During the war a model for computational neural networks was created.
After the war had ended, we got the ENIAC – the grandfather of digital computing – the UNIVAC for business and government, the transistor, and the first thinking machines in the SNARC and the IBM 701. In fact, the IBM 701 was the first computer to display AI capabilities, as it learned to play checkers.
The 50s saw another step forward in machine learning with the Logic Theorist – a program designed to mimic the problem-solving skills of a human and dubbed the first AI program.
This decade also sees the integrated circuit, which in the 60s starts being used in digital computers. This gives us the first ever minicomputer – the PDP-1 – which, in turn, creates the need for the mouse and the graphical user interface. We also got ELIZA, the first ever chatbot.
The end of the 60s saw programmable calculators, supercomputers, operating systems, and the introduction of ARPANET – the internet’s humbler and less complex older sibling – and of course the Apollo Guidance Computer.
The 1970s saw these technologies become commercial, and with that came a flood of new developments – the Honeywell 316, the Canon Pocketronic, the first DRAM and microprocessors, the iconic floppy disk, the first humanoid robot, and the first commercially available microcomputer.
The decade saw plenty of improvements to the technology, including Microsoft’s first programming language and IBM’s and Apple’s original entries into desktop computing.
The end of the 70s saw the first automated vehicle – the Stanford Cart – and led us into the 80s, where technology was quickly getting smaller, cheaper, and more powerful. We also find the development of backpropagation and the CD-ROM.
During this time, tech is becoming so compact that the Gavilan SC is marketed as a ‘laptop’ computer. AI is also making leaps and bounds with Cyc, a project aiming to build a knowledge base of basic concepts, ‘rules of thumb’, and common sense to help AI applications perform human-like thinking.
In the following 5 years we get handheld computers, AI that teaches itself to speak, the first versions of Windows and Excel, plenty of supercomputers, laptops, desktops, and processors, and in 1989 we get the World Wide Web.
The web and some big leaps in processing power in the early 90s brought PCs into the realm
of entertainment devices.
This decade introduced us to PC gaming, DVDs, and internet browsers.
We got some of the first mobile phones from Nokia that fit in our hands, and supercomputers – like Deep Blue – that are now able to beat human chess champions.
And let’s not forget Skynet, I mean Google.
For the next decade we see the development of various computers, hard drives, robots, processors, and USB, which is still in use today.
In the mid-2000s we get our first 64-bit processor and dual-core CPU – a huge jump in the speed and power of computing. Speaking of speed, in 2002 we got the first commercial maglev (short for magnetic levitation) train.
The world is also introduced to YouTube and thus the innate human need for on-demand cat
videos was satisfied.
With the huge growth and commercialisation of technology in the last decade or two, data
is everywhere.
That’s why, in 2005, the term ‘big data’ was coined.
Apple shows its power in the tech field with the release of the MacBook Pro, while the
iPhone brings smartphones to the mainstream.
In the late 2000s IBM shows off its dominance in the realm of the supercomputer by having the fastest in the world with the Blue Gene, then bettering it themselves with the Roadrunner.
Blockchain arrives in force in 2008 to shake up the financial sector, among others.
China and Japan start to push back against IBM in a battle over who has the most powerful supercomputer, with the Tianhe and the K computer, respectively.
This leads to some of the biggest jumps in computing power.
We also see another show of AI’s growth with Siri and with Watson, which wins Jeopardy! against human opponents.
And it’s not just on Earth where things are getting more advanced, with the Mars rover showing the rest of the universe what we’re capable of.
To exemplify how far we’ve come to this point, in 2012 and 2013 we have supercomputers that reach petascale speeds, and on our wrists we have computational devices millions of times faster than the computers that sent a rocket to the moon.
And did I mention that our rockets can now land themselves – with a lot more computing power than a watch, mind you?
By 2016 we’ve officially reached sci-fi territory with the seemingly self-aware robot Sophia, 3D-printed prosthetic limbs, and realistic virtual reality.
At this point we’re looking at molecular computing – because humanity loves making small tech… especially things like almost paper-thin laptops with incredibly fast processors and cancer-fighting nanobots.
And because we also like to show off, we have the fastest supercomputer at 200 petaflops with Summit.
New grounds in travel and AI are combined in the Model S. And then there’s Atlas, who puts many of us to shame with his precision parkour.
We really are in amazing times.
But where do you think we’ll be heading next?
What will be the next big thing?
Are we maybe going too fast? Our ancestors had much more time to process and learn from new developments.
We’d like to open up a discussion in the comments and get your thoughts on the future
of technology.
And, of course, if you enjoyed this video and would like to see more – like, subscribe, and hit the notification bell.