So, take a look at this.

This is ENIAC, the world's first electronic, large-scale,

general purpose digital computer,

built right here at the University of Pennsylvania, in the 1940s.

This is actually just one part of ENIAC.

ENIAC was huge: it took up three sides of a big room,

and weighed over 30 tons.

Now, my office is literally right up the hall,

and whenever I bring guests and visitors here,

they always want to see ENIAC.

They always ask me, "What is that piece of hardware lying on the floor?"

Well, ENIAC stored numbers using decimal digits,

that is, digits from zero to nine.

That piece of hardware stored one decimal digit.

It's not that each tube stored a number from zero to nine;

instead, the whole thing stored a single digit.

This is my cell phone.

It holds billions of digits.

I'm sure the people who built ENIAC

never thought we'd be carrying computers in our pockets less than 70 years later.

So, we say that ENIAC was the first computer,

but in the time leading up to that,

there were many scientists,

engineers, and mathematicians who made

super important contributions to the development of what we now think of as a computer.

So in this lesson,

we'll meet some of those people,

answer the question of who invented the computer,

and then see how that relates to our understanding of computational thinking.

The early years of computing were driven by

the need to perform mathematical calculations at scale.

In fact, "computer" was originally a job title, dating back to the 17th century.

The first concept of a programmable computer,

that is, a programmable machine,

is often attributed to British mathematician Charles Babbage.

He conceptualized two mechanical devices.

The first was called the Difference Engine,

which was used to calculate complicated mathematical tables,

which until that point were calculated by hand.

From this, he proposed a design for a more general-purpose machine, which he called

the Analytical Engine. It had basic features that we're familiar with today,

such as memory storage,

a central processing unit,

and even a printer.

He theorized that such a device could understand commands

and be programmed much like a modern-day computer, well

over 100 years before the first computers were actually built.

One of Babbage's contemporaries, and briefly his mentee,

was Ada Lovelace, another British mathematician,

who is credited with being the first computer programmer for

formalizing many computing concepts still used today.

Some of her key ideas included repeated instructions or looping,

as well as how code could be created to handle letters and symbols along with numbers.

From there, she theorized that in addition to crunching numbers,

Babbage's analytical engine could process graphics and even compose music.

These ideas were just a few of those documented in a series of notes she

provided on Babbage's machine designs in the mid-19th century.
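Lovelace's idea of repeated instructions maps directly onto the loops in any modern language. Here's a minimal Python sketch (a modern illustration, not anything Lovelace actually wrote), showing a single loop body that handles letters and symbols just as readily as numbers:

```python
# One loop body, applied to numbers, letters, and symbols alike.
results = []
for item in [1, 2, "A", "B", "+"]:
    results.append(str(item) * 2)  # repeat each item's text twice

print(results)  # -> ['11', '22', 'AA', 'BB', '++']
```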

While we could say that Babbage devised the hardware,

and Lovelace the software,

for the early concept of a computer,

English mathematician George Boole

provided the theoretical underpinnings as the inventor of Boolean algebra.

Boolean algebra is the study of operations on logical values,

that is, the kinds of operations that can be performed using the values true and false.

The most common Boolean operators are AND, OR, and NOT.

We'll see these as we go through the remainder of the course.
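As a quick illustration, here's a minimal sketch in Python, which spells these operators `and`, `or`, and `not`:

```python
# The three most common Boolean operators, applied to the values True and False.
a, b = True, False

print(a and b)  # AND: True only when both operands are True -> False
print(a or b)   # OR: True when at least one operand is True -> True
print(not a)    # NOT: inverts the value -> False
```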

Boolean algebra is important in computer science,

because it forms the theoretical foundation of digital logic circuits,

which are the basic building blocks of a modern computer.

In addition, Boolean algebra has a number of

mathematical applications such as in set theory,

which is important in machine learning and artificial intelligence.

The next leap in computation,

was enabled by the establishment of the electric power system in the 1880s,

and driven primarily by the demand for deciphering encoded messages

during the World Wars.

American mathematician Claude Shannon related Boolean algebra

to electronic logic networks,

and electrical devices such as relays and switches.

This work, documented in his MIT master's thesis,

became the seminal paper that essentially founded practical digital circuit design.

Among the many other contributions he made,

he also founded the mathematical theory of cryptography.

His work primarily focused on reconstructing transmitted information at a target point.

Given this work on communication operations and signal processing,

Shannon is primarily known as the father of

information theory, which is the study of the quantification,

storage, and communication of information.
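To give a flavor of what "quantifying information" means, here's a hypothetical sketch (not from the lecture) of Shannon entropy, which measures the average number of bits of information carried per symbol:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries exactly 1 bit of information per flip.
print(entropy([0.5, 0.5]))  # -> 1.0

# A biased coin is more predictable, so each flip carries less information.
print(entropy([0.9, 0.1]))  # -> about 0.469
```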

Across the Atlantic at that time was British mathematician,

logician, and cryptographer Alan Turing,

who is considered to be one of the fathers of modern computer science.

Turing provided an influential formalization of what's known as the Turing machine.

In his seminal work,

"On Computable Numbers, with an Application to the..."

I'm not going to try to pronounce that.

But it was a mathematical model of a hypothetical idealized computing machine,

and led to the concepts we now think of as algorithms and computation.
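As a rough sketch of the idea (a hypothetical toy example, not Turing's original formulation): a Turing machine is a tape, a read/write head, and a table of state-transition rules. Here's a tiny one-state machine that flips every bit on its tape:

```python
def run_turing_machine(tape):
    """A minimal Turing machine: one state that flips each bit,
    moving right until it runs off the end of the input tape."""
    tape = list(tape)
    head, state = 0, "flip"
    # Transition table: (state, symbol read) -> (symbol to write, move, next state)
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
    }
    while head < len(tape) and (state, tape[head]) in rules:
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape)

print(run_turing_machine("1011"))  # -> "0100"
```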

During World War Two,

he was a chief cryptanalyst,

working on breaking German ciphers, particularly the Enigma machine,

and also served as Director of the Naval Enigma section at Bletchley Park.

After the war, he designed one of the earliest electronic programmable digital computers,

and built another early machine at the University of Manchester.

He also, among many other things,

made significant contributions to the discussion on machine intelligence.

He devised the Turing test,

as a test of a machine's ability to exhibit intelligent behavior,

equivalent to or indistinguishable from a human's.

Also during this period

was American Rear Admiral

Grace Brewster Murray Hopper,

who was an early computer programmer,

and the developer of the first compiler for a computer programming language.

A compiler is a program that takes source code, which is a program written by a human,

and converts it into something that the computer can

understand based on the computer's hardware.
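As a rough illustration of the idea (a sketch using Python's built-in tools, not Hopper's compiler): even Python translates human-readable source text into lower-level instructions before it runs:

```python
import dis

# Human-readable source code, as a plain string.
source = "x = 2 + 3"

# The compile step translates it into bytecode instructions
# that the Python virtual machine can execute.
code = compile(source, "<example>", "exec")

# Show the low-level instructions (e.g. LOAD_CONST, STORE_NAME).
dis.dis(code)
```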

The compiler she wrote was known as the A-compiler,

and the first version was A-0.

Later versions were released commercially as the ARITH-MATIC,

MATH-MATIC, and FLOW-MATIC compilers.

She also heavily influenced the development of COBOL,

along with many other computing innovations.

Prior to this, she was one of the first three programmers of the Mark I,

the first electromechanical computer

built in the United States, by IBM.