Short History of Computing

Are some graphs better than others? Probably not, especially because one graph can represent any other. Such universality is a tradition in computing.

# Graphs

Ada Lovelace is sometimes called the first programmer because she recognized she could give words (and bit patterns) any meaning.

When we say there are nodes and relations, we've asserted a structure.

When we attach words to the nodes and relations, it becomes arbitrarily general.
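
A minimal sketch of this in Python, with node and relation names invented for illustration: the structure is just nodes and labeled relations, and the labels carry whatever meaning we give them.

```python
# A graph: nodes, plus relations labeled with arbitrary words.
# The structure is fixed; the meaning comes from the labels we choose.
edges = [
    ("Lovelace", "wrote about", "the Analytical Engine"),
    ("the Analytical Engine", "was designed by", "Babbage"),
]

def neighbors(node):
    """Navigate: follow any relation leaving this node."""
    return [(label, target) for source, label, target in edges if source == node]

print(neighbors("Lovelace"))
# [('wrote about', 'the Analytical Engine')]
```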

Structure is more about navigation than meaning.

A graphic designer can suggest structure (and enable navigation) by proximity and alignment.

In the computer we use punctuation. That is, we reserve some bit patterns to mean something other than words.
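
For example, ASCII reserves bit patterns like 0x28 for ( and 0x2C for , and a program can use them to recover structure from a flat run of bytes. A sketch, with an invented little notation:

```python
# The same stream of bytes, but some bit patterns are reserved:
# 0x28 '(' 0x2C ',' 0x29 ')' mark structure rather than words.
text = "likes(alice,bob)"

relation, rest = text.split("(", 1)
nodes = rest.rstrip(")").split(",")
print(relation, nodes)   # likes ['alice', 'bob']
```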

# Representation

In compilers one often distinguishes syntax from semantics. Structure vs meaning is a similar distinction.

What computers can and can't do falls into automata theory. wikipedia

Better than this is the foundational insight due to Lovelace. wikipedia

There are many levels of abstraction in the computer I am using. Let me see if I can name a few.

My wall plug provides the power required to establish regions of more or less charge. This charge lingers, making it a memory.

The charge in one spot can vary the resistance in another spot. From this we can assemble decision-making networks.

If we interpret lots of charge as logic true and less charge as false, the decisions resemble logic gates such as AND and OR gates.
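
A sketch of that interpretation, with an invented cutoff for what counts as "lots" of charge:

```python
THRESHOLD = 0.5  # invented cutoff: "lots of charge" vs "less charge"

def bit(charge):
    """Interpret an analog charge level as logic true or false."""
    return charge > THRESHOLD

def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

print(AND(bit(0.9), bit(0.8)))  # True
print(OR(bit(0.1), bit(0.2)))   # False
```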

If we interpret true as 1 and false as 0 and arrange our regions of charge by powers of two we can represent numbers.
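
The same move in code, a sketch: read true as 1 and false as 0, and weight each position by a power of two.

```python
def to_number(bits):
    """bits[0] is least significant: weight position i by 2**i."""
    return sum((1 if b else 0) * 2**i for i, b in enumerate(bits))

# Regions of charge read as [True, False, True, True] -> 1 + 4 + 8
print(to_number([True, False, True, True]))  # 13
```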

If we amass a lot of this storage and then hook it up to retrieval logic, we can build random access memory.
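
A sketch of that retrieval logic, assuming a simple flat address space; real RAM decodes the address bits into select lines, but the effect is the same:

```python
class RAM:
    """Storage cells plus retrieval logic: an address selects one cell."""
    def __init__(self, size):
        self.cells = [0] * size

    def read(self, address):
        return self.cells[address]

    def write(self, address, value):
        self.cells[address] = value

memory = RAM(16)
memory.write(7, 42)
print(memory.read(7))  # 42
```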

If we amass a lot of logic gates that perform arithmetic and logic operations, we can build an arithmetic logic unit.
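
One way to see that assembly, sketched in one-bit pieces: a full adder is nothing but XOR, AND, and OR gates, and chaining adders gives arithmetic.

```python
def full_adder(a, b, carry_in):
    """One bit of arithmetic built entirely from logic gates."""
    s = (a ^ b) ^ carry_in                      # XOR gates
    carry_out = (a & b) | (carry_in & (a ^ b))  # AND and OR gates
    return s, carry_out

def add(xs, ys):
    """Ripple-carry adder over bit lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

print(add([1, 0, 1], [1, 1, 0]))  # 5 + 3 = [0, 0, 0, 1] = 8
```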

If we interpret some numbers as instructions and feed them into decoding logic to control the arithmetic logic unit, then we will have a stored program digital computer.
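
A sketch of that loop, using an invented three-instruction machine: the numbers in memory drive the decoding logic, which drives the arithmetic.

```python
# Invented toy instruction set: (opcode, operand) pairs stored in memory.
LOAD, ADD, HALT = 0, 1, 2

program = [(LOAD, 5), (ADD, 3), (ADD, 2), (HALT, 0)]

def run(memory):
    """Fetch, decode, execute: the stored-program loop."""
    accumulator, pc = 0, 0
    while True:
        opcode, operand = memory[pc]   # fetch
        pc += 1
        if opcode == LOAD:             # decode and execute
            accumulator = operand
        elif opcode == ADD:
            accumulator += operand
        elif opcode == HALT:
            return accumulator

print(run(program))  # 10
```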

We might say we are using the von Neumann architecture because he was an early advocate for this structure.

# Computers

There has been a lot of packaging innovation since von Neumann, but the basic architecture is largely unchanged.

I have a friend who built a computer out of relays that he bought from Radio Shack. He shared his instruction decoding with me, so I built a simulator in Perl. post

I've now covered 100 years of computing, from Lovelace to von Neumann.

The Apollo moon landing needed a small, light, and powerful computer.

Charles Draper had designed the guidance computer for the Polaris missile but a moon landing was a much bigger project.

Draper's team sought out the manufacture of integrated circuits, multiple gates on a single silicon substrate. This made the Apollo Guidance Computer possible.

Draper's team also invented a mathematical language suitable for guidance computations and wrote an interpreter for it that ran on the AGC.

# Industry

It's just starting to get interesting.

Intel put all the parts on one substrate with the 4004. Then came Motorola's 6800, and then MOS Technology's 6502, which Woz turned into the Apple I.

BASIC and VisiCalc made this class of machines useful, themselves written as interpreters on top of interpreters.

Displays have had their own history: CRTs, light pens, liquid crystals, GPUs, touch pads, all programmed with multiple levels of interpretation.

Signaling over a distance has had an awesome history too, from Morse, to Armstrong, to packet switching, flow control, backbones, and NICs.

All this is what makes the super collaborator super.