Discussion about this post

Dom Robinson

Hey chief, I saw @Andy Beach tag this, so I was happy to find your stacks.

I’d gently like to contribute a bunch of stuff to this. I studied AI at university when the subject was 50 years old (early 90s), and in reality not much has happened in AI since.

Transformers (2017) were genuinely novel, but most of what’s happened since is scaling - more parameters, more data, more compute. That’s not the same as fundamental advances in how machines think or understand.

What you are really chronicling is the US history of the GPU.

To make it a history of AI, where’s:

• Turing and Colossus (UK, 1940s)

• Cybernetics and Norbert Wiener

• Ivakhnenko, who gets only a brief mention while the Soviet/Eastern European contributions are glossed over

• British computing generally

The article conflates computational throughput with intelligence advances. Each era is framed as progress toward current LLMs, but it’s really just: “We got GPUs, then TPUs, then bigger clusters, now we have ChatGPT!”

This is why I think the modern phenomenon should be termed EI (extended intelligence) rather than AI, which would be better described as ‘autonomous intelligence’ — and we are absolutely nowhere nearer that than we were 80 years ago, IMHO.

Rainbow Roxy

Brilliant. What if Swift's "thinking machine" had had more direct influence on early engineers? We might've seen actual AI concepts emerge even faster than the ABC's foundational work.

2 more comments...
