The Mystery of Consciousness

Antonio Damasio & David Chalmers,

A panel with Antonio Damasio, David Chalmers, and Marcelo Gleiser
When one does the job of science, which is to break things apart in order to understand the mechanism, we have to be very careful to put the mechanism back

Neuroscientist Antonio Damasio and philosopher David Chalmers engage in a wonderful and nuanced conversation about the nature of consciousness.

The Mind Bleeds Into the World

David Chalmers,

 David Chalmers presenting The Mind Bleeds Into the World
A virtual world is just as real as a physical world

Philosopher David Chalmers gives an introduction to the philosophy of virtual reality, the simulation hypothesis and artificial intelligence. See the article on Edge for interesting comments from scientists and philosophers.

Programming and Scaling

Alan Kay,

Philosophy of computing
Alan Kay presents Programming and Scaling
I’m showing you this just because everybody should understand the history of our field. If we are physicists and we fail to understand what Newton did, we should be shocked, but in fact in computing, we’re very happy not to understand what most of the founders of our field actually did.

Alan Kay tells the overlooked history of computing, and sketches an ambitious vision of computing in which we deal with high-level meaning instead of tiny instructions.

Interview with Bill Atkinson

History of computing
interview with Bill Atkinson
HyperCard was inspired by an LSD trip.

Wonderful conversation with Bill Atkinson, a member of the original Apple Macintosh team and the creator of HyperCard.

It’s interesting to note that many great inventors in computing (Alan Kay, Dan Ingalls, Atkinson, etc.) had a deep interest in how the mind works, and saw the computer as a tool to augment our cognitive abilities. Bill Atkinson wanted to do research in neuroscience, but Steve Jobs convinced him to join Apple before he could finish his PhD.

Back to the Future of Software Development

Alan Kay,

History of computing
Alan Kay presents Back to the Future of Software Development
We live in a world of applications. But if you think about it, applications are one of the worst ideas anybody had, and we thought we got rid of them at PARC in the 1970s, because applications draw a barrier around your ability to do things. What you really want is to gather the resources you need to you and make the things that you want out of them.

In every one of his talks, Alan Kay carves one face of a multi-faceted diamond that nobody has seen whole. All we know is that the coveted gem is a computer revolution that hasn’t quite happened yet.

Alan Kay Keynote

SCI X Utah,

History of computing
Alan Kay's keynote at SCI X 2012 at the University of Utah

Alan Kay talks about the overlooked history of computing. One of the forgotten ideas is that the history of computing is intertwined with systematic thinking about biology. Early computing pioneers like Alan Turing were thinking of biology in computational terms. The computing community back then saw an intimate relation between computing processes and biological systems, which inspired the invention of Object-Oriented Programming as a system of many objects exchanging messages. The OOP of today bears little resemblance to the original idea.
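Kay's original conception, in which objects behave more like biological cells exchanging messages than like bundles of procedures, can be sketched in a few lines. This is a toy illustration of message passing (the names `Cell`, `Counter`, and `send` are my own, not from any real system), not Smalltalk:

```python
# Toy sketch of message-passing "objects": each object interprets
# messages for itself, and all interaction goes through send().

class Cell:
    def __init__(self, name):
        self.name = name

    def receive(self, message, *args):
        # The object decides how to respond; an unknown message is
        # handled gracefully instead of crashing the system.
        handler = getattr(self, "on_" + message, None)
        if handler is None:
            return f"{self.name} does not understand '{message}'"
        return handler(*args)

class Counter(Cell):
    def __init__(self, name):
        super().__init__(name)
        self.value = 0

    def on_increment(self, amount=1):
        self.value += amount
        return self.value

def send(obj, message, *args):
    # The only way objects interact: sending a message,
    # never reaching into each other's internals.
    return obj.receive(message, *args)

c = Counter("c1")
send(c, "increment")
send(c, "increment", 4)
print(send(c, "increment"))  # → 6
print(send(c, "reset"))      # → c1 does not understand 'reset'
```

The point of the sketch is that behavior lives entirely behind the message interface, which is closer to Kay's "objects as cells" than to today's class-and-method-call OOP.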

Fun to Imagine

Richard Feynman,

Interview with Richard Feynman
The world is a dynamic mess of jiggling things. If you magnify it, you can hardly see anything anymore, because everything is jiggling in its own pattern, and there are lots of little balls and... It’s lucky that we have such a large scale of view of everything that we can see them as things, without having to worry about all these little atoms all the time.

An hour-long interview with physicist Richard Feynman in which he explains various natural phenomena with his unique humor and style.

Artificial Intelligence and the Future

Demis Hassabis,

Artificial intelligence
Demis Hassabis presents Artificial Intelligence and the Future at the Royal Society of Arts in London
For a true thinking machine to achieve high-end tasks, it needs to be grounded in a sensory-motor reality.
Opposed to that, there are logic-based systems. The problem with those systems is when they interact with the outside world, they find it very difficult to map the logic to these real-world messy situations. A lot of the AI research that went on in the 1980s and 1990s was logic-based. They couldn’t encode things like common sense because they weren’t ultimately grounded in perceptual experience.

Demis Hassabis from DeepMind shares his approach to artificial intelligence. Instead of building disembodied symbolic systems, DeepMind trains sensory-motor grounded agents inside virtual environments.

Mechanical Computer

US Navy,

Computing hardware
Three-dimensional cam of a mechanical computer

This amazing video explains how we can do computations with ingenious physical arrangements. It makes computation feel tangible.
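The three-dimensional cam in the video physically encodes a function of two variables: the follower's height at a given rotation angle and axial position gives f(x, y). A software analogue (my own illustration, with an assumed function and grid, not taken from the video) is a sampled grid with bilinear interpolation:

```python
# A 3-D cam is, in effect, a physical lookup table: for inputs
# (rotation angle x, axial position y) the follower height reads
# off f(x, y). This sketch mimics that with a sampled surface.

def make_cam(f, xs, ys):
    # "Machine" the cam: sample f on a grid, as the cam's surface does.
    surface = [[f(x, y) for y in ys] for x in xs]

    def follower(x, y):
        # Find the grid cell containing (x, y).
        i = next((k for k in range(len(xs) - 1) if xs[k + 1] >= x), len(xs) - 2)
        j = next((k for k in range(len(ys) - 1) if ys[k + 1] >= y), len(ys) - 2)
        tx = (x - xs[i]) / (xs[i + 1] - xs[i])
        ty = (y - ys[j]) / (ys[j + 1] - ys[j])
        # Bilinear interpolation between the four surrounding samples,
        # like the follower riding the machined surface between cuts.
        top = surface[i][j] * (1 - ty) + surface[i][j + 1] * ty
        bot = surface[i + 1][j] * (1 - ty) + surface[i + 1][j + 1] * ty
        return top * (1 - tx) + bot * tx

    return follower

cam = make_cam(lambda x, y: x * y, [0, 1, 2, 3], [0, 1, 2, 3])
print(cam(1.5, 2.0))  # → 3.0 (exactly x * y here)
```

The "machining" step happens once; afterwards every evaluation is just reading the surface, which is why these cams could compute continuously in real time with no electronics.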

How Differential Gear works

General Motors,

explanation of differential gears by General Motors
We will put in more spokes.

Not a talk, but oh did I smile when I saw the reconstruction of a differential gear from first principles.
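The relation the video builds up to is that an ideal open differential turns its carrier at the average of the two wheel speeds, so one wheel can speed up exactly as much as the other slows down in a turn. A quick numeric sketch of that relation (my own, assuming ideal gears):

```python
# Ideal open differential: carrier speed = (left + right) / 2.
# With the engine driving the carrier at a fixed speed, the wheels
# are free to differ as long as their average stays constant.

def carrier_speed(left_rpm, right_rpm):
    return (left_rpm + right_rpm) / 2

def inner_wheel_speed(carrier_rpm, outer_rpm):
    # Same relation, rearranged for the other wheel.
    return 2 * carrier_rpm - outer_rpm

print(carrier_speed(100, 100))      # → 100.0 (straight ahead)
print(inner_wheel_speed(100, 110))  # → 90 (turning: outer gains 10, inner loses 10)
```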

Doing with Images Makes Symbols

Alan Kay,

History of computing
Alan Kay presents Doing with images makes symbols in 1987
Find a context that will do most of your thinking for you.

Alan Kay covers the history of graphical user interface, personal computers and object oriented programming starting from the 1960s, and explains the philosophy of computing and education behind the work of the pioneers.

Principles of Lighting and Rendering

John Carmack,

Computer graphics
John Carmack talks about the principle of lighting and rendering in computer graphics

John Carmack talks about the hard science that goes into rendering graphics on a computer screen. Extraordinarily dense and thorough overview of the physics of light, followed by a presentation of the algorithmic solutions used in computer graphics.

The Dragon Speech

Chris Crawford,

Game design
Chris Crawford presents The Dragon Speech at the 1992 Game Developers Conference
For truth! For beauty! For art! Charge!

This talk by Chris Crawford has become a manifesto for computer games as an art form. Crawford is chasing a dream he calls the dragon: the day computer games become a viable medium for artistic expression. "What is special about computer games?" asks Crawford. "It’s the interactivity." Interactivity is transformative because that’s how the brain works: we learn not as passive receptacles of knowledge, but as active agents in an environment. For the first time in history, we have a technology that can automate interactivity.

Interview with Alan Kay

University of Paderborn,

Philosophy of computing
Photo of Alan Kay by Mark Heinemann
The best way to predict the future is to invent it

Another take on Alan Kay’s deep ideas in computing and education. This interview was given for the opening of a university building in Germany featuring his famous quote.

Deep Interaction

Karl Fast,

User interface
Karl Fast presenting Deep Interaction

What is thinking, and what does the extended-mind theory mean for human-computer interface design?

Karl Fast discusses what we imagine thinking to be. If we consider thinking to be what happens inside our heads, we limit our ability to build new tools that extend what we can think. Instead, we should view thinking as an interaction we maintain with our environment: building proper environments with richer tools shapes our thinking. Take a jigsaw puzzle as an example: merely looking at the pieces isn’t enough to solve even a relatively simple puzzle, but if we manipulate them with our hands and constantly update what we see, we can solve fairly complicated ones.

Inventing on principle

Bret Victor,

User interface
Bret Victor presenting Inventing on Principle

Brilliant talk in which Bret Victor presents ideas about interactive programming. But more importantly, I think this work articulated latent intuitions about interactive media, and later brought attention to little-known computing pioneers whose ambitious and romantic vision of computing has yet to come true.