In 1936, a young PhD student named Alan Turing came up with a mathematical model of a computing device that, in his view, perfectly pinned down what "computation" means. 86 years later, his model is still used as a tool for reasoning about arbitrary computation. This lecture explores a slight variation on Turing's original idea and gives our first glimpse of what makes these machines so extraordinary.
Links