Tony Mann from the University of Greenwich has been appointed Visiting Professor of Computing Mathematics at Gresham College, London. In this role he will deliver a series of free public lectures looking at the mathematics of computing, and the computing of mathematics. The lectures will consider what can go wrong, how computers sometimes get the wrong answer, and the ingenuity mathematicians have used in overcoming these inherent problems. Since Gresham Professors such as Henry Briggs, Edmund Gunter and, more recently, Louis Milne-Thomson were pioneers in the mechanisation of computation, he is especially pleased to address these subjects at Gresham College.
The three lectures, in February, March and April next year in London, will be:
Arithmetic by Computer and by Human – Monday 4th February 2013, 6pm
Long multiplication, long division and logarithms are, for many, only dimly remembered, and few now use these skills. We will examine some of the tools that help us, from the abacus to calculators. Computers are the ultimate arithmetic tool, but their method is one of the oldest, used by the ancient Egyptians. We will demonstrate ways to impress friends with quick calculations.
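The ancient method alluded to here is presumably Egyptian multiplication by repeated doubling and halving, which is essentially how binary hardware multiplies today. A minimal sketch (the function name is illustrative, not from the lecture):

```python
def egyptian_multiply(a, b):
    """Multiply two non-negative integers the ancient Egyptian way:
    repeatedly halve one number and double the other, adding the
    doubled value whenever the halved number is odd. This is the
    same as adding shifted copies of a for each set bit of b."""
    total = 0
    while b > 0:
        if b % 2 == 1:   # this doubling is "kept"
            total += a
        a *= 2           # double one operand
        b //= 2          # halve the other
    return total

print(egyptian_multiply(13, 238))  # 3094, same as 13 * 238
```

Only addition, doubling and halving are needed, which is why the scheme survived from Egyptian scribes to modern circuitry.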
How computers get it wrong: 2 + 2 = 5 – Monday 4th March 2013, 6pm
When lives depend on calculations, human error can kill. From the early days of computing, one problem has been that computers generally work with a fixed number of digits, creating rounding errors. Chaos Theory has shown new ways in which computer arithmetic can give misleading results. Such problems are not just theoretical – it is said that one programmer became rich on the fractions of a penny lost in rounding errors!
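The fixed-precision problem is easy to see for yourself. A brief illustration of the kind of rounding error the lecture describes (not an example from the lecture itself): binary floating point cannot represent the decimal fraction 0.1 exactly, so repeated additions drift away from the true answer.

```python
from decimal import Decimal

# Ten additions of 0.1 should give exactly 1.0 -- but in IEEE-754
# binary floating point, 0.1 is stored as a nearby approximation,
# and the tiny errors accumulate.
total = sum(0.1 for _ in range(10))
print(total == 1.0)   # False
print(total)          # 0.9999999999999999

# Decimal arithmetic, which stores digits in base ten, gets it right.
exact = sum(Decimal("0.1") for _ in range(10))
print(exact == 1)     # True
```

This is why financial software typically uses decimal or integer (whole-penny) arithmetic rather than binary floating point.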
Proof by Computer and Proof by Human – Monday 15th April 2013, 6pm
The idea of a proof as a simple, easily-checked method of establishing truth has undergone modification in the age of computers. But the specialisation of the mathematical world has resulted in difficulties with even entirely human-made proofs. Many major recent results of mathematics have proofs so specialised that there are very few people in the world who can understand them, while some proofs depend on computers to do calculations no human could perform. Where does pure mathematics stand in the digital age?