I have made very clear, many times, the problems that I have with the Newton Papers, but that does not diminish how awesome it is that Cambridge has made them available to the public.

Great work Cambridge!

“If I have seen farther it is because I have stood on the shoulders of giants” is a line Newton really did use, in a 1675 letter to Robert Hooke, though the metaphor itself long predates him, going back at least to the twelfth-century philosopher Bernard of Chartres. Sir Isaac Newton was one of the two amazing minds that developed the calculus around the turn of the 18th century, and whoever first coined the phrase, it is entirely true of his work. Without the work of the mathematicians of the previous two centuries, Newton and Leibniz would not have been in a position to develop the calculus, an area of mathematics which can easily be argued to be the true engine behind the industrial revolution.

Mathematics, in fact all of science, is not done in a vacuum. All the work that is done today builds off of the work of generations of thinkers, inventors, and researchers. When mathematicians decide to shuck off the yoke of previous work, they find themselves staring up from a hole that might as well go all the way to the center of the earth. One of the classic examples of this is the Principia Mathematica of Bertrand Russell and Alfred North Whitehead, in which they decided to tear down mathematics and build it back up from a foundation resting only upon the most elementary logic. The Principia was never finished; in fact, not two decades later Gödel proved that such a treatment of the subject could never be completed, and Russell and Whitehead needed the first 362 pages just to reach a proof of the statement 1+1=2. Of course, mathematicians can also find themselves indebted to someone well outside of their field. A wonderful example of this is the work of Duncan Watts and Steven Strogatz on small-world networks, which all started because Watts remembered his father once telling him that everyone is separated by only six handshakes, an idea popularized in 1929 by Frigyes Karinthy in his short story Chains.

This debt to what has come before is treated very differently by different mathematicians, and while few would completely disavow what they owe, none take it more seriously than Grigori Perelman. A Russian mathematician, Perelman gained international renown in November 2002 for his proof of the Poincaré conjecture, one of the Clay Mathematics Institute’s Millennium Problems, each of which carries a million dollar bounty, and a problem that had stood unsolved since 1904. This work was so important and groundbreaking that in 2006 it earned Perelman the highest award a mathematician can receive, a Fields Medal. It was an award that, with a mind at least partially on the debt to the mathematical community that comes with accepting such a high honor, Perelman declined, saying that he did not want to be displayed like a zoo animal. This was only the first award that Perelman turned down; he also refused the million dollar Millennium Prize bounty, and this time his refusal was clearly about debt. Perelman’s proof of the Poincaré conjecture came from following what is known as the Hamilton program, essentially a plan for producing a solution to the conjecture developed by the American mathematician Richard Hamilton. Perelman thought it unjust that he alone was getting all of the accolades, and prizes, for the work while Richard Hamilton, a man to whom Perelman clearly feels a great debt, was languishing on the sidelines. There is hope that Perelman’s feelings about the unjust nature of the mathematical community, which have caused him to withdraw from it, may yet be assuaged, as Richard Hamilton was a co-winner of the Shaw Prize for the work he did towards the proof.

While Perelman’s empathy towards those whose work he used as a stepping stone to new results is unusual, it is illustrative of just how important previous work is to those who really notice it.

Not too long ago I took a trip to Nottingham, England to visit my Math/Maths co-host Peter Rowlett. While I was there Peter took me on a tour of his city, hitting all of the mathematical and computing history hot spots. So, of course, we made videos of the tour.

Early this past week Professor Bhatnagar brought up the idea of mathematical funding, specifically how each of us would choose to fund mathematics if we were the government. The government of the United States of America currently funds mathematics through two main channels, the National Science Foundation and the National Security Agency, along with many other side channels such as the Department of Defense and the Department of Energy. The National Science Foundation alone represents around 65% of the governmental funding for research in mathematics, and its most recent budget request asks for $7.4 billion in total funding, including a 5% increase for mathematical research, a 16% increase in graduate fellowship money, and many other cyberlearning and outreach programs that will directly impact mathematics.

This resonated with me, as I spent my weekend in the seat of the United States of America’s federal power, Washington, DC. I was there to participate in the annual Free Culture Conference put on by Students for Free Culture, http://freeculture.org. This conference is, in the words of its creators: “A convening of the international free culture community for two days of networking, learning and acting. The vision is to bring together student activists and free culture luminaries to discuss free software and open standards, open access scholarship, open educational resources, network neutrality, and university patent policy, especially in the context of higher education.” The conference itself was a shot in the arm for me in particular, as it has pushed me towards really starting work on some projects that I have had on the back burner for a long time.

The conference, while concentrating a lot on education, spent a decent amount of time on politics, a subject to which I have allocated only a minimal amount of interest since I joined up and became one of the few, the proud, the graduate students. It required that I open my mind, stop thinking like a mathematician, i.e. in a closed logical fashion where the strongest formal argument is obviously the correct one, and start cogitating the way that normal people, and more specifically politicians, do on a daily basis. It was not the easiest thing for me to do; I would listen to some of the panelists talking about Net Neutrality or Open Educational Resources and immediately wonder why everyone does not just do things the way that one of the panelists had put forth, because it was obviously the best way. As a mathematician I often forget that most people do not think about their work in such a cut-and-dried way. One thing that became clear to me at this conference was that if I were the government I would spend as much money as I possibly could to make mathematics more open.

We started the week off talking about paradigm shifts. Paradigm shift, according to Wikipedia, is a term first used by Thomas Kuhn in his 1962 book The Structure of Scientific Revolutions to characterize a foundational transformation in the dominant theory of a science. Since its introduction the phrase has had to go through a shift of its own to arrive at its current meaning of an over-arching change in any area of human endeavor. In class quite a few paradigm shifts were discussed, from the calculus of Leibniz and Newton to the training regimens of the Williams sisters, but there is one that I feel is just as crucial: the invention of the theory of computation by Alan Turing.

One interesting way to look at the idea of a paradigm shift is through the lens of the Singularity that Vernor Vinge was kind enough to give us. The term singularity is one that mathematicians are very comfortable with, as it refers to a point where some mathematical object happens to be undefined. Say we are talking about a function that is not defined at zero; then the function tends to behave oddly near this singularity. The same can be said for matter near black holes, which are called gravitational singularities. It was from these ideas that Vinge came up with the term Singularity, in this case referring to some point in the future when technology will stop improving at a merely algebraic rate and start progressing at essentially infinite speed; usually this refers to AI or self-replicating machines. The reason I feel this lens could be useful for looking at paradigm shifts comes from something the author Cory Doctorow once said (the Singularity does have its own sub-genre of science fiction literature): that for all essential purposes, the Singularity is the point after which human beings raised under the conditions caused by the Singularity are incapable of meaningful communication with those born before the event. He went on to posit that human beings have already gone through multiple Singularities, the greatest of which would be the invention of spoken language. After that event it is quite clear that meaningful communication between those who have spoken language and those who do not would be, for any practical purposes, impossible.
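To make the mathematical sense of the word concrete, here is a minimal sketch of my own (not drawn from Vinge or Doctorow) using the function f(x) = 1/x, which is undefined at zero and blows up as its input approaches that singularity:

```python
# f(x) = 1/x has a singularity at x = 0: the function is simply not
# defined there, and its values grow without bound as x approaches 0.
def f(x):
    return 1 / x

# Sampling points ever closer to the singularity from the right shows
# the "odd behavior" near it: the outputs explode toward infinity.
for x in [1.0, 0.1, 0.01, 0.001]:
    print(f"f({x}) = {f(x)}")
```

Evaluating f at zero itself raises an error, which is exactly what "not defined at the singularity" means in computational terms.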

While I do not believe that any of these paradigm shifts qualify as full Singularity events, I do appreciate the problems that those who learned mathematics after we had the calculus would have communicating the mathematics of, say, a thrown projectile to those who came before the calculus was known. It is in this way that I wish to discuss Alan Turing and the beginning of computing.