Merge branch 'mumford'

Dimitri Lozeve 2022-03-24 22:09:05 +01:00
commit 97bfc57b86
2 changed files with 209 additions and 0 deletions


@@ -615,3 +615,56 @@
url = {https://doi.org/10.1080/09537289208919407},
DATE_ADDED = {Thu Mar 18 13:53:35 2021},
}
@article{mumford2000_dawnin_age_stoch,
  author    = {Mumford, David},
  title     = {The Dawning of the Age of Stochasticity},
  journal   = {Atti della Accademia Nazionale dei Lincei. Classe di
               Scienze Fisiche, Matematiche e Naturali. Rendiconti
               Lincei. Matematica e Applicazioni},
  volume    = {11},
  pages     = {107--125},
  year      = {2000},
  month     = {12},
  publisher = {Accademia Nazionale dei Lincei},
  url       = {http://eudml.org/doc/289648},
  issn      = {1120-6330; 1720-0768/e},
  msc2010   = {00A69 01A99 60A05 62A01},
  zbl       = {1149.00309},
}
@book{davis2012_mathem_exper_study_edition,
  author    = {Davis, Philip J. and Hersh, Reuben and Marchisotto,
               Elena Anne},
  title     = {The Mathematical Experience, Study Edition},
  year      = {2012},
  publisher = {Birkh{\"a}user Boston},
  series    = {Modern Birkh{\"a}user Classics},
  pages     = {500},
  isbn      = {978-0-8176-8294-1},
  doi       = {10.1007/978-0-8176-8295-8},
  url       = {https://doi.org/10.1007/978-0-8176-8295-8},
}
@book{hacking2006_emerg_probab,
  author    = {Hacking, Ian},
  title     = {The Emergence of Probability: A Philosophical Study of
               Early Ideas about Probability, Induction and
               Statistical Inference},
  year      = {2006},
  edition   = {2},
  publisher = {Cambridge University Press},
  address   = {Cambridge},
  doi       = {10.1017/CBO9780511817557},
  url       = {https://doi.org/10.1017/CBO9780511817557},
}
@book{alon2016_probab_method,
  author    = {Alon, Noga and Spencer, Joel H.},
  title     = {The Probabilistic Method},
  year      = {2016},
  edition   = {4},
  publisher = {Wiley},
  isbn      = {9781119061953},
}


@@ -0,0 +1,156 @@
---
title: "The Dawning of the Age of Stochasticity"
date: 2022-03-24
tags: maths, foundations, paper, statistics, probability
toc: false
---
#+begin_quote
Mumford, David. 2000. “The Dawning of the Age of Stochasticity.”
/Atti Della Accademia Nazionale Dei Lincei. Classe Di Scienze Fisiche,
Matematiche E Naturali. Rendiconti Lincei. Matematica E Applicazioni/
11: 107–25. http://eudml.org/doc/289648.
#+end_quote
This article [cite:@mumford2000_dawnin_age_stoch] is an interesting
call for new foundations of mathematics based on probability and
statistics. It argues that logic has had its time, and that we should
now make the random variable a first-class concept, as it would
provide better foundations.
* The taxonomy of mathematics
[fn::{-} This is probably the best definition of mathematics I have
seen. Before that, the most satisfying definition was "mathematics is
what mathematicians do". It also raises an interesting question: what
would the study of /non-reproducible/ mental objects be?]
#+begin_quote
The study of mental objects with reproducible properties is called mathematics.
[cite:@davis2012_mathem_exper_study_edition]
#+end_quote
What are the categories of reproducible mental objects? Mumford
considers the principal sub-fields of mathematics (geometry, analysis,
algebra, logic) and argues that they are indeed rooted in common
mental phenomena.
Of these, logic, with its notion of a proposition carrying an
absolute truth value, was made the foundation of all the others.
Mumford's argument is that instead the random variable is (or should
be) the "paradigmatic mental object", on which all others can be
based. People are constantly weighing likelihoods, evaluating
plausibility, and sampling from posterior distributions to refine
estimates.
As such, random variables are rooted in our inspection of our own
mental processes, in the self-conscious analysis of our minds. Compare
to areas of mathematics arising from our experience with the physical
world, through our perception of space (geometry), of forces and
accelerations (analysis), or through composition of actions (algebra).
The paper then proceeds with a quick historical overview of the
principal notions of probability, largely mirroring the detailed
historical perspective of [cite:@hacking2006_emerg_probab]. There is
also a short summary of the work on the foundations of mathematics.
Mumford also claims that although probability saw many advances and
applications (Galton; Gibbs in statistical physics; Keynes in
economics; Wiener in control theory; Shannon in information theory),
the most influential statisticians (R. A. Fisher chief among them)
insisted on keeping the scope of statistics limited to empirical
data: the so-called "frequentist" school. (This is a view of the
frequentist vs Bayesian debate that I hadn't seen before. The
Bayesian school can be seen as the one claiming that statistical
inference applies more widely, even to complex real-life situations
and thought processes. From this point of view, the emergence of the
probabilistic method in various areas of science would be the
strongest argument in favour of Bayesianism.)
* What is a "random variable"?
Random variables are difficult to define. They are the core concept
of any course in probability or statistics, but their full, rigorous
definition relies on advanced measure theory, often unapproachable to
beginners. Nevertheless, practitioners tend to be productive with
basic introductions to probability and statistics, even without being
able to state the explicit definition.
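For reference, the full definition is quick to state but presupposes
the entire measure-theoretic apparatus (this is the standard textbook
definition, not something specific to the paper): a random variable
is a measurable function from a probability space to the reals,

\[
X : (\Omega, \mathcal{F}, P) \to (\mathbb{R}, \mathcal{B}(\mathbb{R})),
\qquad X^{-1}(B) \in \mathcal{F} \text{ for all } B \in \mathcal{B}(\mathbb{R}),
\]

and its law is the pushforward measure \(P_X(B) = P(X^{-1}(B))\).
Every symbol in this line (σ-algebra, Borel sets, measure) sits
several layers of abstraction above the intuition it is meant to
capture.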
Here, Mumford discusses the various definitions we can give of the
notion of random variable, from both an intuitive and a formal point
of view. The conclusion is essentially that a random variable is a
complex entity that does not admit a fully satisfying definition,
except from a purely formal and axiomatic point of view.
This situation is very similar to that of the notion of "set".
Everybody can manipulate sets on an intuitive level and grasp their
basic properties, but the precise axioms are subtle, and no
definition is fully satisfying, as the debates on the foundations of
mathematics can attest.
* Putting random variables into the foundations
The usual way of defining random variables is:
1. predicate logic,
2. sets,
3. natural numbers,
4. real numbers,
5. measures,
6. random variables.
Instead, we could put random variables at the foundations, and define
everything else in terms of that.
There is no complete formulation of such a foundation, nor is it
clear that one is possible. However, to make his case, Mumford
presents two developments. One is due to [[https://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes][E. T. Jaynes]], who built a
complete formalism of Bayesian probability on a primitive notion of
"plausibility". With a few axioms, one obtains an isomorphism between
an intuitive notion of plausibility and a true probability function.
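The paper only summarises this construction. As I understand it
(Jaynes builds on R. T. Cox's theorem; this paraphrase is mine, not
Mumford's), one posits that the plausibility of \(A\) given \(C\) is
a real number \(\operatorname{pl}(A \mid C)\) that combines
consistently with Boolean logic,

\[
\operatorname{pl}(A \wedge B \mid C) = F\big(\operatorname{pl}(A \mid C), \operatorname{pl}(B \mid A \wedge C)\big),
\qquad
\operatorname{pl}(\neg A \mid C) = S\big(\operatorname{pl}(A \mid C)\big),
\]

for some fixed functions \(F\) and \(S\). Under mild regularity
assumptions, there then exists a monotone rescaling \(p = g \circ \operatorname{pl}\)
obeying the usual product and sum rules,

\[
p(A \wedge B \mid C) = p(A \mid C)\, p(B \mid A \wedge C),
\qquad
p(A \mid C) + p(\neg A \mid C) = 1,
\]

that is, any such plausibility measure is isomorphic to a
probability.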
The other example is a proof, due to Christopher Freiling, that the
continuum hypothesis is false, using a probabilistic argument. The
proof starts from a notion of random variable that is incompatible
with the usual measure-theoretic definition. Nevertheless, it leads
Mumford to ask whether a foundation of mathematics based on such a
notion could rid us of "one of the meaningless conundrums of set
theory".
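The argument is worth sketching (this is Freiling's "axiom of
symmetry" as it is usually presented, reconstructed from memory
rather than taken from the paper). Throw two independent uniform
darts \(x\) and \(y\) at \([0,1]\), and let \(f\) assign to every
point a countable set of reals. Since \(f(x)\) is countable, it has
measure zero, so the second dart avoids it almost surely; by
symmetry, the first dart should likewise avoid \(f(y)\). This
motivates the axiom

\[
\forall f : [0,1] \to [0,1]^{\leq \aleph_0}, \quad
\exists x, y : \; y \notin f(x) \ \text{and} \ x \notin f(y).
\]

Under the continuum hypothesis, however, \([0,1]\) can be
well-ordered with order type \(\omega_1\); taking \(f(x) = \{y : y \preceq x\}\)
gives a countable set for every \(x\), yet every pair satisfies
\(x \in f(y)\) or \(y \in f(x)\), so the axiom fails. Accepting the
probabilistic intuition therefore means rejecting CH.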
* Stochastic methods have invaded classical mathematics
This is probably the most convincing argument for giving a greater
role to probability and statistical methods in the foundations of
mathematics: they turn up everywhere, and are extremely productive. A
prime example is graph theory, where the "probabilistic method" has
had a deep impact, thanks notably to Erdős. (See
[cite:@alon2016_probab_method] and [[https://www.college-de-france.fr/site/timothy-gowers/index.htm][Timothy Gowers' lessons at the
Collège de France]][fn::In French, but see also [[https://www.youtube.com/c/TimothyGowers0][his YouTube channel]].]
on the probabilistic method for combinatorics and number theory.)
Probabilistic methods are also hugely important in the analysis of
differential equations, chaos theory, and mathematical physics in
general.
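To give a taste of the method, here is the classic Erdős lower bound
on the diagonal Ramsey numbers (a textbook example, not one taken
from Mumford's paper). Colour every edge of the complete graph
\(K_n\) red or blue independently with probability \(1/2\). A fixed
set of \(k\) vertices is monochromatic with probability
\(2^{1 - \binom{k}{2}}\), so by the union bound

\[
P(\text{some } k\text{-set is monochromatic})
\leq \binom{n}{k}\, 2^{1 - \binom{k}{2}},
\]

and whenever this quantity is smaller than 1, some colouring of
\(K_n\) contains no monochromatic \(K_k\), hence \(R(k,k) > n\).
Taking \(n = \lfloor 2^{k/2} \rfloor\) makes the bound smaller than 1
for \(k \geq 3\), so \(R(k,k) > 2^{k/2}\): a purely existential
conclusion, obtained without ever exhibiting a colouring.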
* Thinking as Bayesian inference
I think this is not very controversial in cognitive science: we do
not think by composing propositions into syllogisms, but rather by
inferring the probabilities of statements being true. Mumford
illustrates this very well with an example from Judea Pearl, which
uses graphical models to represent thought processes. There is also a
link with formal definitions of induction, such as PAC learning,
which is prominent in machine learning.
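To make this concrete, here is the smallest instance of the kind of
computation a graphical model encodes (the numbers are invented for
illustration, in the spirit of Pearl's burglar-alarm example). If a
burglary \(B\) has prior probability \(P(B) = 0.001\), and the alarm
\(A\) rings with probability 0.95 during a burglary but 0.01
otherwise, hearing the alarm updates our belief by Bayes' rule:

\[
P(B \mid A)
= \frac{P(A \mid B)\, P(B)}{P(A \mid B)\, P(B) + P(A \mid \neg B)\, P(\neg B)}
= \frac{0.95 \times 0.001}{0.95 \times 0.001 + 0.01 \times 0.999}
\approx 0.087.
\]

The conclusion is never a categorical "true" or "false", only a
plausibility revised from 0.1% to about 9%, which is exactly the mode
of reasoning Mumford wants to put at the centre.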
I'll conclude this post by directly quoting the last paragraph of the
article:
#+begin_quote
My overall conclusion is that I believe stochastic methods will
transform pure and applied mathematics in the beginning of the third
millennium. Probability and statistics will come to be viewed as the
natural tools to use in mathematical as well as scientific modeling.
The intellectual world as a whole will come to view logic as a
beautiful elegant idealization but to view statistics as the standard
way in which we reason and think.
#+end_quote
* References