The standard Bayesian story goes something like this: probabilities represent a rational agent's degrees of belief. When the agent learns something new she conditions on it, meaning that she updates her probabilities according to Bayes' rule. Importantly, the interpretation of the probability function is that it represents the agent's degrees of belief about how likely… Continue reading Paper Review: New Semantics for Bayesian Inference: The Interpretive Problem
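The updating rule described above can be made concrete in a few lines. This is a minimal illustrative sketch, not anything from the paper under review: the worlds, the uniform prior, and the coin-toss evidence are all hypothetical, chosen just to show conditionalization as "zero out the worlds ruled out by the evidence, then renormalize."

```python
# Illustrative sketch (not from the paper): conditionalization as
# renormalizing a prior over the worlds consistent with the evidence.
# The worlds and probabilities here are hypothetical.

def conditionalize(prior, evidence):
    """Return P(. | evidence) by Bayes' rule: zero out worlds that
    conflict with the evidence, then renormalize the rest."""
    total = sum(p for world, p in prior.items() if evidence(world))
    return {world: (p / total if evidence(world) else 0.0)
            for world, p in prior.items()}

# A toy agent: uniform prior over the outcomes of two coin tosses.
prior = {("H", "H"): 0.25, ("H", "T"): 0.25,
         ("T", "H"): 0.25, ("T", "T"): 0.25}

# The agent learns that the first toss landed heads.
posterior = conditionalize(prior, lambda w: w[0] == "H")
# posterior[("H", "H")] == 0.5 and posterior[("T", "T")] == 0.0
```

The interpretive question the paper raises is exactly about what the numbers in `prior` and `posterior` represent; the mechanics of the update itself are as simple as shown here.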

# Tag: formal epistemology

## Paper Review: Nonconglomerability for Countably Additive Measures that are not κ-additive

Probability plays a central role in this blog---many of my posts focus on where probability makes contact with philosophy and physics. However, there is also, of course, the mathematical theory of probability. The mathematics and the philosophy interact in many ways; technical results in the mathematics can often be important for our work as philosophers. For… Continue reading Paper Review: Nonconglomerability for Countably Additive Measures that are not κ-additive

## Paper Review: Knowledge

A number of my posts have looked at the tight connection between rationality and probability. One of the pioneers of this kind of work was Frank Plumpton Ramsey, who made major contributions to mathematics, economics, psychology, and philosophy before passing away at the young age of 26. Ramsey was also a friend of the influential… Continue reading Paper Review: Knowledge

## Paper Review: Why Conditionalize?

How should we change our beliefs in the light of new information? This is one of the central questions of epistemology, and has great practical importance. For example, consider a doctor who has a patient who is concerned he might have cancer. The doctor has certain beliefs: for example, she may think that her patient… Continue reading Paper Review: Why Conditionalize?
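A scenario like the doctor's can be worked through numerically with Bayes' rule. The numbers below are entirely hypothetical, chosen only to illustrate how a base rate and test characteristics combine into a posterior belief:

```python
# Hypothetical numbers, for illustration only: a test with 90% sensitivity
# and 95% specificity, applied to a patient from a population where the
# base rate of the cancer is 1%.

def posterior_given_positive(base_rate, sensitivity, specificity):
    """P(cancer | positive test) by Bayes' rule."""
    # Total probability of a positive result: true positives + false positives.
    p_positive = base_rate * sensitivity + (1 - base_rate) * (1 - specificity)
    return base_rate * sensitivity / p_positive

p = posterior_given_positive(0.01, 0.90, 0.95)
# p is roughly 0.15: even after a positive test, the low base rate keeps
# the posterior probability of cancer fairly small.
```

The paper's question is not how to perform this calculation but why conditionalizing in this way is the rational thing to do.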

## Paper Review: No Free Lunch Theorem, Inductive Skepticism, and the Optimality of Meta-induction

The infamous no free lunch theorem (NFL theorem) asserts that all computable prediction methods have equal expected success when averaged over all possible prediction tasks. Computer scientists, and occasionally philosophers, often describe this result as a computer-science cousin of Hume's problem of induction. Given this theorem, one might think that trying to design a better prediction algorithm for general prediction tasks is pointless:… Continue reading Paper Review: No Free Lunch Theorem, Inductive Skepticism, and the Optimality of Meta-induction
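The averaging at the heart of the NFL theorem can be demonstrated in miniature. This is a toy sketch, not the theorem's general statement: the two predictors below are hypothetical, and the "tasks" are just all binary sequences of a fixed length, weighted uniformly.

```python
from itertools import product

def expected_accuracy(predictor, n=4):
    """Average a deterministic predictor's accuracy over all 2**n binary
    sequences, weighted uniformly (the NFL-style averaging)."""
    total = 0.0
    for seq in product([0, 1], repeat=n):
        # At each step the predictor sees the history and guesses the next bit.
        hits = sum(predictor(seq[:t]) == seq[t] for t in range(n))
        total += hits / n
    return total / 2 ** n

# Two hypothetical prediction methods.
always_zero = lambda history: 0
copy_last = lambda history: history[-1] if history else 0

# Both average exactly 0.5: over all sequences, neither method outperforms
# the other, mirroring the NFL result in this tiny setting.
```

The paper's interest is in what follows from this: whether the theorem really licenses inductive skepticism, and how meta-induction escapes it.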