Abstract
Occam's razor—the idea that, all else being equal, we should pick the simpler hypothesis—plays a prominent role in ordinary and scientific inference. But why are simpler hypotheses better? One attractive answer, known as Bayesian Occam's razor (BOR), is that more complex hypotheses tend to be more flexible—they can accommodate a wider range of possible data—and that this flexibility is automatically penalized by Bayesian inference. In two experiments, we provide evidence that people's intuitive probabilistic and explanatory judgments follow the prescriptions of BOR. In particular, people's judgments are consistent with the two most distinctive characteristics of BOR: They penalize hypotheses as a function not only of their number of free parameters but also of the size of their parameter space, and they penalize those hypotheses even when their parameters can be "tuned" to fit the data better than comparatively simpler hypotheses.
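To make the penalization mechanism concrete, here is a minimal sketch (not from the paper; the uniform hypotheses, interval widths, and observation value are illustrative assumptions) of how Bayesian inference automatically disfavours a more flexible hypothesis: because each hypothesis must spread a total probability of 1 over the data it can accommodate, a hypothesis with a larger space of possible outcomes assigns less probability to any particular observation it shares with a narrower rival.

```python
# Illustrative sketch (not from the paper): two hypotheses about where a
# single data point x will land.
# H_simple  predicts x uniformly over [0, 2]  (narrow, inflexible).
# H_complex predicts x uniformly over [0, 10] (broad, flexible).

def likelihood_uniform(x, low, high):
    """Density of observation x under a uniform hypothesis on [low, high]."""
    return 1.0 / (high - low) if low <= x <= high else 0.0

x_observed = 1.3  # a hypothetical observation both hypotheses can accommodate

p_simple = likelihood_uniform(x_observed, 0.0, 2.0)    # 0.5
p_complex = likelihood_uniform(x_observed, 0.0, 10.0)  # 0.1

# With equal prior probabilities, the posterior odds equal the likelihood
# ratio, so the data favour the simpler hypothesis even though the complex
# hypothesis also "fits" the observation.
bayes_factor = p_simple / p_complex  # 5.0 in favour of H_simple
print(f"Bayes factor (simple vs. complex): {bayes_factor:.1f}")
```

Widening the flexible hypothesis's interval (its parameter space) further dilutes the probability it assigns to the observed value, so the penalty grows with the size of the parameter space, which is the intuition the abstract attributes to BOR.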