In this week’s episode, Greg and Patrick talk about information theory: what it is, where it comes from, how it works, and how it can be used to make comparative model inferences. Along the way we also mention Pennsylvania 6-5000, the time lady, the Nobel Prize for Awesomeness, juggling and unicycles, Enigma, imaginary friends, lemon juice code, red giants and white dwarfs, bits, a level-11 paladin, Hungarian Forrest Gump, snake eyes and boxcar Willies, the Reaper Divergence Criterion, and getting inspiration on a train.
Lightly-Edited Episode Transcript
We provide a lightly-edited (and admittedly imperfect) transcript of the episode, available here. It is not an exact representation of the audio, but it does provide a searchable document with identified speakers and associated time stamps.
Additional Show Notes
Anderson, D., & Burnham, K. (2004). Model selection and multi-model inference (2nd ed.). New York, NY: Springer-Verlag.
Burnham, K. P., & Anderson, D. R. (2004). Multimodel inference: Understanding AIC and BIC in model selection. Sociological Methods & Research, 33(2), 261-304.
Burnham, K. P., & Anderson, D. R. (1998). Practical use of the information-theoretic approach. In Model selection and inference (pp. 75-117). New York, NY: Springer.
Mézard, M., & Montanari, A. (2009). Information, Physics, and Computation. https://web.stanford.edu/~montanar/RESEARCH/book.html
Preacher, K. J. (2006). Quantifying parsimony in structural equation modeling. Multivariate Behavioral Research, 41(3), 227-259.
Preacher, K. J., Cai, L., & MacCallum, R. C. (2007). Alternatives to traditional model comparison strategies for covariance structure models. Modeling contextual effects in longitudinal studies, 8, 33-62.
Shannon, C.E. (1948). A mathematical theory of communication. The Bell System Technical Journal, 27, 379-423. https://people.math.harvard.edu/~ctm/home/text/others/shannon/entropy/entropy.pdf
Stone, J.V. (2019). Information Theory: A Tutorial Introduction. https://arxiv.org/pdf/1802.05968.pdf