August 2024 Digest

Jul 31, 2024 2:39 pm

Consulting and Training


If you are interested in consulting or training engagements, or even in commissioning a presentation on a topic of interest, then don’t hesitate to reach out to me at inquiries@symplectomorphic.com.


Recent Presentation


Interested in using your domain expertise to inform principled model development? I had the opportunity to present on narratively generative modeling for the Bayes Plurinacional Virtual Seminar Series, including a two-hour lecture and a one-hour review of a comprehensive example.


All of the material is now freely available.


Recording: https://www.youtube.com/watch?v=92oSUaZggKs

Exercise (HTML): https://betanalpha.github.io/assets/chapters_html/customer_conversion.html

Exercise (PDF): https://betanalpha.github.io/assets/chapters_pdf/customer_conversion.pdf


Support Me on Patreon


If you would like to support my writing then consider becoming a patron, https://www.patreon.com/betanalpha. Right now covector+ supporters have early access to my upcoming die fairness case study as well as about seven hours of livestream review of the case study (it’s a long piece).


Probabilistic Modeling Discord


I’ve recently started a Discord server dedicated to discussion about (narratively) generative modeling of all kinds, https://discord.gg/W2QVJaV6.


Recent Rants


On The Geometry of the Singular Value Decomposition


Fun fact that is often implied but rarely stated explicitly:


If X is a rank-K linear transformation from an N-dimensional input space to an M-dimensional output space, represented by an M x N matrix, then the reduced singular value decomposition X = U S V^T can be interpreted as...


 V^T: a linear projection from the N-dimensional input space to a K-dimensional subspace,


 S: a linear transformation of the K-dimensional subspace into itself,


 U: a linear transformation from the K-dimensional subspace up to the M-dimensional output space.


In other words, the singular value decomposition provides an explicit construction for mapping to and from the linearly constrained subspace implied by a rank-deficient linear transformation.
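

For concreteness, here is a minimal NumPy sketch of these shapes; the particular dimensions, the random construction, and the rank K are purely illustrative choices, not part of the statement above.

    # Illustrative shapes in the reduced singular value decomposition
    # of a rank-K, M x N matrix.
    import numpy as np

    N, M, K = 5, 4, 2  # input dimension, output dimension, rank

    # Construct a rank-K, M x N matrix as a product of random factors.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(M, K)) @ rng.normal(size=(K, N))

    # full_matrices=False gives the reduced decomposition X = U @ diag(s) @ Vt.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)

    # Keep only the K non-negligible singular values.
    U, s, Vt = U[:, :K], s[:K], Vt[:K, :]

    print(Vt.shape)  # (K, N): projects the input space down to the K-dimensional subspace
    print(s.shape)   # (K,):   scales the K-dimensional subspace into itself
    print(U.shape)   # (M, K): embeds the K-dimensional subspace into the output space

    # The truncated factors reconstruct X exactly (up to floating point error).
    assert np.allclose(X, U @ np.diag(s) @ Vt)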



On Updating Beliefs


Updating your beliefs doesn’t make you Bayesian.  


Post-hoc rationalizing the existence of a heuristic and implicit likelihood function because you changed your mind doesn’t make you Bayesian.


If you want to be Bayesian then you need to derive your likelihoods, and your belief updates, from a consistent observational model, one that actually forces you to confront your assumptions.
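

In symbols this is just Bayes’ theorem with the likelihood inherited from an explicit observational model (a generic statement, not tied to any particular application):

    \pi(\theta \mid \tilde{y}) \propto \pi(\tilde{y} \mid \theta) \, \pi(\theta),

where \pi(\tilde{y} \mid \theta) is the observational model evaluated at the observed data \tilde{y}, so that the likelihood function, and hence the belief update, is fixed by the modeling assumptions rather than rationalized after the fact.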

