2008-08-17 19:19
Please read this week: Hermann von Helmholtz, 1896. Concerning the perceptions in general. (Translated from German, 1925; reprinted with foreword, 2001)
The examples of generative models I showed in class are:
- The Monty Hall problem as two interacting processes (including solving a contestant process to yield a cheating host): in Scheme and in Haskell. (A rough sketch of the forward direction appears after this list.)
- Generative models of English: uni/bi/tri/quad-gram models based on AP news (not online, sorry) and a trivial probabilistic context-free grammar in Scheme.
- Using a generative model of Boolean formulas (circuits) to learn from examples (based on code from last semester).
- SCIgen, an automatic CS paper generator.
- Context Free Design Grammar for pictures.
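For reference, here is a rough Haskell sketch of the forward (generative) direction of the Monty Hall model. This is not the code from class: it models only the standard honest host (the cheating-host inference is not sketched), and `Dist` is a toy monad of weighted outcomes.

```haskell
import Data.List (delete)

-- A toy probability monad: a distribution is a list of weighted outcomes.
newtype Dist a = Dist { runDist :: [(a, Double)] }

instance Functor Dist where
  fmap f (Dist xs) = Dist [(f x, p) | (x, p) <- xs]

instance Applicative Dist where
  pure x = Dist [(x, 1)]
  Dist fs <*> Dist xs = Dist [(f x, p * q) | (f, p) <- fs, (x, q) <- xs]

instance Monad Dist where
  Dist xs >>= k = Dist [(y, p * q) | (x, p) <- xs, (y, q) <- runDist (k x)]

uniform :: [a] -> Dist a
uniform xs = Dist [(x, 1 / fromIntegral (length xs)) | x <- xs]

doors :: [Int]
doors = [1, 2, 3]

-- Contestant picks a door; the host opens another door hiding no prize;
-- the contestant switches. Does the switch win the prize?
switchWins :: Dist Bool
switchWins = do
  prize  <- uniform doors
  pick   <- uniform doors
  opened <- uniform (delete prize (delete pick doors))
  let final = head (delete opened (delete pick doors))
  return (final == prize)

-- Total probability of an event under a distribution.
probOf :: (a -> Bool) -> Dist a -> Double
probOf holds (Dist xs) = sum [p | (x, p) <- xs, holds x]
-- probOf id switchWins evaluates to 2/3.
```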
My screw-up this week was to confuse two ways to combine uncertain information.
- On one hand, if I make two independent observations of the same table’s location, then I can combine them to be more sure.
- On the other hand, if I observed once where the table was and once how far I moved, and these two random variables are independent, then I can combine them to track where the table is now. However, I would be less sure about the table’s current location than about its previous location, because the uncertainties from my two observations accumulate.
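To make the contrast concrete, here is a minimal Haskell sketch, assuming each belief is a normal distribution carried around as just its mean and variance (the names `fuse` and `shift` are mine, not from class):

```haskell
-- A belief about a scalar quantity: a normal distribution,
-- represented by its mean and variance.
data Gaussian = Gaussian { mean :: Double, var :: Double }
  deriving Show

-- First way: two independent observations of the same quantity.
-- Precisions (inverse variances) add, so the result is more certain
-- (smaller variance) than either observation alone.
fuse :: Gaussian -> Gaussian -> Gaussian
fuse (Gaussian m1 v1) (Gaussian m2 v2) =
  Gaussian ((m1 * v2 + m2 * v1) / (v1 + v2)) (v1 * v2 / (v1 + v2))

-- Second way: a previous location plus an independent displacement.
-- Means add and variances add, so the result is less certain
-- (larger variance) than the previous location alone.
shift :: Gaussian -> Gaussian -> Gaussian
shift (Gaussian m1 v1) (Gaussian m2 v2) = Gaussian (m1 + m2) (v1 + v2)
```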
I’ll explain all this more (correctly) next week. For now the main point is that combining multiple normal distributions (in either way above) simply yields another normal distribution, so an agent does not need more and more space as it gathers successive observations and updates its belief.
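Concretely, continuing the sketch above: no matter how many observations arrive, the agent’s belief is always just two numbers.

```haskell
-- Fusing any number of observations (at least one) still yields a single
-- Gaussian; the belief never grows beyond a mean and a variance.
belief :: [Gaussian] -> Gaussian
belief = foldr1 fuse

-- For example, belief [Gaussian 10 4, Gaussian 12 4, Gaussian 11 2]
-- is one Gaussian whose variance is smaller than any of the inputs’.
```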