Probabilistic Graphical Model, Section 1.3 - Part 2

2024/04/18 · Reading time: about 9 minutes

The following are my reading notes on the book Probabilistic Graphical Model (Koller, 2009); I will add new content from time to time. This topic belongs to the field of AI (artificial intelligence).

1.3 Overview and Roadmap

1.3.1 Overview of Chapters

Continuing from Part 1, which covered the foundations of Probabilistic Graphical Models, Part 2 focuses on inference, a central topic in AI. The chapter topics are ordered as follows:

  • In chapter 9, we describe the basic ideas underlying exact inference in probabilistic graphical models. We first analyze the fundamental difficulty of the exact inference task, separately from any particular inference algorithm we might develop. We then present two basic algorithms for exact inference — variable elimination and conditioning — both of which are equally applicable to directed and undirected models. Both of these algorithms can be viewed as operating over the graph structure defined by the probabilistic model. They build on basic concepts, such as graph properties and dynamic programming algorithms, to provide efficient solutions to the inference task. We also provide an analysis of their computational cost in terms of the graph structure, and we discuss where exact inference is feasible. (A minimal variable-elimination sketch appears after this list.)


  • In chapter 10, we describe an alternative view of exact inference, leading to a somewhat different algorithm. The benefit of this alternative algorithm is twofold. First, it uses dynamic programming to avoid repeated computations in settings where we wish to answer more than a single query using the same network. Second, it defines a natural algorithm that uses message passing on a graph structure; this algorithm forms the basis for approximate inference algorithms developed in later chapters. Because exact inference is computationally intractable for many models of interest, we then proceed to describe approximate inference algorithms, which trade off accuracy with computational cost. We present two main classes of such algorithms.


  • In chapter 11, we describe a class of methods that can be viewed from two very different perspectives: On one hand, they are direct generalizations of the graph-based message-passing approach developed for the case of exact inference in chapter 10. On the other hand, they can be viewed as solving an optimization problem: one where we approximate the distribution of interest using a simpler representation that allows for feasible inference. The equivalence of these views provides important insights and suggests a broad family of algorithms that one can apply to approximate inference.


  • In chapter 12, we describe a very different class of methods: particle-based methods, which approximate a complex joint distribution by considering samples from it (also known as particles). We describe several methods from this general family. These methods are generally based on core techniques from statistics, such as importance sampling and Markov chain Monte Carlo methods; a short importance-sampling sketch also appears after this list. Once again, the connection to this general class of methods suggests multiple opportunities for new algorithms. While the representation of probabilistic graphical models applies, to a great extent, to models including both discrete and continuous-valued random variables, inference in models involving continuous variables is significantly more challenging than the purely discrete case.


  • In chapter 14, we consider the task of inference in continuous and hybrid (continuous/discrete) networks, and we discuss whether and how the exact and approximate inference methods developed in earlier chapters can be applied in this setting.


  • In chapter 15, we return to the representation discussed in chapter 6, which allows a compact encoding of networks whose size can be unboundedly large. Such networks pose particular challenges to inference algorithms. In this chapter, we discuss some special-purpose methods that have been developed for the particular settings of networks that model dynamical systems.
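
To make the variable-elimination idea from chapter 9 concrete, here is a minimal sketch on a tiny chain network A → B → C, computing P(C) by summing out A and then B. The factor representation, helper functions, and all the numbers are illustrative assumptions for this note, not code from the book.

```python
from itertools import product

def multiply(f1, scope1, f2, scope2):
    # Pointwise product of two factors over binary variables.
    scope = scope1 + [v for v in scope2 if v not in scope1]
    result = {}
    for assignment in product([0, 1], repeat=len(scope)):
        a = dict(zip(scope, assignment))
        v1 = f1[tuple(a[v] for v in scope1)]
        v2 = f2[tuple(a[v] for v in scope2)]
        result[assignment] = v1 * v2
    return result, scope

def sum_out(f, scope, var):
    # Eliminate `var` from factor `f` by summing over its values.
    new_scope = [v for v in scope if v != var]
    result = {}
    for assignment, value in f.items():
        a = dict(zip(scope, assignment))
        key = tuple(a[v] for v in new_scope)
        result[key] = result.get(key, 0.0) + value
    return result, new_scope

# Illustrative CPDs: P(A), P(B|A), P(C|B), all variables binary.
pA = {(0,): 0.6, (1,): 0.4}
pB_given_A = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.8}  # keyed by (a, b)
pC_given_B = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.5, (1, 1): 0.5}  # keyed by (b, c)

# Compute P(C) by eliminating A, then B, along the chain.
f, scope = multiply(pA, ["A"], pB_given_A, ["A", "B"])
f, scope = sum_out(f, scope, "A")                      # factor over B
f, scope = multiply(f, scope, pC_given_B, ["B", "C"])
f, scope = sum_out(f, scope, "B")                      # factor over C
print({c: round(f[(c,)], 3) for c in (0, 1)})          # {0: 0.7, 1: 0.3}
```

The elimination order here simply follows the chain; as chapter 9 discusses, on general graphs the order matters, and the cost of elimination is governed by the size of the intermediate factors it creates.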
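
Chapter 12's particle-based methods build on sampling techniques such as importance sampling. The sketch below estimates the tail probability P(X > 2) under a standard normal by sampling from a shifted proposal and reweighting; the target, the proposal N(3, 1), and the threshold are illustrative choices of mine, not an example from the book.

```python
import math
import random

def normal_pdf(x, mu, sigma):
    # Density of N(mu, sigma^2) at x.
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

random.seed(0)
n = 100_000
total = 0.0
for _ in range(n):
    x = random.gauss(3.0, 1.0)  # sample from the proposal q = N(3, 1)
    w = normal_pdf(x, 0.0, 1.0) / normal_pdf(x, 3.0, 1.0)  # weight p(x) / q(x)
    if x > 2.0:
        total += w  # weighted indicator of the event {X > 2}

print(total / n)  # ~0.0228, close to the exact P(X > 2) under N(0, 1)
```

The proposal is deliberately centered inside the rare-event region, so most samples land where the indicator is nonzero and the weights correct for the mismatch; this weighted-sample view is the basic building block of the particle-based algorithms in chapter 12.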