Probabilistic Graphical Model, Section 1.3 - Part 2


The following are my reading notes on the book Probabilistic Graphical Model (Koller 2009); new content will be added from time to time. This material belongs to the field of AI (artificial intelligence).

1.3 Overview and Roadmap

1.3.1 Overview of Chapters

Continuing from Part 1, which covered the foundations of Probabilistic Graphical Models, Part 2 focuses on the Inference theme within AI. Below is the chapter-by-chapter roadmap:

  • In chapter 9, we describe the basic ideas underlying exact inference in probabilistic graphical models. We first analyze the fundamental difficulty of the exact inference task, separately from any particular inference algorithm we might develop. We then present two basic algorithms for exact inference — variable elimination and conditioning — both of which are equally applicable to both directed and undirected models. Both of these algorithms can be viewed as operating over the graph structure defined by the probabilistic model. They build on basic concepts, such as graph properties and dynamic programming algorithms, to provide efficient solutions to the inference task. We also provide an analysis of their computational cost in terms of the graph structure, and we discuss where exact inference is feasible. A minimal variable elimination sketch appears after this list.


  • In chapter 10, we describe an alternative view of exact inference, leading to a somewhat different algorithm. The benefit of this alternative algorithm is twofold. First, it uses dynamic programming to avoid repeated computations in settings where we wish to answer more than a single query using the same network. Second, it defines a natural algorithm that uses message passing on a graph structure; this algorithm forms the basis for approximate inference algorithms developed in later chapters. Because exact inference is computationally intractable for many models of interest, we then proceed to describe approximate inference algorithms, which trade off accuracy with computational cost. We present two main classes of such algorithms. A simplified message-passing sketch on a chain appears after this list.


  • In chapter 11, we describe a class of methods that can be viewed from two very different perspectives: On one hand, they are direct generalizations of the graph-based message-passing approach developed for the case of exact inference in chapter 10. On the other hand, they can be viewed as solving an optimization problem: one where we approximate the distribution of interest using a simpler representation that allows for feasible inference. The equivalence of these views provides important insights and suggests a broad family of algorithms that one can apply to approximate inference. A small mean-field sketch of this optimization view appears after this list.


  • In chapter 12, we describe a very different class of methods: particle-based methods, which approximate a complex joint distribution by considering samples from it (also known as particles). We describe several methods from this general family. These methods are generally based on core techniques from statistics, such as importance sampling and Markov chain Monte Carlo methods. Once again, the connection to this general class of methods suggests multiple opportunities for new algorithms. While the representation of probabilistic graphical models applies, to a great extent, to models including both discrete and continuous-valued random variables, inference in models involving continuous variables is significantly more challenging than the purely discrete case. A minimal importance sampling sketch appears after this list.


  • In chapter 14, we consider the task of inference in continuous and hybrid (continuous/discrete) networks, and we discuss whether and how the exact and approximate inference methods developed in earlier chapters can be applied in this setting.


  • In chapter 15, we return to the representation discussed in chapter 6, which allows a compact encoding of networks whose size can be unboundedly large. Such networks pose particular challenges to inference algorithms. In this chapter, we discuss some special-purpose methods that have been developed for the particular settings of networks that model dynamical systems. A basic particle filtering sketch appears after this list.
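
To make chapter 9's variable elimination idea concrete, here is a minimal sketch on a toy three-variable chain. The factor tables, variable names, and elimination order are invented for illustration and are not taken from the book.

```python
import numpy as np

def multiply(f1, f2):
    """Multiply two factors (vars, table) by broadcasting over the union of their scopes."""
    vars1, t1 = f1
    vars2, t2 = f2
    all_vars = tuple(dict.fromkeys(vars1 + vars2))          # ordered union of scopes

    def align(vars_, t):
        # reshape/permute a table so its axes line up with all_vars
        shape = [t.shape[vars_.index(v)] if v in vars_ else 1 for v in all_vars]
        perm = [vars_.index(v) for v in all_vars if v in vars_]
        return np.transpose(t, perm).reshape(shape)

    return all_vars, align(vars1, t1) * align(vars2, t2)

def marginalize(f, var):
    """Sum a variable out of a factor."""
    vars_, t = f
    return tuple(v for v in vars_ if v != var), t.sum(axis=vars_.index(var))

def variable_elimination(factors, elimination_order):
    """Eliminate variables one by one; return the normalized marginal over the rest."""
    factors = list(factors)
    for var in elimination_order:
        involved = [f for f in factors if var in f[0]]
        rest = [f for f in factors if var not in f[0]]
        prod = involved[0]
        for f in involved[1:]:
            prod = multiply(prod, f)
        factors = rest + [marginalize(prod, var)]
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    vars_, t = result
    return vars_, t / t.sum()

# Toy chain A -> B -> C with binary variables: P(A), P(B|A), P(C|B); query P(C).
pA  = (("A",), np.array([0.6, 0.4]))
pBA = (("A", "B"), np.array([[0.7, 0.3], [0.2, 0.8]]))     # rows: A, cols: B
pCB = (("B", "C"), np.array([[0.9, 0.1], [0.5, 0.5]]))     # rows: B, cols: C

print(variable_elimination([pA, pBA, pCB], elimination_order=["A", "B"]))
```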
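Chapter 10's clique-tree algorithm is more involved, but the flavor of its message passing can be shown on a chain-structured MRF: one forward and one backward sweep of messages yield every node's marginal without redoing elimination per query. This is only a simplified sketch of the idea; the potentials below are made up.

```python
import numpy as np

def chain_marginals(node_pot, edge_pot):
    """
    Sum-product message passing on a chain MRF.
    node_pot: list of length-k unary potentials (one per node)
    edge_pot: list of k x k tables, edge_pot[i] couples node i and node i+1
    Returns normalized marginals for every node from one forward and one
    backward sweep, reusing intermediate messages across all queries.
    """
    n = len(node_pot)
    fwd = [np.ones_like(node_pot[0]) for _ in range(n)]    # message from node i-1 to i
    bwd = [np.ones_like(node_pot[0]) for _ in range(n)]    # message from node i+1 to i

    for i in range(1, n):                                  # forward sweep
        fwd[i] = edge_pot[i - 1].T @ (node_pot[i - 1] * fwd[i - 1])
        fwd[i] /= fwd[i].sum()                             # rescale for numerical stability
    for i in range(n - 2, -1, -1):                         # backward sweep
        bwd[i] = edge_pot[i] @ (node_pot[i + 1] * bwd[i + 1])
        bwd[i] /= bwd[i].sum()

    marginals = []
    for i in range(n):
        m = node_pot[i] * fwd[i] * bwd[i]
        marginals.append(m / m.sum())
    return marginals

# Hypothetical 4-node binary chain that prefers agreeing neighbors,
# with the first node biased toward state 0.
unary = [np.array([0.8, 0.2])] + [np.array([0.5, 0.5])] * 3
pair = [np.array([[0.9, 0.1], [0.1, 0.9]])] * 3
for i, m in enumerate(chain_marginals(unary, pair)):
    print(f"P(X{i}) ~ {m}")
```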
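As a glimpse of chapter 11's optimization view, the sketch below runs naive mean-field coordinate ascent on a tiny pairwise binary MRF: the target distribution is approximated by a fully factored q, and each factor is updated in turn from the expected log-potentials of its neighbors. The grid, potentials, and iteration count are assumptions for illustration, not the book's example.

```python
import numpy as np

def mean_field(unary, pairwise, edges, n_iters=50):
    """
    Naive mean-field for a pairwise binary MRF with
        p(x) proportional to exp( sum_i unary[i][x_i] + sum_{(i,j) in edges} pairwise[x_i, x_j] ),
    where a single symmetric log-potential 'pairwise' is shared by all edges.
    Approximates p with q(x) = prod_i q_i(x_i) by coordinate ascent: each q_i
    is set proportional to exp of the expected log-potential under the
    current marginals of its neighbors.
    """
    n = len(unary)
    q = np.full((n, 2), 0.5)                       # start from uniform marginals
    neighbors = {i: [] for i in range(n)}
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)

    for _ in range(n_iters):
        for i in range(n):
            logits = unary[i].copy()
            for j in neighbors[i]:
                logits += pairwise @ q[j]          # expectation over q_j of the pairwise term
            logits -= logits.max()                 # numerical stability
            q[i] = np.exp(logits) / np.exp(logits).sum()
    return q

# Hypothetical 2x2 grid of binary variables that prefer to agree with neighbors,
# with one node biased toward state 1.
unary = [np.array([0.0, 1.5]), np.zeros(2), np.zeros(2), np.zeros(2)]
pairwise = np.array([[0.8, 0.0], [0.0, 0.8]])      # symmetric log-potential
edges = [(0, 1), (0, 2), (1, 3), (2, 3)]
print(mean_field(unary, pairwise, edges))
```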
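For the particle-based methods of chapter 12, here is a minimal self-normalized importance sampling sketch: samples drawn from a simple Gaussian proposal are reweighted to estimate expectations under a target density known only up to a constant. The target, the proposal, and the sample size are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def unnormalized_target(x):
    """Unnormalized target density: a two-component Gaussian mixture."""
    return 0.3 * np.exp(-0.5 * (x + 2.0) ** 2) + 0.7 * np.exp(-0.5 * (x - 1.5) ** 2)

def self_normalized_is(f, n_samples=100_000):
    """
    Estimate E_p[f(X)] where p is known only up to a constant, using a wide
    Gaussian proposal q = N(0, 3^2). Each particle is weighted by p~(x)/q(x)
    and the weights are normalized, so the unknown constant cancels.
    """
    x = rng.normal(0.0, 3.0, size=n_samples)                  # sample from the proposal q
    q_pdf = np.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * np.sqrt(2 * np.pi))
    w = unnormalized_target(x) / q_pdf                        # importance weights
    w /= w.sum()
    return np.sum(w * f(x))

print("E[X]   =", self_normalized_is(lambda x: x))
print("P(X>0) =", self_normalized_is(lambda x: (x > 0).astype(float)))
```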
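One standard special-purpose technique for the dynamical systems mentioned under chapter 15 is particle filtering. The sketch below runs a basic bootstrap (sequential importance resampling) filter on a toy random-walk state-space model with noisy observations; the model, its parameters, and the simulated data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter(observations, n_particles=2000,
                    transition_std=0.5, observation_std=1.0):
    """
    Bootstrap particle filter for the toy state-space model
        x_t = x_{t-1} + N(0, transition_std^2)     (latent random walk)
        y_t = x_t     + N(0, observation_std^2)    (noisy observation)
    At each step: propagate particles through the transition model, weight
    them by the observation likelihood, then resample.
    """
    particles = rng.normal(0.0, 1.0, size=n_particles)        # prior over x_0
    filtered_means = []
    for y in observations:
        # 1. propagate through the transition model
        particles = particles + rng.normal(0.0, transition_std, size=n_particles)
        # 2. weight by the likelihood of the new observation
        w = np.exp(-0.5 * ((y - particles) / observation_std) ** 2)
        w /= w.sum()
        filtered_means.append(np.sum(w * particles))
        # 3. resample to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return filtered_means

# Simulate a short trajectory and filter it.
true_x = np.cumsum(rng.normal(0.0, 0.5, size=20))
obs = true_x + rng.normal(0.0, 1.0, size=20)
for t, (xt, mt) in enumerate(zip(true_x, particle_filter(obs))):
    print(f"t={t:2d}  true={xt:+.2f}  filtered={mt:+.2f}")
```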