Introduction
Gaussian linear modelling cannot address current signal processing demands. In
modern contexts, such as Independent Component Analysis (ICA), progress has been
made specifically by imposing non-Gaussian and/or non-linear assumptions. Hence,
standard Wiener and Kalman theories, together with the computational engines built
on them, no longer enjoy their traditional hegemony in the field. In their
place, diverse principles have been explored, leading to a consequent diversity in the
implied computational algorithms. The traditional on-line and data-intensive preoccupations
of signal processing continue to demand that these algorithms be tractable.
Increasingly, full probability modelling (the so-called Bayesian approach)—or
partial probability modelling using the likelihood function—is the pathway for design
of these algorithms. However, the results are often intractable, and so the area
of distributional approximation is of increasing relevance in signal processing. The
Expectation-Maximization (EM) algorithm and Laplace approximation, for example,
are standard approaches to handling difficult models, but these approximations
(certainty equivalence, and Gaussian, respectively) are often too drastic to handle
the high-dimensional, multi-modal and/or strongly correlated problems that are encountered.
Since the 1990s, stochastic simulation methods have come to dominate
Bayesian signal processing. Markov Chain Monte Carlo (MCMC) sampling, and related
methods, are appreciated for their ability to simulate possibly high-dimensional
distributions to arbitrary levels of accuracy. More recently, the particle filtering approach
has addressed on-line stochastic simulation. Nevertheless, the wider acceptability
of these methods—and, to some extent, Bayesian signal processing itself—
has been undermined by the large computational demands they typically make.
The Variational Bayes (VB) method of distributional approximation originates—
as does the MCMC method—in statistical physics, in the area known as Mean Field
Theory. Its method of approximation is easy to understand: conditional independence
is enforced as a functional constraint in the approximating distribution, and
the best such approximation is found by minimization of a Kullback-Leibler divergence
(KLD). The exact—but intractable—multivariate distribution is therefore factorized
into a product of tractable marginal distributions, the so-called VB-marginals.
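As a small self-contained illustration of this idea (a sketch of our own, not a model treated later in the book), consider approximating a correlated bivariate Gaussian by a product of two independent Gaussian factors. Minimizing the KLD couples the factor means through a pair of implicit equations, which are iterated to convergence; the factor variances come out as the reciprocals of the diagonal of the precision matrix:

```python
import numpy as np

# Target: correlated bivariate Gaussian p(x) = N(mu, Sigma).
mu = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])
Lam = np.linalg.inv(Sigma)          # precision matrix

# Mean-field approximation q(x) = q1(x1) q2(x2), each factor Gaussian.
# Minimizing KL(q || p) yields coupled fixed-point equations for the
# factor means; the factor variances are 1 / Lam[i, i].
m = np.zeros(2)                     # initial factor means
for _ in range(50):                 # iterate the implicit equations
    m[0] = mu[0] - Lam[0, 1] / Lam[0, 0] * (m[1] - mu[1])
    m[1] = mu[1] - Lam[1, 0] / Lam[1, 1] * (m[0] - mu[0])

var = 1.0 / np.diag(Lam)            # VB-marginal variances

print(m)    # converges to the true mean [1, -1]
print(var)  # smaller than the true marginal variances
```

Note that the converged means are exact here, while the VB variances (0.36) understate the true marginal variances (1.0): a well-known consequence of minimizing this direction of the KLD under a factorized constraint.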
This straightforward proposal for approximating a distribution enjoys certain optimality properties. What is of more pragmatic concern to the signal processing community,
however, is that the VB-approximation conveniently addresses the following
key tasks:
1. The inference is focused (or, more formally, marginalized) onto selected subsets
of parameters of interest in the model: this one-shot (i.e. off-line) use of the VB
method can replace numerically intensive marginalization strategies based, for
example, on stochastic sampling.
2. Parameter inferences can be arranged to have an invariant functional form
when updated in the light of incoming data: this leads to feasible on-line
tracking algorithms involving the update of fixed- and finite-dimensional statistics.
In the language of the Bayesian, conjugacy can be achieved under the
VB-approximation. There is no reliance on propagating certainty equivalents,
stochastically-generated particles, etc.
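Point 2 can be illustrated by the simplest exact conjugate case (again a sketch of our own, with illustrative numbers, rather than one of the models developed in the book): Gaussian observations with known noise variance and a Gaussian prior on the unknown mean. Each new datum updates two fixed-dimensional statistics, and the posterior retains its Gaussian functional form indefinitely:

```python
import numpy as np

rng = np.random.default_rng(0)

# On-line conjugate update: y_t ~ N(mean, s2) with s2 known, and a
# Gaussian prior on the mean. The posterior keeps the same (Gaussian)
# functional form, so on-line tracking reduces to updating two
# finite-dimensional statistics: precision and precision-weighted mean.
s2 = 0.5**2                  # known observation noise variance (illustrative)
prec, pw_mean = 1.0, 0.0     # prior N(0, 1) in natural-statistic form

true_mean = 2.0
for y in true_mean + 0.5 * rng.standard_normal(200):
    prec += 1.0 / s2         # one scalar statistic per update ...
    pw_mean += y / s2        # ... the dimensions never grow with the data

post_mean = pw_mean / prec
post_var = 1.0 / prec
print(post_mean, post_var)   # concentrates around the true mean 2.0
```

The VB-approximation extends this invariant-form property to models where exact conjugacy fails, at the price of the factorization constraint described above.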
Unusually for a modern Bayesian approach, then, no stochastic sampling is required
for the VB method. In its place, the shaping parameters of the VB-marginals are
found by iterating a set of implicit equations to convergence. This Iterative Variational
Bayes (IVB) algorithm enjoys a decisive advantage over the EM algorithm
whose computational flow is similar: by design, the VB method yields distributions
in place of the point estimates emerging from the EM algorithm. Hence, in common
with all Bayesian approaches, the VB method provides, for example, measures of
uncertainty for any point estimates of interest, inferences of model order/rank, etc.
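The IVB flow of control can be seen in the textbook example of inferring the mean and precision of a Gaussian under the factorization q(mu, tau) = q(mu) q(tau): the VB-marginals are Normal and Gamma respectively, and their shaping parameters satisfy coupled implicit equations that are iterated to convergence. The priors and data below are our own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(3.0, 2.0, size=500)         # synthetic data
N, ybar, ysq = len(y), y.mean(), np.sum(y**2)

# Weakly informative priors (illustrative): mu ~ N(mu0, (lam0*tau)^-1),
# tau ~ Gamma(a0, b0). VB factors: q(mu) Normal, q(tau) Gamma.
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

E_tau = 1.0                                 # initial guess for E[tau]
mu_N = (lam0 * mu0 + N * ybar) / (lam0 + N) # q(mu) mean (data-determined)
a_N = a0 + (N + 1) / 2                      # q(tau) shape (data-determined)
for _ in range(100):                        # iterate the implicit equations
    lam_N = (lam0 + N) * E_tau              # q(mu) precision needs E[tau]
    E_mu, E_mu2 = mu_N, mu_N**2 + 1.0 / lam_N
    b_N = b0 + 0.5 * (lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2)
                      + ysq - 2 * N * ybar * E_mu + N * E_mu2)
    E_tau = a_N / b_N                       # q(tau) mean feeds back into q(mu)

print(mu_N, 1 / lam_N)           # mean estimate, with a variance attached
print(np.sqrt(b_N / a_N))        # implied noise std, close to the true 2.0
```

Unlike EM applied to the same model, the converged output is a pair of distributions, so the uncertainty reported in the last two lines comes for free.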
The machine learning community has led the way in exploiting the VB method
in model-based inference, notably in inference for graphical models. It is timely,
however, to examine the VB method in the context of signal processing where, to
date, little work has been reported. In this book, at all times, we are concerned with
the way in which the VB method can lead to the design of tractable computational
schemes for tasks such as (i) dimensionality reduction, (ii) factor analysis for medical
imagery, (iii) on-line filtering of outliers and other non-Gaussian noise processes, (iv)
tracking of non-stationary processes, etc. Our aim in presenting these VB algorithms
is not just to reveal new flows-of-control for these problems, but—perhaps more
significantly—to understand the strengths and weaknesses of the VB-approximation
in model-based signal processing. In this way, we hope to dismantle the current psychology
of dependence in the Bayesian signal processing community on stochastic
sampling methods. Without doubt, the ability to model complex problems to arbitrary
levels of accuracy will ensure that stochastic sampling methods—such as MCMC—
will remain the gold standard for distributional approximation. Notwithstanding
this, our purpose here is to show that the VB method of approximation can yield
highly effective Bayesian inference algorithms at low computational cost. In showing
this, we hope that Bayesian methods might become accessible to a much broader
constituency than has been achieved to date.