<html>
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
</head>
<body bgcolor="#FFFFFF" text="#000000">
Dear faculty and students, <br>
<br>
You are invited this Thursday, 22 September at 14:00 to the
Machine Learning Reading Group (GLAM), which will be held in the
Multimedia Room on the sixth floor, Beauchef 851, Torre Norte. <br>
<br>
<b>GLAM #1: The Gaussian Process Convolution Model (Felipe Tobar)<br>
When: 22/9/16, 14:00<br>
Where: Multimedia Room, 6th floor, CMM<br>
</b><br>
Tobar, Bui and Turner, "Learning Stationary Time Series using
Gaussian Processes with Nonparametric Kernels", Advances in Neural
Information Processing Systems, 2015.<br>
<br>
<b>Abstract: </b>We introduce the Gaussian Process Convolution
Model (GPCM), a two-stage nonparametric generative procedure to
model stationary signals as the convolution between a
continuous-time white-noise process and a continuous-time linear
filter drawn from a Gaussian process. The GPCM is a continuous-time
nonparametric-window moving average process and, conditionally, is
itself a Gaussian process with a nonparametric kernel defined in a
probabilistic fashion. The generative model can be equivalently
considered in the frequency domain, where the power spectral density
of the signal is specified using a Gaussian process. One of the main
contributions of the paper is to develop a novel variational
free-energy approach based on inter-domain inducing variables that
efficiently learns the continuous-time linear filter and infers the
driving white-noise process. In turn, this scheme provides
closed-form probabilistic estimates of the covariance kernel and the
noise-free signal both in denoising and prediction scenarios.
Additionally, the variational inference procedure provides
closed-form expressions for the approximate posterior of the
spectral density given the observed data, leading to new Bayesian
nonparametric approaches to spectrum estimation. The proposed GPCM
is validated using synthetic and real-world signals.<br>
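As a rough illustration of the generative model described in the abstract, the sketch below draws one sample path from a GPCM-style process: a filter is sampled from a Gaussian process (here with a squared-exponential kernel and an exponential taper, both arbitrary choices not taken from the paper) and then convolved with discretised white noise. This is only a forward-simulation sketch; it does not implement the paper's variational inference scheme.

```python
import numpy as np

# Hypothetical illustration of a GPCM-style generative model:
# y(t) = integral of h(t - tau) * w(tau) dtau, with h ~ GP.
# Kernel choice, lengthscale, and decay rate are all assumptions
# made for this sketch, not values from the paper.
rng = np.random.default_rng(0)

dt = 0.01                         # discretisation step
tau = np.arange(0, 2, dt)         # support of the filter h
t = np.arange(0, 10, dt)          # time axis of the signal

# Sample the filter h from a GP with a squared-exponential kernel,
# then taper it with an exponential window so the moving average
# has finite effective support.
lengthscale = 0.1
K = np.exp(-0.5 * (tau[:, None] - tau[None, :]) ** 2 / lengthscale**2)
h = rng.multivariate_normal(np.zeros(len(tau)), K + 1e-8 * np.eye(len(tau)))
h *= np.exp(-2.0 * tau)

# Discretised continuous-time white noise (variance scales as 1/dt)
# and the convolution producing one sample path of the signal.
w = rng.normal(scale=1.0 / np.sqrt(dt), size=len(t) + len(tau) - 1)
y = np.convolve(w, h, mode="valid") * dt

print(y.shape)  # one sample path, same length as t
```

Conditionally on the sampled filter `h`, the resulting signal is itself Gaussian, which is the sense in which the abstract calls the GPCM "itself a Gaussian process with a nonparametric kernel".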
<br>
------Upcoming sessions (tentative):<br>
<br>
1) The Gaussian Process Convolution Model (Felipe Tobar, 22/9)<br>
2) Sparse and online Gaussian Processes (Christopher Ley, 29/9)<br>
3) Warped Gaussian Processes (Gonzalo Ríos, 6/10)<br>
4) Multi-output Gaussian processes (Gabriel Parra, 13/10)<br>
5) System identification using kernels (Alejandro Bernardín,
20/10)<br>
6) Monte Carlo inference (Donato Vásquez, 27/10)<br>
7) Variational inference (F. Tobar + I. Castro, 3/11)<br>
8) Dirichlet processes (Joaquín Rojas, 17/11)<br>
9) Deep Neural Networks (Matías Silva, 24/11)<br>
10) Random Forests (Romain Gouron, 1/12)<br>
11) Probabilistic Graphical models (Ignacio Reyes, 8/12)<br>
<br>
We look forward to your attendance. Best regards, <br>
<br>
Ma. Inés Rivera <br>
</body>
</html>