Jose C. Principe

Information Theoretic Learning

9/6/2010, 12:00
Facultad de Informática,
Aula 2.grados


This talk describes our efforts to go beyond the second-order moment assumption still prevalent in optimal signal processing and machine learning. We show how the second norm of the PDF can be estimated directly from data, avoiding an explicit PDF estimation step. The link between PDF moments, information theory, and reproducing kernel Hilbert spaces will be established, and applications to adaptive systems with entropic cost functions will be demonstrated. A generalized correlation function called correntropy will be defined and its applications in signal processing will be outlined. Correntropy leads to new measures of similarity, to a new definition of dependence subspaces, and to new tests for causality.
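As background for the abstract's claim that the second norm of the PDF can be estimated directly from data, the following sketch (an assumption of mine, not material from the talk) uses the standard Parzen-window estimators from the information-theoretic learning literature: the information potential V(X) = (1/N^2) Σ_i Σ_j G_{σ√2}(x_i − x_j), whose negative logarithm gives Rényi's quadratic entropy, and the sample correntropy v(X,Y) = (1/N) Σ_i G_σ(x_i − y_i). Note that no explicit density estimate is ever formed; only pairwise sample differences are needed.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    # Gaussian kernel G_sigma(u) = exp(-u^2 / (2 sigma^2)) / (sigma sqrt(2 pi))
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (sigma * np.sqrt(2.0 * np.pi))

def information_potential(x, sigma=1.0):
    # Parzen estimate of the second norm of the PDF, integral of p(x)^2 dx:
    # V = (1/N^2) sum_i sum_j G_{sigma*sqrt(2)}(x_i - x_j).
    # The kernel width sqrt(2)*sigma comes from convolving two Gaussians.
    diffs = x[:, None] - x[None, :]          # all pairwise differences
    return gaussian_kernel(diffs, sigma * np.sqrt(2.0)).mean()

def quadratic_renyi_entropy(x, sigma=1.0):
    # Renyi's quadratic entropy: H2(X) = -log V(X)
    return -np.log(information_potential(x, sigma))

def correntropy(x, y, sigma=1.0):
    # Sample correntropy, a generalized correlation between two signals:
    # v = (1/N) sum_i G_sigma(x_i - y_i)
    return gaussian_kernel(x - y, sigma).mean()

rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
print("H2 estimate:", quadratic_renyi_entropy(x))
print("correntropy(x, x):", correntropy(x, x))  # equals G_sigma(0), the maximum
```

Kernel width σ here is a free parameter chosen for illustration; in practice it is set by a bandwidth-selection rule such as Silverman's.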

Organized by: Amparo Alonso, Departamento de Computación.