14th IFAC Symposium on System Identification, SYSID 2006

SYSID-2006 Paper Abstract


Paper ThB2.5

Seghouane, Abd-Krim (National ICT Australia), Amari, Shun-ichi (RIKEN Brain Science Inst.)

Variants of the Kullback-Leibler Divergence and their Role in Model Selection

Scheduled for presentation during the Regular Session "Identifiability and Model Selection" (ThB2), Thursday, March 30, 2006, 16:50–17:10, Banquet Room

14th IFAC Symposium on System Identification, March 29–31, 2006, Newcastle, Australia


Keywords: Nonlinear System Identification, Maximum Likelihood Methods, Machine Learning and Data Mining

Abstract

The Akaike information criterion, AIC, is a widely used tool for model selection. It is derived as an asymptotically unbiased estimator of a function for ranking candidate models, namely a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the computational and theoretical advantages of the Kullback-Leibler divergence, its lack of symmetry can become a nuisance in model selection applications: simple examples show that reversing the roles of its arguments can yield substantially different results. In this paper, three new functions for ranking candidate models are proposed, constructed by symmetrizing the Kullback-Leibler divergence between the true model and the approximating candidate model. The symmetrizing operations are the arithmetic, geometric, and harmonic means. It is found that the original AIC criterion is an asymptotically unbiased estimator of all three of these functions. A simulation study based on polynomial regression is also provided to compare the proposed ranking functions with their asymptotic estimator, AIC.
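To make the asymmetry concrete, here is a minimal numerical sketch, not taken from the paper: the distributions p and q, the helper kl, and the variable names are all illustrative choices. It shows that D(p||q) and D(q||p) generally differ, and how the three mean-based symmetrizations named in the abstract combine the two directions.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) between discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Two illustrative discrete distributions (hypothetical, chosen for the demo).
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.4, 0.2])

d_pq = kl(p, q)  # D(p || q) ~= 0.184
d_qp = kl(q, p)  # D(q || p) ~= 0.192 -- different, so the divergence is asymmetric

# The three symmetrizations named in the abstract, applied to the two directions:
arithmetic = 0.5 * (d_pq + d_qp)
geometric = np.sqrt(d_pq * d_qp)
harmonic = 2.0 * d_pq * d_qp / (d_pq + d_qp)

print(f"D(p||q)={d_pq:.4f}  D(q||p)={d_qp:.4f}")
print(f"arithmetic={arithmetic:.4f}  geometric={geometric:.4f}  harmonic={harmonic:.4f}")
```

The three symmetrized values coincide whenever the two directions agree; the paper's point is that AIC is an asymptotically unbiased estimator of all three ranking functions.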
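The simulation study uses polynomial regression. The sketch below is our own illustration of the kind of AIC-based ranking such a study compares against; the data-generating polynomial, noise level, and the Gaussian-likelihood form AIC = n*log(RSS/n) + 2k are assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = np.linspace(-1.0, 1.0, n)
# Hypothetical true model: a degree-2 polynomial plus Gaussian noise.
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(scale=0.3, size=n)

for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)             # least-squares polynomial fit
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    k = degree + 1                                # number of fitted coefficients
    aic = n * np.log(rss / n) + 2 * k             # Gaussian-likelihood AIC, up to a constant
    print(f"degree {degree}: AIC = {aic:.2f}")
```

With these settings the degree-2 fit typically attains the smallest AIC, matching the true model order; higher degrees reduce RSS slightly but pay the 2k penalty.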