
Machine learning seminar – “Ensemble Feature Selection Integrating Stability” and “A noise-robust Fast Sparse Bayesian Learning Model”

April 24 @ 15:00 - 16:00

Welcome to the joint HVL, UiB and HUS machine learning seminar!

The seminar series is an informal platform where people interested in machine learning can meet and discuss across institutions and disciplines. Experienced users, newcomers, and everyone in between are welcome to participate. In this way, we can share experiences, learn from each other and maybe develop a foundation for future collaborations.

Time: Wednesday April 24th 14:15-16:00
Place: Auditorium 4, UiB – Realfagbygget

Program:

Xiaokang Zhang (joint work with Inge Jonassen): EFSIS: Ensemble Feature Selection Integrating Stability
Abstract: Ensemble learning, which combines the predictions of multiple learners, has been widely applied in pattern recognition and has been reported to be more robust and accurate than the individual learners. The same ensemble idea has recently also been applied to feature selection. There are two main strategies for ensemble feature selection: data perturbation and function perturbation. Data perturbation performs feature selection on data subsets sampled from the original dataset and then selects the features that are consistently ranked highly across those subsets; this has been found to improve both the stability of the selector and the prediction accuracy of a classifier. Function perturbation aggregates multiple selectors, freeing the user from having to choose the most appropriate selector for a given situation; this has been found to maintain or improve classification performance. Here we propose a framework, EFSIS, that combines these two strategies. Empirical results indicate that EFSIS gives both high prediction accuracy and high stability.
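
For readers unfamiliar with the two strategies, here is a minimal sketch of how data perturbation and function perturbation can be combined in the spirit the abstract describes. It is not the authors' EFSIS implementation; the choice of rankers (univariate F-score and mutual information), the subsample size, and the rank-sum aggregation are illustrative assumptions.

```python
# Minimal sketch of ensemble feature selection combining data perturbation
# (feature ranking on repeated subsamples) with function perturbation
# (aggregating several rankers). Illustrative only -- NOT the authors' EFSIS code.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif, mutual_info_classif

def rank_features(scores):
    """Return ranks (0 = best) from a vector of feature scores."""
    order = np.argsort(-scores)
    ranks = np.empty_like(order)
    ranks[order] = np.arange(len(scores))
    return ranks

def ensemble_feature_ranking(X, y, n_subsamples=20, rng=None):
    rng = np.random.default_rng(rng)
    rankers = [
        lambda X, y: f_classif(X, y)[0],          # univariate F-score
        lambda X, y: mutual_info_classif(X, y),   # mutual information
    ]
    rank_sum = np.zeros(X.shape[1])
    for _ in range(n_subsamples):                 # data perturbation
        idx = rng.choice(len(y), size=int(0.8 * len(y)), replace=False)
        for ranker in rankers:                    # function perturbation
            rank_sum += rank_features(ranker(X[idx], y[idx]))
    return np.argsort(rank_sum)                   # lowest aggregate rank first

X, y = make_classification(n_samples=200, n_features=50, n_informative=5,
                           random_state=0)
print("Top-ranked features:", ensemble_feature_ranking(X, y, rng=0)[:10])
```

Summing per-subsample ranks is only one possible consensus scheme; EFSIS itself also integrates the stability of each selector, which this sketch omits.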

Ingvild M. Helgøy (joint work with Yushu Li): A noise-robust Fast Sparse Bayesian Learning Model
Abstract: In this paper, we develop a probabilistic Bayesian learning framework that yields a noise-robust, fast and sparse solution in kernel-based supervised learning. The hierarchical model structure in our framework is designed so that the prior not only penalizes unnecessary model complexity, but also depends on the variance of the random noise in the data. The model parameters are estimated with the fast marginal likelihood maximization algorithm to keep the computational cost low. We compare our methodology with two other Sparse Bayesian Learning (SBL) models: the relevance vector machine (RVM) and an SBL model widely used in compressive sensing (CS). We show that our method generally provides sparser solutions and lower test error, and is considerably more robust to high-variance noise.
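
As a rough point of reference for what a standard sparse Bayesian learning model looks like in practice (not the noise-robust model of the talk), the sketch below fits an RVM-style sparse model to noisy data by applying scikit-learn's ARDRegression to an RBF kernel design matrix. The data, kernel width, and sparsity threshold are illustrative assumptions.

```python
# A conventional sparse-Bayesian-learning baseline for illustration only,
# NOT the noise-robust model presented in the talk: ARD regression
# (an RVM-style per-weight prior) on RBF kernel basis functions.
import numpy as np
from sklearn.linear_model import ARDRegression
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-10, 10, size=(100, 1)), axis=0)
y = np.sinc(X[:, 0] / np.pi) + rng.normal(scale=0.1, size=100)  # noisy sinc data

K = rbf_kernel(X, X, gamma=0.1)       # design matrix of kernel basis functions
model = ARDRegression().fit(K, y)

# Sparsity: most basis-function weights are driven to (near) zero.
n_relevant = np.sum(np.abs(model.coef_) > 1e-3)
print(f"{n_relevant} of {len(model.coef_)} basis functions retained")
```

In such models, the per-weight precision priors prune most basis functions, which is the sparsity property that both the RVM and the proposed model exploit.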

If you are interested in giving a talk at one of the upcoming meetings, please email us (Therese.Berge.Sjursen@hvl.no).

Details

Date: April 24
Time: 15:00 - 16:00

Venue

Realfagbygget, UiB
Allégaten 41
5007 Bergen, Norway