Mobile Monitoring Solutions


Feature Selection For Unsupervised Learning

MMS Founder

Article originally posted on Data Science Central.

This is my presentation for the IBM Data Science Day, July 24.


After reviewing popular techniques used in supervised, unsupervised, and semi-supervised machine learning, we focus on feature selection methods in these different contexts, especially the metrics used to assess the value of a feature or set of features, whether the variables are binary, continuous, or categorical.

We then go into greater detail, reviewing modern feature selection techniques for unsupervised learning, which typically rely on entropy-like criteria. While these criteria are usually model-dependent or scale-dependent, we introduce a new model-free, data-driven methodology in this context, with an application to an interesting number theory problem (a simulated data set) in which each feature has a known theoretical entropy.
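To give a concrete flavor of the kind of entropy-like criterion mentioned above, here is a minimal sketch (an illustration, not the presentation's actual methodology) that ranks the features of an unlabeled data set by the Shannon entropy of their binned values; the function names and the equal-width binning scheme are my own assumptions:

```python
import math
from collections import Counter

def shannon_entropy(values, bins=10):
    """Shannon entropy (in bits) of a numeric feature, after equal-width binning."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0  # guard against constant features
    binned = [min(int((v - lo) / width), bins - 1) for v in values]
    counts = Counter(binned)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def rank_features(dataset):
    """Rank features (dict: name -> list of values) by decreasing entropy."""
    return sorted(dataset, key=lambda f: shannon_entropy(dataset[f]), reverse=True)
```

Under this simple criterion, a feature spread evenly across bins scores high while a near-constant feature scores near zero; note that the result is scale- and binning-dependent, which is precisely the limitation the model-free approach in the presentation aims to address.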

We also briefly discuss high precision computing as it is relevant to this peculiar data set, as well as units of information smaller than the bit.
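For intuition on units of information smaller than the bit (an illustrative aside, not taken from the slides): a biased binary source carries strictly less than one bit of entropy per symbol, as the standard binary entropy function shows:

```python
import math

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# A fair coin yields exactly 1 bit per toss; a 90/10 coin yields about 0.469 bits.
```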

To download the presentation, click here (PowerPoint document).

