How deep learning on GPUs wins datamining contest without feature engineering

blog.kaggle.com

110 points by doobwa 13 years ago · 14 comments

etrain 13 years ago

The author points out something about these Machine Learning contests and Machine Learning in general that I've noticed for a while - feature selection tends to dominate learning algorithm selection. It's good to see that there are modern academic methods for feature discovery that seem to be on par with (or better than) a domain expert manually selecting features.

  • karpathy 13 years ago

    Yes, but just as with normal feature engineering, don't make the mistake of thinking that these methods are fully automatic or work by magic. There is no free lunch.

    A common criticism of these methods is that they merely shift the engineering from features to the parameters that specify the architecture. There are many choices to be made: the exact number of layers, the number of neurons per layer, the connectivity, sparsity parameters, non-linearities, sizes of receptive fields, learning rates, weight decays, the pre-training schedule, and so on. Perhaps worse, while you can use intuition to design features, it is not as easy to see whether you should be using sigmoid, tanh, or rectified linear units (plus the associated parameters for each) in the 3rd layer of the network. And worse still, these parameters can have quite a strong effect on the final performance (a rough sketch of what this search space looks like follows at the end of this comment).

    These are still powerful models and we are learning a lot about what works and what doesn't (and I'm optimistic) but don't make the mistake of thinking they are automatic. For now, you need to know what you're doing.
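
    To make the point above concrete, here is a minimal, hypothetical sketch of the kind of search space being described. It uses scikit-learn's MLPClassifier and synthetic data purely for illustration; none of it comes from the contest itself, and the specific grid values are made up.

        # Hypothetical illustration: even a small multi-layer perceptron exposes
        # many of the architectural choices listed above as hyperparameters.
        from itertools import product
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier

        X, y = make_classification(n_samples=500, n_features=20, random_state=0)

        # Each tuple is one point in the search space:
        # (hidden layer sizes, non-linearity, L2 weight decay, learning rate)
        grid = product(
            [(64,), (64, 64), (128, 64, 32)],   # depth / width
            ["logistic", "tanh", "relu"],       # choice of non-linearity
            [1e-4, 1e-2],                       # weight decay (alpha)
            [1e-3, 1e-2],                       # initial learning rate
        )

        for sizes, act, alpha, lr in grid:
            clf = MLPClassifier(hidden_layer_sizes=sizes, activation=act,
                                alpha=alpha, learning_rate_init=lr,
                                max_iter=300, random_state=0)
            score = cross_val_score(clf, X, y, cv=3).mean()
            print(sizes, act, alpha, lr, round(score, 3))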

    • dave_sullivan 13 years ago

      The problem with using intuition for feature engineering is that intuition is often wrong. But more than that, it's much less scalable than a system that learns progressively higher level features from lower level data automatically.

      And parameter selection isn't nearly as involved in practice--but it is important to know what you're doing, because you'll basically be translating from papers to code in order to get a working implementation (a toy sketch of this kind of feature learning follows below), at least until some startup comes out with a plug-and-play deep learning PaaS.

      For anyone interested in playing with the tools that are available, deeplearning.net provides a set of very good tutorials with working code.
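
      As a toy, hypothetical illustration of "learning higher-level features from lower-level data" (not taken from the thread or the tutorials), here is a one-hidden-layer autoencoder in plain NumPy. Real pipelines stack and pre-train several such layers, but the mechanism is the same: the hidden representation is learned rather than hand-designed.

          # Toy autoencoder: learn a compressed representation of raw inputs
          # with plain gradient descent on the squared reconstruction error.
          import numpy as np

          rng = np.random.default_rng(0)
          X = rng.normal(size=(256, 32))          # 256 examples of 32 raw inputs
          X = (X - X.mean(0)) / X.std(0)          # standardize

          n_hidden = 8
          W1 = rng.normal(scale=0.1, size=(32, n_hidden))   # encoder weights
          W2 = rng.normal(scale=0.1, size=(n_hidden, 32))   # decoder weights
          lr = 0.01

          for step in range(2000):
              H = np.tanh(X @ W1)                 # hidden "features"
              X_hat = H @ W2                      # reconstruction
              err = X_hat - X
              # Backpropagate the squared reconstruction error
              grad_W2 = H.T @ err / len(X)
              grad_H = err @ W2.T * (1 - H**2)    # tanh derivative
              grad_W1 = X.T @ grad_H / len(X)
              W1 -= lr * grad_W1
              W2 -= lr * grad_W2

          features = np.tanh(X @ W1)              # learned representation, no hand-crafting
          print("reconstruction MSE:", float((err**2).mean()))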

    • doobwaOP 13 years ago

      I agree these methods still require a fair amount of expert knowledge and intuition in order to make the various choices you mention. On the other hand, Bayesian optimization can prove useful for exploring such a space. A recent paper (http://arxiv.org/pdf/1206.2944.pdf) used Bayesian optimization with Gaussian processes (GPs) to find hyperparameter settings for a deep convolutional network. The resulting hyperparameters gave state-of-the-art performance, beating those chosen by an expert (Krizhevsky, the researcher who recently won ImageNet). A minimal sketch of the idea follows below.
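
      To give a sense of the mechanics, here is a minimal, hypothetical sketch of GP-based Bayesian optimization over a single hyperparameter, in the spirit of the linked paper but not taken from it: fit a GP to the (hyperparameter, score) pairs seen so far and pick the next point by expected improvement. The objective function here is a made-up stand-in for an expensive training run.

          # GP-based Bayesian optimization of one hyperparameter (log10 learning rate).
          import numpy as np
          from scipy.stats import norm
          from sklearn.gaussian_process import GaussianProcessRegressor
          from sklearn.gaussian_process.kernels import Matern

          def objective(log_lr):
              # Placeholder for "train the network with this setting, return accuracy"
              return -(log_lr + 2.5) ** 2 + 0.9

          candidates = np.linspace(-5, 0, 200).reshape(-1, 1)   # search grid
          X_obs = np.array([[-5.0], [-0.5]])                    # two initial evaluations
          y_obs = np.array([objective(x[0]) for x in X_obs])

          gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

          for _ in range(8):
              gp.fit(X_obs, y_obs)
              mu, sigma = gp.predict(candidates, return_std=True)
              best = y_obs.max()
              # Expected improvement acquisition function
              z = (mu - best) / np.maximum(sigma, 1e-9)
              ei = (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)
              x_next = candidates[np.argmax(ei)]
              X_obs = np.vstack([X_obs, x_next])
              y_obs = np.append(y_obs, objective(x_next[0]))

          print("best log10(lr) found:", X_obs[np.argmax(y_obs)][0])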

    • marshallp 13 years ago

      Is there a study comprehensively comparing the results of those different settings on a variety of data sets? (Your opinion is commonly stated, but I can't find any thorough evidence for it.)

username3 13 years ago

http://webcache.googleusercontent.com/search?q=cache:http://...

doobwaOP 13 years ago

Given that pharma is a massive industry and that drug discovery often costs around a billion dollars, the top prize of $22,000 seems awfully low. Will we start to see larger prizes, or will startups take this technology and monetize it better than academia currently does?

  • jklio 13 years ago

    With Geoffrey Hinton involved as a supervisor, I expect they were on the bleeding edge for other reasons anyway and just decided to scoop up some extra cash as well. I've not looked closely, but Kaggle does seem to be a little like 99designs, though.

  • joelthelion 13 years ago

    There is no shortage of people interested in machine learning.
