Microsoft Download Center Archive
Local Deep Kernel Learning
Last published: November 26, 2013.
There has been an explosion in the size of modern-day training sets with the advent of big data, cheap crowdsourcing, and other techniques for gathering labelled information efficiently. This presents a significant challenge for non-linear SVMs, since their cost of prediction can grow linearly with the size of the training set. Thus, even though non-linear SVMs have defined the state-of-the-art on multiple benchmark tasks, their use in real-world applications remains limited.

We develop a Local Deep Kernel Learning (LDKL) technique for efficient non-linear SVM prediction while maintaining classification accuracy above an acceptable threshold. LDKL learns a tree-based primal feature embedding which is high dimensional and sparse. Primal-based classification decouples prediction costs from the number of support vectors and the size of the training set, and LDKL's tree-structured features efficiently encode non-linearities while speeding up prediction exponentially over the state-of-the-art. We develop routines for optimizing over the space of tree-structured features and efficiently scale to problems with more than half a million training points.

Experiments on benchmark data sets reveal that LDKL can reduce prediction costs by more than three orders of magnitude in some cases, with a moderate sacrifice in classification accuracy as compared to RBF-SVMs. Furthermore, LDKL can achieve better classification accuracies than leading methods for speeding up non-linear SVM prediction. In particular, LDKL is significantly better than kernel approximation techniques, such as Random Fourier Features and Nyström, as it focuses on the decision boundary rather than modeling the kernel everywhere in space. LDKL can also be much faster than reduced set methods as its tree-structured features can be computed in logarithmic time.
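The logarithmic prediction cost comes from the tree structure: classifying a point only requires routing it down one root-to-leaf path and evaluating a local linear model, rather than evaluating a kernel against every support vector. The sketch below is a simplified, hypothetical illustration of that idea (a routing hyperplane at each internal node, a local linear classifier at each leaf); the actual LDKL feature embedding and training objective are more involved and are described in the paper linked from this page.

```python
# Hypothetical sketch, not the paper's exact formulation: prediction with a
# depth-d binary tree of routing hyperplanes and per-leaf linear classifiers.
# Traversal touches only d = log2(num_leaves) nodes, so the cost per example
# is O(d * dim) instead of O(num_support_vectors * dim) for a kernel SVM.

import numpy as np

class TreeLinearPredictor:
    def __init__(self, depth, dim, rng=None):
        rng = rng or np.random.default_rng(0)
        self.depth = depth
        # Routing hyperplanes for the 2^depth - 1 internal nodes
        # (stored in array order: root = 0, children of i are 2i+1 and 2i+2).
        self.theta = rng.standard_normal((2 ** depth - 1, dim))
        # One local linear classifier per leaf (2^depth leaves).
        self.w = rng.standard_normal((2 ** depth, dim))
        self.b = np.zeros(2 ** depth)

    def predict(self, x):
        node = 0
        # Route x down the tree: go right if the node's hyperplane score is positive.
        for _ in range(self.depth):
            go_right = int(self.theta[node] @ x > 0)
            node = 2 * node + 1 + go_right
        leaf = node - (2 ** self.depth - 1)
        # Classify with the selected leaf's local linear model.
        return np.sign(self.w[leaf] @ x + self.b[leaf])

# Usage: classify a single 50-dimensional point with a depth-4 tree
# (16 leaves, but only 4 routing decisions per prediction).
model = TreeLinearPredictor(depth=4, dim=50)
print(model.predict(np.random.default_rng(1).standard_normal(50)))
```

In LDKL the parameters of the tree and the leaf classifiers are learned jointly so that the induced sparse feature embedding captures the non-linear decision boundary; the sketch above only shows why prediction time scales with tree depth rather than with the training set size.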
Files
Status: Live. This download is still available on microsoft.com. The downloads below will come directly from the Microsoft Download Center.
File | Size
---|---
SHA1: 2f6a78894d2b9f8b458da68f9db9594c6f3d791c | 1.86 MB
File sizes and hashes are retrieved from the Wayback Machine’s indexes. They may not match the latest versions of files hosted on Microsoft servers.
System Requirements
Operating Systems:
- Windows 7, Windows 8, or Windows 10
Installation Instructions
- Click Download and follow the instructions.