AKSDA

Accelerated Kernel Subclass Discriminant Analysis and SVM combination: an efficient dimensionality reduction and classification method for very high-dimensional data.

Description

AKSDA is a new, GPU-accelerated, state-of-the-art C++ method (provided both as source code and as a command-line executable) for supervised dimensionality reduction and classification using multiple kernels. It greatly reduces the dimensionality of the input data while at the same time increasing their linear separability. Used in conjunction with linear SVMs, it achieves state-of-the-art classification results, consistently higher than those of kernel SVM approaches, at orders-of-magnitude shorter training times.

Use AKSDA to make full use of your Nvidia GPU (even a low-end one!) to accelerate classification or quickly reduce the dimensionality of any data, achieving higher accuracy and shorter training times (in most cases shorter than those of current state-of-the-art linear SVMs)!

AKSDA takes as input an annotated training set (for both binary and multiclass problems) and a test set. It reduces the dimensionality of the feature vectors of the training and test sets using any number of the available kernels. The resulting projected data can be used to train (on the training set) and apply (on the test set) a linear SVM classifier, which results in an increase in classification performance at a fraction of the time required by other state-of-the-art SVM methods. Its output is both the final linear SVM classification result and the projection of the original input data into the reduced dimension.

In essence, this complete method can:

  • Accelerate classification by directly replacing any SVM classifier (which would typically be trained and applied on the original, high-dimensional feature vectors), or by using AKSDA just as a preprocessing (dimensionality reduction) step that you can then pair with any kind of classifier.
  • Perform fast and effective dimensionality reduction, by replacing any other supervised dimensionality reduction technique in other applications, such as data visualization, k-means clustering, nearest-neighbor search, etc. (a minimal nearest-neighbor sketch on the projected data is given after this list).
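
Purely as an illustration of the second use case, the following is a minimal C++ sketch of a nearest-neighbor search run directly on AKSDA-projected vectors. It assumes the projected data are saved as plain text with one sample per row and whitespace-separated values; the actual output format is described in "AKSDA documentation.docx", so adapt the loader as needed.

    // Nearest-neighbor search on AKSDA-projected vectors (illustrative sketch).
    // Assumption: projected data stored as plain text, one sample per row,
    // whitespace-separated values. Check "AKSDA documentation.docx" for the
    // actual output format.
    #include <algorithm>
    #include <fstream>
    #include <iostream>
    #include <limits>
    #include <sstream>
    #include <string>
    #include <vector>

    static std::vector<std::vector<double>> loadRows(const std::string& path) {
        std::vector<std::vector<double>> rows;
        std::ifstream in(path);
        std::string line;
        while (std::getline(in, line)) {
            std::istringstream ss(line);
            std::vector<double> row;
            double v;
            while (ss >> v) row.push_back(v);
            if (!row.empty()) rows.push_back(row);
        }
        return rows;
    }

    int main(int argc, char** argv) {
        if (argc != 3) {
            std::cerr << "usage: nn <projected_train.txt> <projected_test.txt>\n";
            return 1;
        }
        const auto train = loadRows(argv[1]);
        const auto test = loadRows(argv[2]);

        // For each projected test sample, report its nearest projected training
        // sample under the squared Euclidean distance.
        for (std::size_t i = 0; i < test.size(); ++i) {
            std::size_t best = 0;
            double bestDist = std::numeric_limits<double>::max();
            for (std::size_t j = 0; j < train.size(); ++j) {
                double d = 0.0;
                const std::size_t dims = std::min(test[i].size(), train[j].size());
                for (std::size_t k = 0; k < dims; ++k) {
                    const double diff = test[i][k] - train[j][k];
                    d += diff * diff;
                }
                if (d < bestDist) { bestDist = d; best = j; }
            }
            std::cout << "test sample " << i << " -> nearest training sample " << best << "\n";
        }
        return 0;
    }

Because the projected dimension is very low, even this brute-force search stays fast; any off-the-shelf clustering or visualization tool can consume the same projected vectors.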

Usage

  1. Download aksda.rar from the Downloads list below.
  2. Unpack the aksda.rar archive into a new folder.
  3. Run the software, or use the .dll/.lib to link AKSDA as a plug-in library in a project of your own, following the instructions in the included “AKSDA documentation.docx” file; a sketch of the dynamic-loading route is given after this list.
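
For the plug-in library route in step 3, the snippet below is only a hypothetical sketch of how the .dll could be loaded dynamically on Windows. The exported name aksda_run and its signature are invented placeholders for illustration; the real entry points, arguments and calling conventions are the ones listed in “AKSDA documentation.docx”.

    // Hypothetical sketch of loading the AKSDA .dll dynamically on Windows.
    // "aksda_run" and its signature are placeholders, NOT the real API; see
    // "AKSDA documentation.docx" for the actual exported functions.
    #include <windows.h>
    #include <iostream>

    int main() {
        HMODULE lib = LoadLibraryA("aksda.dll");  // adjust the path to the unpacked folder
        if (!lib) {
            std::cerr << "Could not load aksda.dll\n";
            return 1;
        }
        // Placeholder entry point: replace the name and signature with the ones
        // documented in the archive.
        using AksdaRunFn = int (*)(const char* configFile);
        auto aksdaRun = reinterpret_cast<AksdaRunFn>(GetProcAddress(lib, "aksda_run"));
        if (!aksdaRun) {
            std::cerr << "Entry point not found - check the documentation\n";
            FreeLibrary(lib);
            return 1;
        }
        const int status = aksdaRun("aksda_config.txt");  // placeholder argument
        std::cout << "AKSDA returned " << status << "\n";
        FreeLibrary(lib);
        return 0;
    }

Alternatively, link against the provided .lib at build time and call the documented functions directly.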

Output

  1. Classification: This program outputs the predictions for the test samples in a .txt file, as would any typical SVM classifier. For multiclass problems (more than two classes) the prediction file has one column per class, while for binary classification it has a single column. Each row corresponds to one test sample, and each column holds the probability (or simple decision value) that the sample belongs to the corresponding class.
  2. Dimensionality reduction: This method will also output the input data projected into a much lower dimension, typically (N*M-2) elements per sample, where N is the number of classes and M is the requested number of subclasses per class (for example, 10 classes with 2 subclasses each yield 10*2-2 = 18 dimensions). It has been successfully used to reduce 100K-element vectors to a dimension of 2, while still increasing classification accuracy beyond state-of-the-art performance.
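
To make the prediction layout concrete, here is a minimal C++ sketch that turns the prediction .txt into class labels under the assumptions stated above: one row per test sample, one score per class for multiclass problems, and a single score per row for binary problems. The binary threshold of 0 below assumes decision values; if probability outputs are produced, 0.5 would be the natural threshold.

    // Convert the AKSDA prediction file into class labels (illustrative sketch).
    // Assumption: plain text, one row per test sample; multiclass rows hold one
    // probability/decision value per class, binary rows hold a single value.
    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    #include <vector>

    int main(int argc, char** argv) {
        if (argc != 2) {
            std::cerr << "usage: labels <predictions.txt>\n";
            return 1;
        }
        std::ifstream in(argv[1]);
        std::string line;
        std::size_t sample = 0;
        while (std::getline(in, line)) {
            std::istringstream ss(line);
            std::vector<double> scores;
            double v;
            while (ss >> v) scores.push_back(v);
            if (scores.empty()) continue;

            int label = 0;
            if (scores.size() == 1) {
                // Binary case: threshold the single decision value at 0
                // (use 0.5 instead if probabilities were requested).
                label = scores[0] >= 0.0 ? 1 : -1;
            } else {
                // Multiclass case: pick the column with the highest score.
                for (std::size_t c = 1; c < scores.size(); ++c)
                    if (scores[c] > scores[label]) label = static_cast<int>(c);
            }
            std::cout << "sample " << sample++ << " -> class " << label << "\n";
        }
        return 0;
    }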

Main prerequisites

A CUDA-capable Nvidia GPU, with the Nvidia CUDA toolkit and drivers installed (see the “dependencies” folder of the archive).

Version history

Release version (v2.4, released 2016-06-09): Updated dead links in the documentation. Cleaner source code, more comments.

Release version (v2.3, released 2016-04-12): Bug fixes for numerical stability and visual output. Removal of some obsolete messages.

Release version (v2.2, released 2016-03-10): Bug fixes (mostly visual/output), more helper functions available.

Release version (v2.1, released 2016-02-25): Various minor bug fixes.

Release version (v2.0, released 2016-02-12): Multiclass, multikernel capabilities. Unrestricted, full version.

Demo version (v1.0.1, released 2015-11-25): Improved speed and memory consumption. Supports 4 precision options in total. Incorporates the changes of Eigen 3.2.7.

Demo version (v1.0.0, released 2015-10-09)

Downloads

Latest version (v2.4)

Publications

This software is the AKSDA implementation presented in the following paper:

S. Arestis-Chartampilas, N. Gkalelis, V. Mezaris, "AKSDA-MSVM: a GPU-accelerated multiclass learning framework for multimedia", Proc. ACM Multimedia 2016, Amsterdam, The Netherlands, Oct. 2016.

It is an extension of the earlier method presented in:

S. Arestis-Chartampilas, N. Gkalelis, V. Mezaris, "GPU accelerated generalised subclass discriminant analysis for event and concept detection in video", Proc. ACM Multimedia 2015, Brisbane, Australia, Oct. 2015.

If you find this software useful in your research work, please cite the above paper(s).

License

Every individual piece of software or source code used to build this software is covered by its own license. See below for more information.

CUDA has its own license (http://docs.nvidia.com/cuda/eula/index.html#nvidia-cuda-toolkit-license-agreement), which you may have already agreed to by installing the toolkit and drivers in the “dependencies” folder.

Eigen is licensed under the MPL2, with some features under different licensing (http://eigen.tuxfamily.org/index.php?title=Main_Page#License). Eigen’s license requires that we clearly provide a way to access its source code, which can be obtained freely at http://eigen.tuxfamily.org/index.php?title=Main_Page .

LIBSVM (http://www.csie.ntu.edu.tw/~cjlin/libsvm/COPYRIGHT)

LIBLINEAR (http://www.csie.ntu.edu.tw/~cjlin/liblinear/COPYRIGHT)

OpenCV (3-clause BSD license) (http://opencv.org/license.html)

VLFeat (BSD license) (http://www.vlfeat.org/license.html)

Downloading, copying, installing or using any part of our software implies that you have read, understood, and agreed to every license presented in this document (each applying to its own product). If the links are inaccessible, a separate folder named “licenses” is included in aksda.rar, containing copies of every one of them.


Our software (AKSDA) is covered by the following license: Copyright (C) 2015 Stavros Arestis-Chartampilas, Nikolaos Gkalelis, Vasileios Mezaris / CERTH-ITI. All Rights Reserved.

The AKSDA software, including the provided executables, source code, documentation and all other materials (excluding third-party software, for which separate licenses apply, as specified above and on the respective web pages), is provided only for academic, non-commercial use. Commercial use is prohibited. For information on non-academic use possibilities, please contact bmezaris@iti.gr.

This software is provided by the authors "as is" and any express or implied warranties, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose are disclaimed. In no event shall the authors be liable for any direct, indirect, incidental, special, exemplary, or consequential damages (including, but not limited to, procurement of substitute goods or services; loss of use, data, or profits; or business interruption) however caused and on any theory of liability, whether in contract, strict liability, or tort (including negligence or otherwise) arising in any way out of the use of this software, even if advised of the possibility of such damage. No warranty or representation of any kind is made, given or implied as to the absence of any infringement of any proprietary rights of third parties.

Acknowledgements

This work was supported by the EC under contracts FP7-600826 ForgetIT and H2020-687786 InVID, and by Nvidia Corporation with the donation of a Tesla K40 GPU.

Contacts

You may contact Vasileios Mezaris by sending an e-mail to bmezaris@iti.gr for any question or remark you may have with respect to this tool.