We are pleased to announce the release of Intel® Data Analytics Acceleration Library 2016 Beta!
Intel® Data Analytics Acceleration Library is a C++ and Java API library of optimized analytics building blocks for all data analysis stages, from data acquisition to data mining and machine learning. It is an essential library for engineering high-performance data application solutions.
To join the free Beta program and get instructions on downloading the software, follow the links below:
- Intel® Data Analytics Acceleration Library beta for Linux*
- Intel® Data Analytics Acceleration Library beta for OS X*
- Intel® Data Analytics Acceleration Library beta for Windows*
Visit our User Forum and join the discussions if you have any questions.
This initial Beta release introduces many features, including:
- APIs for the C++ and Java programming languages.
- Data mining and analysis algorithms:
  - Computing correlation distance and cosine distance
  - PCA (Correlation, SVD)
  - Matrix decomposition (SVD, QR, Cholesky)
  - Computing statistical moments
  - Computing variance-covariance matrices
  - Univariate and multivariate outlier detection
  - Association rule mining
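To make the "statistical moments" building block concrete, here is a minimal sketch in plain Java of computing the mean and sample variance of a single variable. This is illustrative only; it does not use the actual Intel DAAL API, which provides optimized implementations of these computations.

```java
// Illustrative sketch only -- not the Intel DAAL API.
// Computes low-order statistical moments (mean and sample variance).
public class Moments {
    public static double mean(double[] x) {
        double sum = 0.0;
        for (double v : x) sum += v;
        return sum / x.length;
    }

    public static double variance(double[] x) {
        double m = mean(x);
        double ss = 0.0;
        for (double v : x) ss += (v - m) * (v - m);
        return ss / (x.length - 1);  // sample (n-1) variance
    }

    public static void main(String[] args) {
        double[] data = {2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0};
        System.out.println("mean=" + mean(data) + " variance=" + variance(data));
    }
}
```

The library computes such moments over large data sets in batch, streaming, and distributed processing modes; this sketch shows only the batch case on in-memory data.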
- Algorithms for supervised and unsupervised machine learning:
  - Linear regression
  - Naïve Bayes classifier
  - AdaBoost, LogitBoost, and BrownBoost classifiers
  - SVM
  - K-Means clustering
  - Expectation Maximization (EM) for Gaussian Mixture Models (GMM)
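As an example of what K-Means clustering does, the following is a minimal one-dimensional Lloyd's-algorithm sketch in plain Java. It is purely illustrative of the algorithm; the class and method names are hypothetical, and the library's own K-Means API is optimized and multidimensional.

```java
import java.util.Arrays;

// Illustrative sketch only -- not the Intel DAAL API.
// Minimal 1-D K-Means (Lloyd's algorithm): alternate between assigning
// points to their nearest centroid and moving centroids to cluster means.
public class KMeans1D {
    public static double[] fit(double[] points, double[] initCentroids, int iters) {
        double[] centroids = initCentroids.clone();
        int k = centroids.length;
        for (int it = 0; it < iters; it++) {
            double[] sum = new double[k];
            int[] count = new int[k];
            // Assignment step: attach each point to its nearest centroid.
            for (double p : points) {
                int best = 0;
                for (int j = 1; j < k; j++)
                    if (Math.abs(p - centroids[j]) < Math.abs(p - centroids[best])) best = j;
                sum[best] += p;
                count[best]++;
            }
            // Update step: move each centroid to the mean of its cluster.
            for (int j = 0; j < k; j++)
                if (count[j] > 0) centroids[j] = sum[j] / count[j];
        }
        return centroids;
    }

    public static void main(String[] args) {
        double[] data = {1.0, 1.2, 0.8, 8.0, 8.2, 7.8};
        double[] result = fit(data, new double[]{0.0, 10.0}, 10);
        System.out.println(Arrays.toString(result));  // centers near 1.0 and 8.0
    }
}
```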
- Support for local and distributed data sources:
  - In-file and in-memory CSV
  - MySQL
  - HDFS
- Support for Resilient Distributed Dataset (RDD) objects for Apache Spark*.
- Data compression and decompression:
  - ZLIB
  - LZO
  - RLE
  - BZIP2
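For context on the ZLIB item, a compression/decompression round trip can be sketched with the JDK's standard `java.util.zip` classes. This is not the Intel DAAL compression API; it only illustrates the kind of data-preparation stage the library accelerates.

```java
import java.io.ByteArrayOutputStream;
import java.util.zip.DataFormatException;
import java.util.zip.Deflater;
import java.util.zip.Inflater;

// Illustrative sketch only -- not the Intel DAAL compression API.
// ZLIB round trip using the JDK's Deflater/Inflater.
public class ZlibRoundTrip {
    public static byte[] compress(byte[] input) {
        Deflater deflater = new Deflater();
        deflater.setInput(input);
        deflater.finish();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        while (!deflater.finished()) {
            out.write(buf, 0, deflater.deflate(buf));
        }
        deflater.end();
        return out.toByteArray();
    }

    public static byte[] decompress(byte[] input) throws DataFormatException {
        Inflater inflater = new Inflater();
        inflater.setInput(input);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[1024];
        while (!inflater.finished()) {
            out.write(buf, 0, inflater.inflate(buf));
        }
        inflater.end();
        return out.toByteArray();
    }

    public static void main(String[] args) throws DataFormatException {
        byte[] original = "example payload, example payload, example payload".getBytes();
        byte[] restored = decompress(compress(original));
        System.out.println(new String(restored));
    }
}
```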
- Data serialization and deserialization.
Supported operating systems are Windows*, Linux*, and OS X*.
Please join other users in discussions about Intel® Data Analytics Acceleration Library on the User Forum.
This program will end in late June 2015. During the program, we will contact you to gather your feedback. Thank you for your interest in Intel® Data Analytics Acceleration Library. Let us know how we can help you!