Intel® DAAL Programming Guide

Bibliography

For details on the algorithms implemented in Intel® DAAL, refer to the following publications.

[Agrawal94]

Rakesh Agrawal, Ramakrishnan Srikant. Fast Algorithms for Mining Association Rules. Proceedings of the 20th VLDB Conference, Santiago, Chile, 1994.

[Ben05]

Ben-Gal I. Outlier Detection. In: Maimon O. and Rokach L. (Eds.) Data Mining and Knowledge Discovery Handbook: A Complete Guide for Practitioners and Researchers, Kluwer Academic Publishers, 2005, ISBN 0-387-24435-2.

[Biernacki2003]

C. Biernacki, G. Celeux, and G. Govaert. Choosing Starting Values for the EM Algorithm for Getting the Highest Likelihood in Multivariate Gaussian Mixture Models. Computational Statistics & Data Analysis, 41, 561-575, 2003.

[Billor2000]

Nedret Billor, Ali S. Hadi, and Paul F. Velleman. BACON: blocked adaptive computationally efficient outlier nominators. Computational Statistics & Data Analysis, 34, 279-298, 2000.

[Boser92]

B. E. Boser, I. Guyon, and V. Vapnik. A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory, pp: 144–152, ACM Press, 1992.

[bzip2]

http://www.bzip.org/

[Dempster77]

A. P. Dempster, N. M. Laird, and D. B. Rubin. Maximum Likelihood from Incomplete Data via the EM Algorithm. J. Royal Statist. Soc. Ser. B, 39, 1977.

[Fan05]

Rong-En Fan, Pai-Hsuen Chen, Chih-Jen Lin. Working Set Selection Using Second Order Information for Training Support Vector Machines. Journal of Machine Learning Research 6 (2005), pp: 1889–1918.

[Freund99]

Yoav Freund, Robert E. Schapire. A Short Introduction to Boosting. Journal of Japanese Society for Artificial Intelligence, 14(5), 771-780, 1999.

[Freund01]

Yoav Freund. An adaptive version of the boost by majority algorithm. Machine Learning, 43, pp: 293-318, 2001.

[Friedman00]

Jerome Friedman, Trevor Hastie, and Robert Tibshirani. Additive Logistic regression: a statistical view of boosting. The Annals of Statistics, 28(2), pp: 337-407, 2000.

[Hastie2009]

Trevor Hastie, Robert Tibshirani, Jerome Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Second Edition (Springer Series in Statistics), Springer, 2009. Corr. 7th printing 2013 edition (2011/12/23).

[Hsu02]

Chih-Wei Hsu and Chih-Jen Lin. A Comparison of Methods for Multiclass Support Vector Machines. IEEE Transactions on Neural Networks, Vol. 13, No. 2, pp: 415-425, 2002.

[Iba92]

Wayne Iba, Pat Langley. Induction of One-Level Decision Trees. Proceedings of the Ninth International Conference on Machine Learning, pp: 233-240, 1992.

[Joachims99]

Thorsten Joachims. Making Large-Scale SVM Learning Practical. Advances in Kernel Methods - Support Vector Learning, B. Schölkopf, C. Burges, and A. Smola (Eds.), pp: 169-184, MIT Press, Cambridge, MA, USA, 1999.

[Lloyd82]

Stuart P. Lloyd. Least squares quantization in PCM. IEEE Transactions on Information Theory, 28(2), pp: 129–137, 1982.

[lzo]

http://www.oberhumer.com/opensource/lzo/

[Renie03]

Jason D. M. Rennie, Lawrence Shih, Jaime Teevan, David R. Karger. Tackling the Poor Assumptions of Naïve Bayes Text Classifiers. Proceedings of the Twentieth International Conference on Machine Learning (ICML-2003), Washington DC, 2003.

[rle]

http://data-compression.info/Algorithms/RLE/index.htm

[Sokolova09]

Marina Sokolova, Guy Lapalme. A systematic analysis of performance measures for classification tasks. Information Processing and Management 45 (2009), pp: 427–437. Download URL: http://atour.iro.umontreal.ca/rali/sites/default/files/publis/SokolovaLapalme-JIPM09.pdf

[West79]

D. H. D. West. Updating Mean and Variance Estimates: An Improved Method. Communications of the ACM, 22(9), pp: 532-535, 1979.

[Wu04]

Ting-Fan Wu, Chih-Jen Lin, Ruby C. Weng. Probability Estimates for Multi-class Classification by Pairwise Coupling. Journal of Machine Learning Research 5, pp: 975-1005, 2004.

[zLib]

http://www.zlib.net/