References

Abu-Mostafa, Y. S., Magdon-Ismail, M., & Lin, H.-T. (2012). Learning from data. AMLBook.
Aler, R., Valls, J. M., & Boström, H. (2020). Study of Hellinger distance as a splitting metric for random forests in balanced and imbalanced classification datasets. Expert Systems with Applications, 149, 113264.
Bergstra, J., & Bengio, Y. (2012). Random search for hyper-parameter optimization. Journal of Machine Learning Research, 13(2), 281–305.
Bishop, C. M. (2006). Pattern recognition and machine learning (information science and statistics). Springer-Verlag.
Chen, T., & Guestrin, C. (2016). XGBoost: A scalable tree boosting system. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 785–794. https://doi.org/10.1145/2939672.2939785
Chicco, D., & Jurman, G. (2020). Machine learning can predict survival of patients with heart failure from serum creatinine and ejection fraction alone. BMC Medical Informatics and Decision Making, 20(16). https://doi.org/10.1186/s12911-020-1023-5
Hastie, T., Tibshirani, R., & Friedman, J. H. (2009). The elements of statistical learning: Data mining, inference, and prediction (Vol. 2). Springer.
James, G., Witten, D., Hastie, T., & Tibshirani, R. (2013). An introduction to statistical learning (Vol. 112). Springer.
Murphy, K. P. (2013). Machine learning: A probabilistic perspective. MIT Press.
Wright, M. N., & Ziegler, A. (2017). ranger: A fast implementation of random forests for high dimensional data in C++ and R. Journal of Statistical Software, 77(1), 1–17. https://doi.org/10.18637/jss.v077.i01
Zou, H., & Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society. Series B (Statistical Methodology), 67(2), 301–320. http://www.jstor.org/stable/3647580