Principal component analysis as a tool for data reduction, with an application
The recent trend of collecting huge datasets poses a great challenge, brought by high dimensionality and aggravated by the presence of irrelevant dimensions. Machine learning models for regression are recognized as a convenient way of improving the estimation of empirical models, and a popular such model is support vector regression (SVR). We suggest using principal component analysis (PCA) as a variable-reduction method alongside SVR: PCA helps build a predictive model that is both simple, in that it contains the smallest number of variables, and efficient. In this paper, we investigate the competence of SVR combined with PCA in order to obtain a more accurate estimation. A simulation study and Renal Failure (RF) data are analyzed with SVR optimized by four different kernel functions (linear, polynomial, radial basis, and sigmoid) using R software (version R x64 3.2.5), comparing the behavior of ε-SVR and ν-SVR models across small, moderate, and large sample sizes (n = 50, 100, and 150). The performance criteria, root mean squared error (RMSE) and the coefficient of determination R², showed the superiority of ε-SVR over ν-SVR, and implementing SVR after applying PCA further improved the results. The simulation results showed that the best-performing kernel function is the linear kernel, while for the real data the best kernels are the linear and radial basis functions. For both ε-SVR and ν-SVR, the RMSE values of almost all kernel functions decreased with increasing sample size, and the performance of ε-SVR improved after applying PCA. In addition, the sample size n = 50 gave good results for the linear and radial kernels.
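The workflow the abstract describes (dimension reduction with PCA, then ε-SVR fitted with several kernels and compared by RMSE and R²) can be sketched as follows. This is a minimal illustration in Python with scikit-learn, not the R implementation used in the study; the synthetic latent-factor data, the number of retained components, and the default ε-SVR hyperparameters are all illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic data with a low-dimensional latent structure, so that PCA
# can meaningfully reduce the 10 observed predictors to a few components.
rng = np.random.default_rng(0)
n, p = 150, 10                                   # one of the paper's sample sizes
Z = rng.normal(size=(n, 3))                      # 3 latent factors
W = rng.normal(size=(3, p))                      # factor loadings
X = Z @ W + 0.1 * rng.normal(size=(n, p))        # observed predictors
y = Z[:, 0] + 0.1 * rng.normal(size=n)           # response driven by factor 1

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit epsilon-SVR (scikit-learn's SVR default) after standardization and PCA,
# for each of the four kernel functions compared in the paper.
results = {}
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    model = make_pipeline(StandardScaler(),
                          PCA(n_components=3),    # keep the leading components
                          SVR(kernel=kernel))
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    results[kernel] = (rmse, r2_score(y_te, pred))

# Report kernels from best to worst by RMSE.
for kernel, (rmse, r2) in sorted(results.items(), key=lambda kv: kv[1][0]):
    print(f"{kernel:8s} RMSE={rmse:.3f}  R2={r2:.3f}")
```

On data with a predominantly linear signal such as this, the linear kernel typically ranks at or near the top, in line with the simulation findings reported in the abstract.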
Copyright (c) 2022 Shereen Hamdy Abdel Latif, Asraa Sadoon Alwan, Amany Mousa Mohamed