• Journal of Electronic Science and Technology
  • Vol. 22, Issue 1, 100246 (2024)
Peng Wang1,2,*, Ji Guo1, and Lin-Feng Li3
Author Affiliations
  • 1School of Finance and Economics, Xizang Minzu University, Xianyang, 712082, China
  • 2Research Center for Quality Development of Xizang Special Industries, Xianyang, 712082, China
  • 3School of Computer Science and Technology, Xinjiang University, Urumqi, 830017, China
    DOI: 10.1016/j.jnlest.2024.100246
    Peng Wang, Ji Guo, Lin-Feng Li. Machine learning model based on non-convex penalized huberized-SVM[J]. Journal of Electronic Science and Technology, 2024, 22(1): 100246
    References

    [1] Roy A., Chakraborty S.. Support vector machine in structural reliability analysis: A review. Reliab. Eng. Syst. Safe., 233, 109126:1-12(2023).

    [2] Tanveer M., Rajani T., Rastogi R., Shao Y.-H., Ganaie M.A.. Comprehensive review on twin support vector machines. Ann. Oper. Res., 310, 1-46(2022).

    [3] Peng S.-L., Wang W.-W., Chen Y.-L., Zhong X.-L., Hu Q.-H.. Regression-based hyperparameter learning for support vector machines. IEEE Trans. Neural Netw. Learn. Syst.(2023). DOI: 10.1109/TNNLS.2023.3321685

    [4] Tian Y.-J., Shi Y., Liu X.-H.. Recent advances on support vector machines research. Technol. Econ. Dev. Eco., 18, 5-33(2012).

    [5] Vapnik V.N.. The Nature of Statistical Learning Theory. Springer, New York(1995).

    [6] Evgeniou T., Pontil M., Poggio T.. Regularization networks and support vector machines. Adv. Comput. Math., 13, 1-50(2000).

    [7] Zhu J., Rosset S., Hastie T., Tibshirani R.. 1-norm support vector machines. Proc. of 16th Intl. Conf. on Neural Information Processing Systems, Whistler, 49-56(2003).

    [8] Wegkamp M., Yuan M.. Support vector machines with a reject option. Bernoulli, 17, 1368-1385(2011).

    [9] Fan J.-Q., Li R.-Z.. Variable selection via nonconcave penalized likelihood and its oracle properties. J. Am. Stat. Assoc., 96, 1348-1360(2001).

    [10] Zou H.. The adaptive LASSO and its oracle properties. J. Am. Stat. Assoc., 101, 1418-1429(2006).

    [11] Meinshausen N., Yu B.. LASSO-type recovery of sparse representations for high-dimensional data. Ann. Stat., 37, 246-270(2009).

    [12] Araveeporn A.. The higher-order of adaptive LASSO and elastic net methods for classification on high dimensional data. Mathematics, 9, 1091:1-14(2021).

    [13] Lee J.H., Shi Z.-T., Gao Z.. On LASSO for predictive regression. J. Econometrics, 229, 322-349(2022).

    [14] Zhang C.-H., Huang J.. The sparsity and bias of the LASSO selection in high-dimensional linear regression. Ann. Stat., 36, 1567-1594(2008).

    [15] Kim Y., Choi H., Oh H.-S.. Smoothly clipped absolute deviation on high dimensions. J. Am. Stat. Assoc., 103, 1665-1673(2008).

    [16] Zhang C.-H.. Nearly unbiased variable selection under minimax concave penalty. Ann. Stat., 38, 894-942(2010).

    [17] Zhang X., Wu Y.-C., Wang L., Li R.-Z.. Variable selection for support vector machines in moderately high dimensions. J. R. Stat. Soc. B, 78, 53-76(2016).

    [18] Mangasarian O.L.. A finite Newton method for classification. Optim. Method. Softw., 17, 913-929(2002).

    [19] Rosset S., Zhu J.. Piecewise linear regularized solution paths. Ann. Stat., 35, 1012-1030(2007).

    [20] Yang Y., Zou H.. An efficient algorithm for computing the HHSVM and its generalizations. J. Comput. Graph. Stat., 22, 396-415(2013).

    [21] Yang Y., Zou H.. A fast unified algorithm for solving group-LASSO penalized learning problems. Stat. Comput., 25, 1129-1141(2015).

    [22] Fang K.-N., Wang P., Zhang X.-C., Zhang Q.-Z.. Structured sparse support vector machine with ordered features. J. Appl. Stat., 49, 1105-1120(2022).
