Pure Mathematics for Theoretical Computer Science

Journal DOI

https://doi.org/10.54216/PMTCS


ISSN (Online): 2995-3162

Volume 5, Issue 2, PP: 01-10, 2025 | Full Length Article

Employing OSCAR Variable Selection Method in Linear Regression with an Application

Anwer Fawzi Ali 1 *

  • 1 Al-Qadisiya Governorate Education Directorate, Iraq (anwerfawzi49@gmail.com)
  • DOI: https://doi.org/10.54216/PMTCS.050201

    Received: January 23, 2025 Revised: March 27, 2025 Accepted: May 31, 2025
    Abstract

    This study investigates the effectiveness of variable selection techniques in linear regression models under grouped structures and correlation among predictors. Specifically, it evaluates and compares the performance of three prominent methods: LASSO, Elastic Net, and OSCAR. The simulation study spans multiple scenarios, including varying correlation levels and sample sizes, and utilizes key metrics such as Mean Squared Error (MSE), True Positive Rate (TPR), False Positive Rate (FPR), and Grouping Accuracy. The results reveal the superior performance of OSCAR, particularly in grouped settings, where it consistently achieves lower error rates and better variable selection accuracy. A real data application using the prostate cancer dataset further supports the empirical advantages of OSCAR over its counterparts, especially in scenarios involving correlated and grouped predictors. The findings provide strong evidence in favor of OSCAR as a reliable tool for robust regression modeling.
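The simulation protocol the abstract describes can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's actual study: it simulates grouped, correlated predictors, fits a hand-rolled coordinate-descent LASSO (soft-thresholding updates), and computes the TPR, FPR, and MSE metrics the abstract names. The sample size, correlation level, and penalty value are illustrative assumptions; an `oscar_penalty` helper shows the objective term that distinguishes OSCAR (an L1 penalty plus a pairwise L-infinity penalty, per Bondell and Reich, 2008), though fitting OSCAR itself requires a dedicated solver.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 8

# AR(1)-style correlation so adjacent predictors form correlated groups
rho = 0.8
Sigma = np.array([[rho ** abs(i - j) for j in range(p)] for i in range(p)])
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

# Sparse, grouped true coefficients: two active pairs, rest zero
beta_true = np.array([3.0, 3.0, 0.0, 0.0, 2.0, 2.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=1.0, size=n)

def lasso_cd(X, y, lam, n_iter=500):
    """LASSO via cyclic coordinate descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding predictor j
            r_j = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r_j
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_norm2[j]
    return beta

def oscar_penalty(beta, lam1, lam2):
    """OSCAR penalty: lam1 * ||beta||_1 + lam2 * sum_{j<k} max(|b_j|, |b_k|)."""
    b = np.abs(beta)
    pair = sum(max(b[j], b[k]) for j in range(len(b)) for k in range(j + 1, len(b)))
    return lam1 * b.sum() + lam2 * pair

beta_hat = lasso_cd(X, y, lam=20.0)
selected = np.abs(beta_hat) > 1e-6
truth = beta_true != 0
tpr = (selected & truth).sum() / truth.sum()    # true positive rate
fpr = (selected & ~truth).sum() / (~truth).sum()  # false positive rate
mse = np.mean((y - X @ beta_hat) ** 2)
print(f"TPR={tpr:.2f}  FPR={fpr:.2f}  MSE={mse:.2f}")
```

Repeating this over many replications, correlation levels, and sample sizes, and averaging the three metrics, reproduces the shape of the comparison the abstract reports.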

    Keywords:

    LASSO, Elastic Net, OSCAR, Variable Selection, Grouped Predictors


    Cite This Article As:
    Fawzi, Anwer. Employing OSCAR Variable Selection Method in Linear Regression with an Application. Pure Mathematics for Theoretical Computer Science, vol. 5, no. 2, 2025, pp. 01-10. DOI: https://doi.org/10.54216/PMTCS.050201