Volume 10, Issue 1, PP: 31-38, 2023 | Full Length Article
Shaymaa Riyadh Thanoon
DOI: https://doi.org/10.54216/GJMSA.0100104
This research provides a conceptual framework and examples for applying Bayesian techniques to binary and vector data. For binary data, where observations take on one of two possible values, Bayesian logistic regression and Bayesian networks are applicable techniques: Bayesian logistic regression places priors on the coefficients and derives the posterior from the likelihood under a logistic model, while Bayesian networks represent dependencies between binary variables graphically and perform inference using conditional probability tables. For vector data, where observations are multi-dimensional, Bayesian linear regression places priors on the regression coefficients and obtains the posterior from the likelihood under a linear model, and Gaussian process regression models the relationship between inputs and outputs as a draw from a Gaussian process prior and computes the posterior process given the observed data. The research presents the conceptual framework underlying Bayesian analysis, including key concepts such as prior and posterior distributions, and highlights the advantages of Bayesian methods, such as the ability to incorporate domain knowledge and to model uncertainty. Numerical examples demonstrate how Bayesian techniques can be applied to binary and vector data classification tasks. The abstract summarizes the core ideas and contributions of the research on this topic.
Keywords: Binary data, Gaussian process, Logistic regression, Vector data
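As a concrete illustration of the binary-data setting described in the abstract, the sketch below shows Bayesian logistic regression in the sense outlined there: a Gaussian prior is placed on the coefficients, the likelihood follows a logistic model, and the posterior is explored with a simple random-walk Metropolis sampler. This is a minimal sketch, not code from the paper; the simulated data, the N(0, 10) prior variance, the proposal step size, and the chain length are all assumptions made for the example.

import numpy as np

# Hypothetical toy binary-classification data: two features,
# labels generated from a logistic model (assumed for illustration).
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -1.0])
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = rng.binomial(1, p)

def log_posterior(w, X, y, prior_var=10.0):
    # Log posterior up to a constant: Bernoulli likelihood under a logistic
    # model plus an independent N(0, prior_var) prior on each coefficient.
    logits = X @ w
    log_lik = np.sum(y * logits - np.log1p(np.exp(logits)))
    log_prior = -0.5 * np.sum(w ** 2) / prior_var
    return log_lik + log_prior

# Random-walk Metropolis sampler over the coefficient vector.
n_samples, step = 5000, 0.2
w = np.zeros(2)
samples = np.empty((n_samples, 2))
current_lp = log_posterior(w, X, y)
for i in range(n_samples):
    proposal = w + step * rng.normal(size=2)
    proposal_lp = log_posterior(proposal, X, y)
    if np.log(rng.uniform()) < proposal_lp - current_lp:
        w, current_lp = proposal, proposal_lp
    samples[i] = w

# Discard burn-in and summarize the posterior over the coefficients.
burn = 1000
print("posterior mean of coefficients:", samples[burn:].mean(axis=0))
print("posterior std of coefficients: ", samples[burn:].std(axis=0))

The posterior mean and standard deviation reported at the end play the role of the point estimate and uncertainty quantification discussed in the abstract; with a conjugate Gaussian prior and a linear (rather than logistic) likelihood, the analogous posterior for the vector-data case is available in closed form without sampling.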