Subsampling Strategies for Bayesian Variable Selection and Model Averaging in GLM and BGNLM
Abstract: Bayesian Generalized Nonlinear Models (BGNLM) offer a flexible alternative to Generalized Linear Models (GLM) while still providing better interpretability than machine learning techniques such as neural networks. In BGNLM, the methods of Bayesian Variable Selection and Model Averaging are applied in an extended GLM setting. Models are fitted to data using MCMC within a genetic framework, in an algorithm called GMJMCMC. In this thesis, we present a new implementation of the algorithm as a package in the programming language R. We also present a novel algorithm, called S-IRLS-SGD, for estimating the MLE of a GLM by subsampling the data. Finally, we present theory for combining the novel algorithm with GMJMCMC/MJMCMC/MCMC, together with a number of experiments demonstrating the performance of the contributed algorithm.