My project for STATS-608A in Fall 2018 at the University of Michigan (TeX; updated Dec 12, 2018)
Machine learning time series regressions
Block coordinate descent for group lasso
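Block coordinate descent for the group lasso cycles over the coefficient groups and applies a group soft-thresholding (proximal) update to each block. Below is a minimal NumPy sketch of that idea, assuming non-overlapping groups and the standard penalized least-squares objective; the function and parameter names are illustrative, not taken from any of the listed repositories:

```python
import numpy as np

def group_soft_threshold(v, t):
    # Shrink the group vector v toward zero by t in Euclidean norm.
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

def group_lasso_bcd(X, y, groups, lam, n_iter=200):
    """Block coordinate descent for
        minimize (1/2n) ||y - X beta||^2 + lam * sum_g ||beta_g||_2
    `groups` is a list of index arrays, one per (non-overlapping) group.
    """
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - X @ beta
    # Per-block Lipschitz constants (largest singular value squared / n)
    Ls = [np.linalg.norm(X[:, g], 2) ** 2 / n for g in groups]
    for _ in range(n_iter):
        for g, L in zip(groups, Ls):
            Xg = X[:, g]
            # Gradient of the smooth part with respect to this block
            grad = -Xg.T @ resid / n
            # One proximal gradient step on the block
            z = beta[g] - grad / L
            beta_new = group_soft_threshold(z, lam / L)
            # Update the residual incrementally
            resid += Xg @ (beta[g] - beta_new)
            beta[g] = beta_new
    return beta
```

Usage is `beta = group_lasso_bcd(X, y, groups, lam=0.1)`; larger `lam` zeroes out entire groups at once, which is the defining behavior of the group lasso penalty.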
In this study we explored several approaches to handling high-dimensional data and compared them on simulated and soil datasets. We found that grouping had a significant impact on model accuracy and error reduction. For the core projection step, we first examined the properties of all the algorithms and how they function to com…
Molecular-property prediction with sparsity
A variable-selection procedure for settings with redundancy among explanatory variables, as commonly arises in high-dimensional data
Regularization paths of linear, logistic, Poisson, or Cox models with overlapping grouped covariates
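For overlapping grouped covariates, one standard reduction is the latent group-lasso reformulation: duplicate every shared column so that each group owns its own copy, then solve an ordinary non-overlapping group lasso on the expanded design matrix. A small illustrative sketch of the expansion step (function name is hypothetical):

```python
import numpy as np

def expand_overlapping_groups(X, groups):
    """Reduce overlapping group lasso to the non-overlapping case by
    duplicating any column that appears in multiple groups (the latent
    group-lasso reformulation). Returns the expanded design matrix and
    non-overlapping index arrays into its columns, one per group."""
    cols, new_groups, start = [], [], 0
    for g in groups:
        cols.append(X[:, g])
        new_groups.append(np.arange(start, start + len(g)))
        start += len(g)
    return np.hstack(cols), new_groups
```

After fitting a non-overlapping group lasso on the expanded matrix, the coefficient of an original covariate is recovered as the sum of the coefficients of its copies.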
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Penalized least squares estimation using the Orthogonalizing EM (OEM) algorithm
This is a development version of DMRnet — Delete or Merge Regressors Algorithms for Linear and Logistic Model Selection and High-Dimensional Data.
Programming assignments for Optimization Methods (Convex Optimization), Fall 2023, PKU, taught by Zaiwen Wen (WenZW)
R Package: Adaptively weighted group lasso for semiparametric quantile regression models