Publication · Preprint · 2017

Parameter Selection Algorithm For Continuous Variables

Tavallali, Peyman; Razavi, Marianne; Brady, Sean
Open Access English
  • Published: 19 Jan 2017
In this article, we propose a new algorithm for supervised learning methods that both captures non-linearity in the data and finds the best subset model. To produce an enhanced subset of the original variables, an ideal selection method should add a supplementary level of regression analysis that captures complex relationships in the data through mathematical transformations of the predictors and exploration of synergistic effects of combined variables. The method presented here has the potential to produce an optimal subset of variables, making the overall model-selection process more efficient. The cor...
free text keywords: Statistics - Applications, Statistics - Methodology, Statistics - Machine Learning
Funded by
  • Funder: National Institutes of Health (NIH)
  • Project Code: N01HC025195-005

3.1. Values for ς. The independence limit ς can be characterized through the VIF concept. As mentioned, VIF_j = C_jj; however, this formula can also be written as
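The identity VIF_j = C_jj, where C is the inverse of the predictors' correlation matrix, can be sketched numerically. This is a minimal illustration of the standard VIF definition, not the paper's implementation; the function name `vif` is our own.

```python
import numpy as np

def vif(X):
    """Variance inflation factors of the columns of X.

    VIF_j = C_jj, where C is the inverse of the correlation
    matrix of the predictors. VIF_j = 1 indicates a predictor
    uncorrelated with the others; large values signal
    multicollinearity.
    """
    R = np.corrcoef(X, rowvar=False)  # correlation matrix of the columns
    C = np.linalg.inv(R)
    return np.diag(C)
```

For roughly independent columns each VIF stays near 1; duplicating a column (up to small noise) drives its VIF up sharply, which is the behaviour the independence limit ς is meant to bound.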
