How to reduce predictors the right way for a logistic regression model
I have been reading some books (or parts of them) on modeling (F. Harrell's "Regression Modeling Strategies", among others), since I currently need to build a logistic model based on binary response data. My data set contains continuous, categorical, and binary predictors. I have around 100 predictors right now, which is obviously far too many for a good model. Many of these predictors are also related to each other, since they are often based on the same metric, just in slightly different ways.
From what I have been reading, univariate regression and step-wise techniques are among the worst things you can do to reduce the number of predictors. The LASSO seems quite okay (if I understood it correctly), but surely you can't just use that on 100 predictors and expect any good to come of it?
So what are my options here? Do I really just have to sit down with my supervisors and the smart people at work and think hard about what the top 5 predictors could/should be (we might be wrong), or which approach(es) should I consider instead?
And yes, I also know that this topic is heavily discussed (online and in books), but it sometimes seems a bit overwhelming when you are new to this modeling field.
Tags: logistic, predictive-models, modeling, predictor
asked 1 hour ago by Denver Dang; edited 1 hour ago by Ben Bolker
2 Answers
+1 for "sometimes seems a bit overwhelming". It really depends (as Harrell clearly states; see the section at the end of Chapter 4) whether you want to do:

- confirmatory analysis ($\to$ reduce your predictor complexity to a reasonable level without looking at the responses, by PCA or subject-area considerations or ...);
- predictive analysis ($\to$ use appropriate penalization methods; see the sketch after this list). Lasso could very well work OK with 100 predictors, if you have a reasonably large sample. Feature selection will be unstable, but that's OK if all you care about is prediction. I have a personal preference for ridge-like approaches that don't technically "select features" (because they never reduce any parameter to exactly zero), but whatever works ... You'll have to use cross-validation to choose the degree of penalization, which will destroy your ability to do inference (construct confidence intervals on predictions) unless you use cutting-edge high-dimensional inference methods (e.g. Dezeure et al. 2015; I have not tried these approaches, but they seem sensible ...);
- exploratory analysis: have fun, be transparent and honest, don't quote any p-values.

answered 1 hour ago by Ben Bolker
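To make the predictive branch concrete, here is a minimal sketch of a lasso-penalized logistic regression with the penalty strength chosen by cross-validation, using scikit-learn. The synthetic data set, the grid size, and the fold count are illustrative assumptions, not part of the original answer.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the poster's situation: ~100 predictors, binary response.
X, y = make_classification(n_samples=1000, n_features=100, n_informative=10,
                           random_state=0)

# Lasso-penalized logistic regression; the penalty strength is chosen by
# 5-fold cross-validation over 20 candidate values of C (the inverse penalty).
model = make_pipeline(
    StandardScaler(),  # penalization assumes predictors on comparable scales
    LogisticRegressionCV(Cs=20, penalty="l1", solver="saga", cv=5,
                         scoring="neg_log_loss", max_iter=5000, random_state=0),
)
model.fit(X, y)

coef = model.named_steps["logisticregressioncv"].coef_.ravel()
print(f"{np.count_nonzero(coef)} of {coef.size} predictors kept")
```

For the ridge-like alternative the answer prefers (shrinkage without zeroing any coefficient), swap `penalty="l1"` for `penalty="l2"`; in R, `cv.glmnet` offers the same workflow.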
There are many different approaches. What I would recommend is trying some simple ones, in the following order (a combined sketch of the last two follows this list):

- L1 regularization (with an increasing penalty: the larger the regularization coefficient, the more features are eliminated)
- Recursive Feature Elimination (https://scikit-learn.org/stable/modules/feature_selection.html#recursive-feature-elimination) -- removes features incrementally by eliminating the features associated with the smallest model coefficients (assuming those are the least important ones; it is crucial here to normalize the input features first)
- Sequential Feature Selection (http://rasbt.github.io/mlxtend/user_guide/feature_selection/SequentialFeatureSelector/) -- selects features based on how important they are for predictive performance

answered 1 hour ago by resnet (new contributor)
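Here is a minimal sketch of the last two bullets, using scikit-learn's built-in RFE and SequentialFeatureSelector (the linked mlxtend selector works similarly). The synthetic data and the choice to keep 10 features are assumptions for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE, SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=100, n_informative=10,
                           random_state=0)
estimator = LogisticRegression(max_iter=5000)

# Recursive Feature Elimination: refit repeatedly, dropping the 5 features
# with the smallest |coefficient| each round until 10 remain. Scaling first
# matters, otherwise coefficient size reflects units rather than importance.
rfe = make_pipeline(StandardScaler(),
                    RFE(estimator, n_features_to_select=10, step=5))
rfe.fit(X, y)
print("RFE kept:", rfe.named_steps["rfe"].get_support(indices=True))

# Sequential (forward) selection: greedily add whichever feature most
# improves cross-validated performance until 10 features are chosen.
sfs = make_pipeline(StandardScaler(),
                    SequentialFeatureSelector(estimator,
                                              n_features_to_select=10,
                                              direction="forward", cv=5))
sfs.fit(X, y)
print("SFS kept:",
      sfs.named_steps["sequentialfeatureselector"].get_support(indices=True))
```

Note that the two selectors need not agree on the kept set, especially with correlated predictors like those the question describes.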