Multiple regression results help

For my first ever research paper, I've run a hierarchical multiple linear regression with two predictors and one outcome variable, but I don't understand my results. Predictor A is a significant predictor of my outcome variable on its own. However, when both predictors are in the model, predictor A is no longer significant; only predictor B is. How can this be if predictor A was significant in the first model? How does predictor B change the significance of predictor A?

Thank you!










      multiple-regression mlr






asked 10 hours ago by ummmm (new contributor)
3 Answers


















Regression coefficients reflect the simultaneous effects of multiple predictors. If the two predictors are interdependent (i.e., correlated), the results can differ from those of single-predictor models.
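To make this concrete, here is a small simulated sketch (assumed data, not the asker's; plain numpy rather than any particular stats package). The outcome depends only on B, while A is merely correlated with B: alone, A looks strongly significant, but next to B its t-statistic collapses.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# B drives the outcome; A is only correlated with B.
x_b = rng.normal(size=n)
x_a = 0.8 * x_b + 0.6 * rng.normal(size=n)
y = 2.0 * x_b + rng.normal(size=n)

def t_stats(X, y):
    """OLS t-statistics (intercept included) via least squares."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    sigma2 = resid @ resid / dof
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))
    return beta / se

t_alone = t_stats(x_a[:, None], y)                 # model: y ~ A
t_both = t_stats(np.column_stack([x_a, x_b]), y)   # model: y ~ A + B

print(f"A alone:     t = {t_alone[1]:.1f}")   # large |t|: "significant"
print(f"A next to B: t = {t_both[1]:.1f}")    # shrinks toward 0
print(f"B next to A: t = {t_both[2]:.1f}")    # large |t|
```

The point is not the exact numbers but the pattern: A's apparent effect in the single-predictor model is borrowed from its correlation with B.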






answered 8 hours ago by IrishStat




















            The tests in multiple regression are "added last" tests. That means they test whether the model significantly improves after including the extra variable in a regression that contains all other predictors.



            In your model with no predictors, adding A improves the model, so the test of A is significant in the model with only A.



            In a model with A already in the model, adding B improves the model, so the test of B is significant in the model with A and B. But in a model with B already in the model, adding A doesn't improve the model, so the test of A is not significant in the model with A and B. B is doing all the work that A would do, so adding A doesn't improve the model beyond B.



            As @IrishStat mentioned, this can occur when A and B are correlated (positively or negatively) with each other. It's a fairly common occurrence in regression modeling. The conclusion you might draw is that A predicts the outcome when B is not in the model (i.e., unavailable), but after including B, A doesn't do much more to predict the outcome. Unfortunately, without more information about the causal structure of your variables, there is little more interpretation available.
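The "added last" logic can be sketched numerically (simulated data, numpy only, no stats package assumed): compare the residual sum of squares with and without each predictor, given that the other is already in the model. The partial F for adding B to A is large; the partial F for adding A to B is not.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
x_b = rng.normal(size=n)
x_a = 0.9 * x_b + 0.4 * rng.normal(size=n)   # A is a noisy copy of B
y = 1.5 * x_b + rng.normal(size=n)

def rss(cols, y):
    """Residual sum of squares of an OLS fit with intercept."""
    X = np.column_stack([np.ones(len(y)), *cols])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

def added_last_F(rss_small, rss_full, dof_full):
    # Partial F statistic for adding one predictor to the smaller model.
    return (rss_small - rss_full) / (rss_full / dof_full)

rss_b = rss([x_b], y)          # model with B only
rss_a = rss([x_a], y)          # model with A only
rss_ab = rss([x_a, x_b], y)    # full model
dof = n - 3

f_add_a = added_last_F(rss_b, rss_ab, dof)   # does A improve on B?
f_add_b = added_last_F(rss_a, rss_ab, dof)   # does B improve on A?
print(f"F for adding A to B: {f_add_a:.2f}")   # near the null range
print(f"F for adding B to A: {f_add_b:.2f}")   # large
```

This is exactly the asymmetry described above: B does all the work A would do, so A "added last" contributes little.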






answered 7 hours ago by Noah




















To expand a little on @Noah's and @IrishStat's answers: in a multiple regression, the coefficient for each independent variable/predictor is estimated using only the variation unique to that variable and that unique variation's correlation with the outcome, not the variation shared with the other predictors. (In technical terms, this is about the variances and covariances of these variables.) The less unique variation a predictor has, the less significant its estimate will be.



              So why, in your example, did you end up with an insignificant predictor A when B was added, and not with a significant predictor A and insignificant predictor B? It is likely because the proportion of variance of predictor A that it has in common with predictor B is larger than the proportion of variance of predictor B that it has in common with predictor A.
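One way to see "uses only the unique variation" concretely is the Frisch–Waugh–Lovell theorem: the coefficient of A in the full model equals the slope from regressing the outcome on the part of A left over after B is removed. A minimal sketch on simulated (assumed) data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
x_b = rng.normal(size=n)
x_a = 0.7 * x_b + 0.7 * rng.normal(size=n)
y = 1.0 * x_a + 2.0 * x_b + rng.normal(size=n)

def ols(cols, y):
    """OLS coefficients (intercept first) via least squares."""
    X = np.column_stack([np.ones(len(y))] + list(cols))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Coefficient of A in the full model y ~ A + B.
beta_full = ols([x_a, x_b], y)[1]

# Residualize A on B, keeping only A's unique variation,
# then regress y on that residual.
fitted_a = np.column_stack([np.ones(n), x_b]) @ ols([x_b], x_a)
resid_a = x_a - fitted_a
beta_fwl = ols([resid_a], y)[1]

print(beta_full, beta_fwl)  # numerically identical
```

The two slopes agree to machine precision, which is why a predictor with little unique variation left over gets a noisy, nonsignificant coefficient.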






answered 32 mins ago by AlexK












