Monday, July 29, 2019

Deciphering Interactions in Logistic Regression

Some Definitions

Odds are ratios: odds = p / (1 - p), where p is the probability that the outcome occurs.

The natural log of the odds is known as a logit: logit = ln[p / (1 - p)].

Odds ratios are actually ratios of ratios: OR = odds1 / odds2 = [p1 / (1 - p1)] / [p2 / (1 - p2)].

Computing an odds ratio from a logistic regression coefficient b: OR = exp(b).

Computing a probability from logistic regression coefficients: p = exp(Xb) / (1 + exp(Xb)), where Xb is the linear predictor.

About Logistic Regression

Logistic regression fits a maximum likelihood logit model. The model estimates conditional means in terms of logits (log odds). The logit model is a linear model in the log odds metric. Logistic regression results can be displayed as odds ratios or as probabilities; probabilities are a nonlinear transformation of the log odds results.

In general, linear models have a number of advantages over nonlinear models and are easier to work with. For example, in linear models the slopes and/or differences in means do not change for differing values of a covariate. This is not necessarily the case for nonlinear models.

The problem in logistic regression is that, even though the model is linear in log odds, many researchers feel that log odds are not a natural metric and are not easily interpreted. Probability is a much more natural metric. However, the logit model is not linear when working in the probability metric, so the predicted probabilities change as the values of a covariate change. In fact, the estimated probabilities depend on all of the variables in the model, not just the variables in the interaction.

So what is a linear model? A linear model is linear in the betas (coefficients). By extension, a nonlinear model must be nonlinear in the betas. Below are three examples of linear and nonlinear models. The first is an example of a linear model and its graph.
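The graph itself is not reproduced in this post, but a minimal sketch of what is meant (the coefficients and variables here are hypothetical) is a model such as

y = b0 + b1*x1 + b2*x2

which is linear because it is linear in the coefficients b0, b1, and b2. By the same criterion, y = b0 + b1*x + b2*x^2 is still a linear model, while y = exp(b0 + b1*x) is not, because the coefficients no longer enter linearly.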
Next we have an example of a nonlinear model and its graph; in this case it is an exponential growth model. Lastly we have another nonlinear model, one that shows the nonlinear transformation of log odds to probabilities.

Logistic Regression Transformations

This is an attempt to show the different types of transformations that can occur with logistic regression models. Logistic interactions are a complex concept. Common wisdom suggests that interactions involve exploring differences in differences: if the differences are not different, then there is no interaction. But in logistic regression, interaction is a more complex concept. Researchers need to decide how to conceptualize the interaction. Is the interaction to be conceptualized in terms of log odds (logits), odds ratios, or probability? This decision can make a big difference: an interaction that is significant in log odds may not be significant in terms of differences in differences for probability, or vice versa.

Model 1: categorical by categorical interaction

Log odds metric: categorical by categorical interaction. Variables f and h are binary predictors, while cv1 is a continuous covariate (a sketch of the commands being described appears at the end of this post). The nolog option suppresses the display of the iteration log; it is used here simply to minimize the quantity of output. The interaction term is clearly significant. We could manually compute the expected logits for each of the four cells in the model. We can also use a cell-means model to obtain the expected logits for each cell when cv1 = 0. The nocons option is used to omit the constant term; because the constant is not included in the calculations, a coefficient for the reference group is calculated. And here is what the expected logits look like in a 2×2 table.
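The numeric values from the original output are not reproduced here; written in terms of the factorial model's coefficients at cv1 = 0, where _cons is the constant and b_f, b_h, and b_fh are the coefficients of f, h, and the f#h interaction, the table has the following form:

           h = 0                  h = 1
f = 0      _cons                  _cons + b_h
f = 1      _cons + b_f            _cons + b_f + b_h + b_fh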

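For reference, the kind of Stata commands being described in Model 1 might look like the following sketch. The outcome variable name y is a hypothetical placeholder; f, h, cv1 and the nolog and nocons options follow the text.

* sketch only; y is a hypothetical name for the binary outcome
* factorial model: main effects of f and h, their interaction, and the covariate cv1
logit y i.f##i.h cv1, nolog

* expected logit for the f = 1, h = 1 cell when cv1 = 0, computed from the coefficients
display _b[_cons] + _b[1.f] + _b[1.h] + _b[1.f#1.h]

* cell-means parameterization: one coefficient per f-by-h cell, constant omitted
logit y ibn.f#ibn.h cv1, nocons nolog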