“Minds are like parachutes - they only function when open.”
- Thomas Dewar
Logistic regression is useful for situations in which you want to be able to predict the presence or absence of a characteristic or outcome based on values of a set of predictor variables. It is similar to a linear regression model but is suited to models where the dependent variable is dichotomous. Logistic regression coefficients can be used to estimate odds ratios for each of the independent variables in the model. Logistic regression is applicable to a broader range of research situations than discriminant analysis.
Logistic Regression provides the following unique features:
Hosmer-Lemeshow test of goodness of fit for the model.
Contrasts to define model parameterization.
Alternative cut points for classification.
Application of a model fitted on one set of cases to a held-out set of cases.
Saving of predictions, residuals, and influence statistics.
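As a concrete illustration outside the SPSS syntax itself, the sketch below fits a binary logistic model by maximum likelihood using plain gradient ascent in numpy. The data, learning rate, and iteration count are invented for the example; the point is that the exponentiated coefficient on a predictor is its odds ratio.

```python
import numpy as np

# Invented toy data: x is a single predictor, y a dichotomous outcome.
x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = np.array([0, 0, 0, 1, 0, 1, 1, 1])

X = np.column_stack([np.ones_like(x), x])   # intercept + predictor
beta = np.zeros(2)

# Maximum likelihood via gradient ascent on the log-likelihood.
for _ in range(20000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))     # predicted probabilities
    beta += 0.05 * X.T @ (y - p)            # score (gradient) step

# The coefficient on x, exponentiated, estimates the odds ratio per unit of x.
odds_ratio = np.exp(beta[1])
```

Here an `odds_ratio` above 1 means each extra unit of x multiplies the odds of the outcome by that factor.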
Multinomial Logistic Regression
Multinomial Logistic Regression is useful for situations in which you want to be able to classify subjects based on values of a set of predictor variables. This type of regression is similar to logistic regression, but it is more general because the dependent variable is not restricted to two categories.
Multinomial Logistic Regression provides the following unique features:
Pearson and deviance chi-square tests for goodness of fit of the model.
Specification of subpopulations for grouping of data for goodness-of-fit tests.
Listing of counts, predicted counts, and residuals by subpopulations.
Correction of variance estimates for over-dispersion.
Covariance matrix of the parameter estimates.
Tests of linear combinations of parameters.
Explicit specification of nested models.
Fitting of 1-1 matched conditional logistic regression models using differenced variables.
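To make the "more than two categories" point concrete, here is a minimal numpy sketch of a multinomial (softmax) logistic fit on invented data with a three-category outcome, again by gradient ascent on the log-likelihood. The dataset, learning rate, and iteration count are assumptions for the illustration.

```python
import numpy as np

# Invented toy data: one predictor, a three-category outcome (0, 1, 2).
x = np.array([-1.0, -0.8, -0.6, -0.2, 0.0, 0.2, 0.6, 0.8, 1.0])
y = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])

X = np.column_stack([np.ones_like(x), x])
K = 3
Y = np.eye(K)[y]                 # one-hot outcome matrix
B = np.zeros((2, K))             # one coefficient column per category

# Maximum likelihood via gradient ascent on the multinomial log-likelihood.
for _ in range(5000):
    Z = X @ B
    Z -= Z.max(axis=1, keepdims=True)                  # numerical stability
    P = np.exp(Z) / np.exp(Z).sum(axis=1, keepdims=True)
    B += 0.1 * X.T @ (Y - P)

pred = P.argmax(axis=1)          # most probable category for each case
```

Each row of `P` is a full probability distribution over the three categories, which is what distinguishes this from the two-category model above.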
Probit Analysis
This procedure measures the relationship between the strength of a stimulus and the proportion of cases exhibiting a certain response to the stimulus. It is useful for situations where you have a dichotomous outcome that is thought to be influenced or caused by levels of some independent variable(s), and it is particularly well suited to experimental data. This procedure allows you to estimate the strength of a stimulus required to induce a certain proportion of responses, such as the median effective dose.
PROBIT Command Additional Features: The command syntax language also allows you to:
Request an analysis on both the probit and logit models.
Control the treatment of missing values.
Transform the covariates using a logarithm base other than base 10 or the natural log.
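The median-effective-dose idea can be sketched in a few lines of Python. The fitted coefficients below are invented for illustration: assuming a probit model on log10(dose), the response probability is the standard normal CDF of the linear predictor, and the ED50 is the dose at which that predictor is zero.

```python
import math

# Illustrative (invented) fitted probit model on log10(dose):
#   probit(p) = a + b * log10(dose), with a = -2.0 and b = 4.0.
a, b = -2.0, 4.0

def response_prob(dose):
    """Expected proportion responding: standard normal CDF of the probit."""
    z = a + b * math.log10(dose)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Median effective dose: the dose at which p = 0.5, i.e. the probit is 0.
ed50 = 10.0 ** (-a / b)
```

At `ed50` the model predicts exactly half of the cases respond; higher doses push the predicted proportion toward 1.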
Nonlinear Regression
Nonlinear regression is a method of finding a nonlinear model of the relationship between the dependent variable and a set of independent variables. Unlike traditional linear regression, which is restricted to estimating linear models, nonlinear regression can estimate models with arbitrary relationships between independent and dependent variables. This is accomplished using iterative estimation algorithms. Note that this procedure is not necessary for simple polynomial models of the form Y = A + BX**2. By defining W = X**2, we get a simple linear model, Y = A + BW, which can be estimated using traditional methods such as the Linear Regression procedure.
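The substitution in that last sentence can be demonstrated directly; the data below are synthetic, generated with A = 3 and B = 2, and ordinary least squares on W = X**2 recovers both constants.

```python
import numpy as np

# Synthetic data from Y = A + B*X**2 with A = 3, B = 2.
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = 3.0 + 2.0 * X**2

W = X**2                                        # the substitution W = X**2
design = np.column_stack([np.ones_like(W), W])  # intercept + W
(A, B), *_ = np.linalg.lstsq(design, Y, rcond=None)
```

No iterative nonlinear algorithm is needed: once W is defined, this is an ordinary linear fit.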
Conditional Logic (Nonlinear Regression): You can specify a segmented model using conditional logic. To use conditional logic within a model expression or a loss function, you form the sum of a series of terms, one for each condition. Each term consists of a logical expression (in parentheses) multiplied by the expression that should result when that logical expression is true.
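The sum-of-terms construction translates directly into array arithmetic, since a logical expression evaluates to 1 when true and 0 when false. Below is a sketch of a two-segment model in that form; the breakpoint and coefficients are invented for the example.

```python
import numpy as np

# A segmented model written as a sum of (condition) * (expression) terms,
# one term per condition, as described above (all numbers invented):
#   yhat = (x < c)*(a1 + b1*x) + (x >= c)*(a2 + b2*x)
def segmented(x, a1, b1, a2, b2, c):
    x = np.asarray(x, dtype=float)
    return (x < c) * (a1 + b1 * x) + (x >= c) * (a2 + b2 * x)

yhat = segmented([0.0, 1.0, 2.0, 3.0], a1=0.0, b1=1.0, a2=4.0, b2=-1.0, c=2.0)
```

Exactly one condition is true for each case, so exactly one term contributes to each prediction.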
Weight Estimation
Standard linear regression models assume that variance is constant within the population under study. When this is not the case (for example, when cases that are high on some attribute show more variability than cases that are low on that attribute), linear regression using ordinary least squares (OLS) no longer provides optimal model estimates. If the differences in variability can be predicted from another variable, the Weight Estimation procedure can compute the coefficients of a linear regression model using weighted least squares (WLS), such that the more precise observations (that is, those with less variability) are given greater weight in determining the regression coefficients. The Weight Estimation procedure tests a range of weight transformations and indicates which will give the best fit to the data.
WLS Command Additional Features: The command syntax language also allows you to:
Provide a single value for the power.
Specify a list of power values, or mix a range of values with a list of values for the power.
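A minimal numpy sketch of the WLS idea, on synthetic heteroscedastic data: the noise standard deviation is proportional to x, so a weight of 1/x**power with power = 2 downweights the noisier cases. The data, seed, and power value are assumptions for the illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 200)
# Synthetic heteroscedastic data: the spread of y grows with x.
y = 2.0 + 3.0 * x + rng.normal(scale=0.5 * x)

# Weight = 1 / x**power; power = 2 matches a std proportional to x.
power = 2.0
w = 1.0 / x**power

X = np.column_stack([np.ones_like(x), x])
# Solve the weighted normal equations (X' W X) beta = X' W y.
beta = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
```

The Weight Estimation procedure's power search amounts to repeating this fit over a grid of `power` values and reporting the one with the best likelihood.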
Two-Stage Least-Squares Regression
Standard linear regression models assume that errors in the dependent variable are uncorrelated with the independent variable(s). When this is not the case (for example, when relationships between variables are bidirectional), linear regression using ordinary least squares (OLS) no longer provides optimal model estimates. Two-stage least-squares regression uses instrumental variables that are uncorrelated with the error terms to compute estimated values of the problematic predictor(s) (the first stage), and then uses those computed values to estimate a linear regression model of the dependent variable (the second stage). Since the computed values are based on variables that are uncorrelated with the errors, the results of the two-stage model are optimal.
2SLS Command Additional Features: The command syntax language also allows you to estimate multiple equations simultaneously.
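The two stages can be sketched with two ordinary least-squares fits on simulated data. Everything below (the data-generating process, instrument strength, and seed) is invented for the example; the true slope is 2.0, naive OLS is biased upward because the predictor is correlated with the error, and the two-stage estimate is not.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
z = rng.normal(size=n)                        # instrument: unrelated to the error
e = rng.normal(size=n)                        # error in the dependent variable
x = 0.8 * z + 0.6 * e + rng.normal(size=n)    # predictor correlated with the error
y = 1.0 + 2.0 * x + e                         # true slope is 2.0

def ols(A, b):
    return np.linalg.lstsq(A, b, rcond=None)[0]

ones = np.ones(n)
# Stage 1: regress the problematic predictor on the instrument.
g0, g1 = ols(np.column_stack([ones, z]), x)
x_hat = g0 + g1 * z
# Stage 2: regress y on the stage-1 fitted values.
beta_2sls = ols(np.column_stack([ones, x_hat]), y)

# For contrast: naive OLS is biased because x and e are correlated.
beta_ols = ols(np.column_stack([ones, x]), y)
```

Because `x_hat` is built only from the instrument, it carries none of the error correlation that contaminates the naive fit.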
Categorical Variable Coding Schemes
In many procedures, you can request automatic replacement of a categorical independent variable with a set of contrast variables, which will then be entered or removed from an equation as a block. You can specify how the set of contrast variables is to be coded, usually on the CONTRAST subcommand. This appendix explains and illustrates how different contrast types requested on CONTRAST actually work.
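As a sketch of what such a replacement produces, here is indicator (dummy) coding for a three-level categorical variable in numpy; the level names are invented, and the last category is taken as the reference, so it gets no column of its own.

```python
import numpy as np

# Indicator (dummy) coding for a three-level categorical predictor, with the
# last category as the reference (level names invented for illustration).
levels = ["low", "medium", "high"]            # "high" is the reference category
cat = np.array(["low", "high", "medium", "low", "high"])

# One contrast column per non-reference level; the set enters as a block.
contrast = np.column_stack(
    [(cat == lev).astype(float) for lev in levels[:-1]]
)
```

Each non-reference coefficient then measures that category's effect relative to the reference category, which is the interpretation the indicator contrast type carries.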