To understand the general form of the score statistics, let $\mathbf{U}(\boldsymbol{\theta})$ be the vector of first partial derivatives of the log likelihood with respect to the parameter vector $\boldsymbol{\theta}$, and let $\mathbf{H}(\boldsymbol{\theta})$ be the matrix of second partial derivatives of the log likelihood with respect to $\boldsymbol{\theta}$. That is, $\mathbf{U}(\boldsymbol{\theta})$ is the gradient vector, and $\mathbf{H}(\boldsymbol{\theta})$ is the Hessian matrix. Let $\mathbf{I}(\boldsymbol{\theta})$ be either $-\mathbf{H}(\boldsymbol{\theta})$ or the expected value of $-\mathbf{H}(\boldsymbol{\theta})$. Consider a null hypothesis $H_0$. Let $\hat{\boldsymbol{\theta}}_{H_0}$ be the MLE of $\boldsymbol{\theta}$ under $H_0$. The chi-square score statistic for testing $H_0$ is defined by

\[ \mathbf{U}'(\hat{\boldsymbol{\theta}}_{H_0})\,\mathbf{I}^{-1}(\hat{\boldsymbol{\theta}}_{H_0})\,\mathbf{U}(\hat{\boldsymbol{\theta}}_{H_0}) \]

and it has an asymptotic $\chi^2$ distribution with r degrees of freedom under $H_0$, where r is the number of restrictions imposed on $\boldsymbol{\theta}$ by $H_0$.
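Numerically, the statistic is just a quadratic form in the gradient. The following PROC IML sketch uses made-up values for the gradient and the information matrix, for a toy null hypothesis that completely fixes a two-dimensional parameter vector (so r = 2); it illustrates only the arithmetic and is not how PROC LOGISTIC implements the test.

   proc iml;
      /* Illustrative values only: gradient U and information matrix Imat,
         both evaluated at the MLE under the null hypothesis */
      U    = {0.8, -1.2};
      Imat = {4.0 0.5,
              0.5 3.0};
      score = U` * inv(Imat) * U;    /* chi-square score statistic U' I^{-1} U */
      r = 2;                         /* number of restrictions imposed by H0 */
      p = 1 - probchi(score, r);     /* asymptotic p-value */
      print score r p;
   quit;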
When you use SELECTION=FORWARD, BACKWARD, or STEPWISE, the procedure calculates a residual chi-square score statistic and reports the statistic, its degrees of freedom, and the p-value. This section describes how the statistic is calculated.
Suppose there are s explanatory effects of interest. The full cumulative response model has a parameter vector

\[ \boldsymbol{\theta} = (\alpha_1, \ldots, \alpha_k, \beta_1, \ldots, \beta_s)' \]

where $\alpha_1, \ldots, \alpha_k$ are intercept parameters, and $\beta_1, \ldots, \beta_s$ are the common slope parameters for the s explanatory effects. The full generalized logit model has a parameter vector

\[ \boldsymbol{\theta} = (\alpha_1, \ldots, \alpha_k, \boldsymbol{\beta}_1', \ldots, \boldsymbol{\beta}_k')' \quad \text{with} \quad \boldsymbol{\beta}_i = (\beta_{i1}, \ldots, \beta_{is})', \quad i = 1, \ldots, k \]

where $\beta_{ij}$ is the slope parameter for the jth effect in the ith logit.
Consider the null hypothesis $H_0\colon \beta_{t+1} = \cdots = \beta_s = 0$, where $t < s$, for the cumulative response model, and $H_0\colon \beta_{i,t+1} = \cdots = \beta_{is} = 0$, $t < s$, $i = 1, \ldots, k$, for the generalized logit model. For the reduced model with t explanatory effects, let $\hat{\alpha}_1, \ldots, \hat{\alpha}_k$ be the MLEs of the unknown intercept parameters, let $\hat{\beta}_1, \ldots, \hat{\beta}_t$ be the MLEs of the unknown slope parameters, and let $\hat{\beta}_{i1}, \ldots, \hat{\beta}_{it}$, $i = 1, \ldots, k$, be those for the generalized logit model. The residual chi-square is the chi-square score statistic testing the null hypothesis $H_0$; that is, the residual chi-square is

\[ \mathbf{U}'(\hat{\boldsymbol{\theta}}_0)\,\mathbf{I}^{-1}(\hat{\boldsymbol{\theta}}_0)\,\mathbf{U}(\hat{\boldsymbol{\theta}}_0) \]

where for the cumulative response model

\[ \hat{\boldsymbol{\theta}}_0 = (\hat{\alpha}_1, \ldots, \hat{\alpha}_k, \hat{\beta}_1, \ldots, \hat{\beta}_t, 0, \ldots, 0)' \]

and for the generalized logit model

\[ \hat{\boldsymbol{\theta}}_0 = (\hat{\alpha}_1, \ldots, \hat{\alpha}_k, \hat{\beta}_{11}, \ldots, \hat{\beta}_{1t}, \mathbf{0}_{(s-t)}', \ldots, \hat{\beta}_{k1}, \ldots, \hat{\beta}_{kt}, \mathbf{0}_{(s-t)}')' \]

where $\mathbf{0}_{(s-t)}$ denotes a vector of $s - t$ zeros.
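As a small illustration for the cumulative response model, suppose there are s = 3 candidate effects and the reduced model contains only the first one (t = 1). Then the restricted estimate is

\[ \hat{\boldsymbol{\theta}}_0 = (\hat{\alpha}_1, \ldots, \hat{\alpha}_k, \hat{\beta}_1, 0, 0)' \]

and the residual chi-square tests whether the two excluded slope parameters are jointly zero.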
The residual chi-square has an asymptotic chi-square distribution with $s - t$ degrees of freedom ($k(s - t)$ for the generalized logit model). A special case is the global score chi-square, where the reduced model consists of the k intercepts and no explanatory effects. The global score statistic is displayed in the "Testing Global Null Hypothesis: BETA=0" table. When the NOFIT option is specified, this table is not produced, but the global score statistic is still displayed.
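For example, a forward selection run of the following form (the data set and variable names are hypothetical) reports the residual chi-square, its degrees of freedom, and the p-value at each step, along with the global score statistic:

   proc logistic data=mydata;
      model y = x1 x2 x3 x4 / selection=forward;
   run;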
These tests are performed when you specify SELECTION=FORWARD or STEPWISE, and are displayed when the DETAILS option is specified. In the displayed output, the tests are labeled "Score Chi-Square" in the "Analysis of Effects Eligible for Entry" table and in the "Summary of Stepwise (Forward) Selection" table. This section describes how the tests are calculated.
Suppose that k intercepts and t explanatory variables (say $v_1, \ldots, v_t$) have been fit to a model and that $v_{t+1}$ is another explanatory variable of interest. Consider a full model with the k intercepts and $t+1$ explanatory variables ($v_1, \ldots, v_t, v_{t+1}$) and a reduced model with $v_{t+1}$ excluded. The significance of $v_{t+1}$ adjusted for $v_1, \ldots, v_t$ can be determined by comparing the corresponding residual chi-square with a chi-square distribution with one degree of freedom (k degrees of freedom for the generalized logit model).
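For instance, in a stepwise run such as the following (again with hypothetical data set and variable names), the DETAILS option requests the "Analysis of Effects Eligible for Entry" table, whose "Score Chi-Square" column contains these adjusted score tests for the effects not yet in the model:

   proc logistic data=mydata;
      model y = x1 x2 x3 x4 / selection=stepwise details;
   run;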
For an ordinal response, PROC LOGISTIC performs a test of the parallel lines assumption. In the displayed output, this test is labeled "Score Test for the Equal Slopes Assumption" when the LINK= option is NORMIT or CLOGLOG. When LINK=LOGIT, the test is labeled "Score Test for the Proportional Odds Assumption". For small sample sizes, this test might be too liberal (Stokes, Davis, and Koch 2000, p. 249). This section describes the methods used to calculate the test.
For this test the number of response levels, $k+1$, is assumed to be strictly greater than 2. Let Y be the response variable taking values $1, \ldots, k, k+1$. Suppose there are s explanatory variables. Consider the general cumulative model without making the parallel lines assumption

\[ g(\Pr(Y \le i \mid \mathbf{x})) = (1, \mathbf{x}')\,\boldsymbol{\theta}_i, \qquad 1 \le i \le k \]

where $g(\cdot)$ is the link function, and $\boldsymbol{\theta}_i = (\alpha_i, \beta_{i1}, \ldots, \beta_{is})'$ is a vector of unknown parameters consisting of an intercept $\alpha_i$ and s slope parameters $\beta_{i1}, \ldots, \beta_{is}$. The parameter vector for this general cumulative model is

\[ \boldsymbol{\theta} = (\boldsymbol{\theta}_1', \ldots, \boldsymbol{\theta}_k')' \]

Under the null hypothesis of parallelism $H_0\colon \beta_{1m} = \beta_{2m} = \cdots = \beta_{km},\ 1 \le m \le s$, there is a single common slope parameter for each of the s explanatory variables. Let $\beta_1, \ldots, \beta_s$ be the common slope parameters. Let $\hat{\alpha}_1, \ldots, \hat{\alpha}_k$ and $\hat{\beta}_1, \ldots, \hat{\beta}_s$ be the MLEs of the intercept parameters and the common slope parameters. Then, under $H_0$, the MLE of $\boldsymbol{\theta}$ is

\[ \hat{\boldsymbol{\theta}}_0 = (\hat{\boldsymbol{\theta}}_1', \ldots, \hat{\boldsymbol{\theta}}_k')' \quad \text{with} \quad \hat{\boldsymbol{\theta}}_i = (\hat{\alpha}_i, \hat{\beta}_1, \ldots, \hat{\beta}_s)', \quad 1 \le i \le k \]

and the chi-square score statistic $\mathbf{U}'(\hat{\boldsymbol{\theta}}_0)\,\mathbf{I}^{-1}(\hat{\boldsymbol{\theta}}_0)\,\mathbf{U}(\hat{\boldsymbol{\theta}}_0)$ has an asymptotic chi-square distribution with $s(k-1)$ degrees of freedom. This tests the parallel lines assumption by testing the equality of the separate slope parameters simultaneously for all explanatory variables.
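As a usage sketch (hypothetical data set and variables), fitting a cumulative logit model to an ordinal response with more than two levels displays this test automatically; because LINK=LOGIT is the default, the test appears as the "Score Test for the Proportional Odds Assumption":

   proc logistic data=mydata;
      model rating = x1 x2;
   run;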