
doi: 10.3758/bf03205385
Although coefficient alpha has not traditionally been considered an aid in item analysis (Crano & Brewer, 1973, pp. 240-241), Serlin and Kaiser (1976) recently described an algorithm and computer program for item analysis that attempts to maximize coefficient alpha by deleting items from an instrument. The procedure involves successive extraction of the first principal component from modifications of the item intercorrelation matrix. Since Lord (1958) showed that the first principal component pattern coefficients, divided by the corresponding item standard deviations, yield the item weights producing the maximum differentially weighted coefficient alpha, the item with the smallest of these weights in absolute value is dropped, and the extraction process is repeated until only two items remain. As a limitation of their procedure, Serlin and Kaiser noted that "ideally, when choosing an item for removal from consideration in the stepwise procedure described, the item discarded should be the one which leads to the maximum increase (or least decrease) in alpha; this procedure is not computationally feasible. Removing the item with the smallest weight ui (in absolute value) seems a reasonable alternative" (1976, p. 759). However, even iteratively dropping the item that leads to the largest alpha will not necessarily find the subset of items that yields the maximum alpha. Only by trying all possible combinations of items is one assured of finding the maximum coefficient alpha, and even with the aid of a computer, trying all possible combinations becomes prohibitive in computation time when the number of items is appreciable.
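The quantity all of these procedures maximize is coefficient alpha itself, which can be computed directly from the item covariance matrix. The following Python sketch (illustrative only; the function name is mine, not part of any program described here) shows the computation:

```python
import numpy as np

def coefficient_alpha(cov):
    """Cronbach's coefficient alpha from an item covariance matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score),
    where the total-score variance is the sum of all covariance entries.
    """
    cov = np.asarray(cov, dtype=float)
    k = cov.shape[0]
    return (k / (k - 1)) * (1.0 - np.trace(cov) / cov.sum())
```

For example, three items with unit variances and pairwise covariances of .5 give alpha = (3/2)(1 - 3/6) = .75.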
Morris (in press) demonstrated on several data sets from the literature that, while the all-combinations method necessarily selected the item subset manifesting the maximum coefficient alpha in all cases, the method that iteratively drops items such that alpha is maximized selected the same set of items in each case. On the other hand, the Serlin-Kaiser (1976) principal component method selected some item sets with inferior coefficient alphas and actually took an average of twice as much computer time as the iterative method that Serlin and Kaiser had described as "ideal" but "not computationally feasible." An algorithm and computer program were suggested, using the all-combinations procedure for small scales and the method of iteratively dropping items such that alpha is maximized for large scales. Either of these procedures to maximize coefficient alpha may on occasion yield an unsatisfactory set of items. The item subset chosen will have maximum internal consistency reliability, but possibly at the expense of validity. Even though the entire set of items submitted to item analysis may be considered on empirical or logical grounds to be unidimensional, certain items that are obviously logically related to the measurement domain of interest may be excluded in the interest of reaching the absolute maximum reliability. The purpose of this paper is to refine the previously described (Morris, in press) item-analysis algorithm and computer program for maximizing coefficient alpha to better handle this validity problem when it occurs. In the new version, certain items may be "forced" into the final scale at the user's option. These items would be a few that the researcher considers to best summarize the intent of his measurement scale. They might be logically selected from a scale, or items of summary content might be included by design to "anchor" the scale's validity to the psychological construct of interest.
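The two search strategies compared above can be sketched as follows. This Python illustration (function names are my own; the announced program is FORTRAN IV) contrasts the iterative item-dropping method with the exhaustive all-combinations search:

```python
from itertools import combinations
import numpy as np

def alpha(cov, idx):
    """Coefficient alpha for the item subset idx of covariance matrix cov."""
    sub = np.asarray(cov)[np.ix_(idx, idx)]
    k = len(idx)
    return (k / (k - 1)) * (1.0 - np.trace(sub) / sub.sum())

def greedy_max_alpha(cov):
    """Iteratively drop the item whose removal yields the largest alpha."""
    items = list(range(np.asarray(cov).shape[0]))
    best_items, best_alpha = list(items), alpha(cov, items)
    while len(items) > 2:
        # among all one-item deletions, keep the subset with the largest alpha
        items = max(([i for i in items if i != j] for j in items),
                    key=lambda s: alpha(cov, s))
        a = alpha(cov, items)
        if a > best_alpha:
            best_alpha, best_items = a, list(items)
    return best_items, best_alpha

def all_combinations_max_alpha(cov):
    """Exhaustive search over every item subset of size two or more."""
    n = np.asarray(cov).shape[0]
    best = max((list(c) for r in range(2, n + 1)
                for c in combinations(range(n), r)),
               key=lambda s: alpha(cov, s))
    return best, alpha(cov, best)
```

The greedy method evaluates on the order of k² subsets for k items, while the exhaustive search evaluates 2^k - k - 1 subsets, which is why the all-combinations method is practical only for small scales.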
The coefficient alpha reliability is maximized under the restriction that the forced items must appear in the item subset selected.

Description. The program will perform an item analysis on any combination of items entered as subscales (including all items), either by the all-combinations method or by iteratively deleting the item that leads to the maximum coefficient alpha. Any number (including zero) of items in any of the subscales can be specified as forced at the user's discretion. The user chooses the method to be used, with the practical limitation that the iterative method is used if the number of nonforced items in the scale exceeds 15. Liberal program dimensions allow up to 100 forced and nonforced items arranged in up to 15 subscales. These dimensions are easily altered should the user desire.

Input. The user may enter the subjects' raw scores on each of the items, or may introduce the item intercorrelation or covariance matrix. Indices assigning items to the desired subscales and a variable format must also be entered.

Output. Depending on the data introduced, program output includes the item means, standard deviations, and intercorrelation matrix. For the iterative method, the item numbers and coefficient alpha are output for each item subset selected, down to two items, or down to all forced items if there are more than two. The subset manifesting the largest coefficient alpha is also noted. For the all-combinations method, the item numbers and coefficient alpha for the best subset are output.

Computer and Language. This FORTRAN IV program has been used on a CDC CYBER 74 and an IBM 370/165, in both batch and interactive modes.
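One simple way the forced-item restriction could be realized in the iterative method is to exclude forced items from the pool of candidates for removal and to stop once only forced items (or two items) remain. The sketch below is a hypothetical Python illustration of that idea, not the announced FORTRAN code:

```python
import numpy as np

def alpha(cov, idx):
    """Coefficient alpha for the item subset idx of covariance matrix cov."""
    sub = np.asarray(cov)[np.ix_(idx, idx)]
    k = len(idx)
    return (k / (k - 1)) * (1.0 - np.trace(sub) / sub.sum())

def greedy_max_alpha_forced(cov, forced=()):
    """Greedy alpha maximization that never removes 'forced' items."""
    forced = set(forced)
    items = list(range(np.asarray(cov).shape[0]))
    best_items, best_alpha = list(items), alpha(cov, items)
    floor = max(2, len(forced))  # stop at two items, or at the forced set
    while len(items) > floor:
        droppable = [j for j in items if j not in forced]
        if not droppable:
            break
        # drop the nonforced item whose removal gives the largest alpha
        items = max(([i for i in items if i != j] for j in droppable),
                    key=lambda s: alpha(cov, s))
        a = alpha(cov, items)
        if a > best_alpha:
            best_alpha, best_items = a, list(items)
    return best_items, best_alpha
```

Forcing a weak "anchor" item keeps it in the selected subset even when dropping it would raise alpha, which is exactly the validity-for-reliability trade-off the new program option is meant to support.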
