Peter H. Schönemann
Professor Emeritus • Department of Psychological Sciences • Purdue University

Abstract 19

[19]

Wang, M.M., Schonemann, P. H., and Rusk, J. G.

A conjugate gradient algorithm for the multidimensional analysis of preference data

Multivariate Behavioral Research, 1975, 10, 45-80.

Abstract

In continuation of earlier work on a new individual difference model for the multidimensional analysis of preference data (Schonemann and Wang, 1972), a relatively efficient algorithm for applying the model to fallible data was developed that does not require storage for second-order derivatives.

Several different versions of such an algorithm were compared in terms of robustness, accuracy, and speed of convergence. The results strongly suggest that the so-called intervening conjugate gradient method (which iterates for only two of the three sets of unknowns and solves for the third set algebraically at each stage) is the most effective method for most purposes.
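The "intervening" idea can be illustrated on a toy problem. The sketch below is an assumption for illustration only; it uses a simple quadratic least-squares objective, not the Schonemann-Wang preference model itself. Two of the three unknowns (here x and y) are updated by conjugate gradient iterations, while the third (an additive offset c) is solved for algebraically at each stage, so no second-order derivative matrix is ever stored.

```python
def solve_c(x, y, a, b, t):
    # Closed-form optimum of the offset c for fixed (x, y): the mean residual.
    n = len(t)
    return sum(t[i] - a[i] * x - b[i] * y for i in range(n)) / n

def grad(x, y, a, b, t):
    # Gradient in (x, y) of the least-squares loss after c has been
    # eliminated algebraically.  The loss is quadratic, so this gradient
    # is an affine function of (x, y).
    c = solve_c(x, y, a, b, t)
    r = [a[i] * x + b[i] * y + c - t[i] for i in range(len(t))]
    gx = 2.0 * sum(ri * ai for ri, ai in zip(r, a))
    gy = 2.0 * sum(ri * bi for ri, bi in zip(r, b))
    return gx, gy

def intervening_cg(a, b, t, iters=10):
    # Fletcher-Reeves conjugate gradient over (x, y) only; c is
    # re-solved in closed form inside every gradient evaluation.
    x, y = 0.0, 0.0
    gx, gy = grad(x, y, a, b, t)
    dx, dy = -gx, -gy
    for _ in range(iters):
        if gx * gx + gy * gy < 1e-24:
            break  # converged
        # Because the gradient is affine, H d = grad(z + d) - grad(z),
        # which gives an exact line search without storing the Hessian.
        hx, hy = grad(x + dx, y + dy, a, b, t)
        hdx, hdy = hx - gx, hy - gy
        alpha = -(gx * dx + gy * dy) / (dx * hdx + dy * hdy)
        x += alpha * dx
        y += alpha * dy
        ngx, ngy = grad(x, y, a, b, t)
        beta = (ngx * ngx + ngy * ngy) / (gx * gx + gy * gy)
        dx, dy = -ngx + beta * dx, -ngy + beta * dy
        gx, gy = ngx, ngy
    return x, y, solve_c(x, y, a, b, t)

# Synthetic data generated from x = 2, y = -1, c = 0.5.
a = [1.0, 2.0, 3.0, 4.0]
b = [1.0, 1.0, 2.0, 3.0]
t = [2.0 * a[i] - b[i] + 0.5 for i in range(4)]
x, y, c = intervening_cg(a, b, t)
```

For a quadratic objective in two free unknowns, the conjugate gradient iteration terminates in at most two steps, which is what makes the comparison with full three-block iteration schemes meaningful.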

The algorithm was applied to a relatively large set of 1968 presidential election data that had been previously analyzed by a different method. The outcome of this empirical study not only confirmed earlier results but also led, as a consequence of the stronger metric structure of the present model, to a more detailed and informative description of these data.

Notes

A continuation of earlier work on metric unfolding, this paper presents a detailed account of an iterative program for fitting the multidimensional preference model described in Schonemann and Wang (1970).

Attention is drawn to the subspace problem of multidimensional unfolding: when one set of points lies in a proper subspace of the space spanned by the other set (e.g., on a line within a plane), only the orthogonal projections of the points of the larger set onto that subspace are recoverable. We suggest a computational method for detecting such subspace problems, which indeed did arise in the voting data.
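The paper's own detection procedure is not spelled out here, but a check in the same spirit is a rank-deficiency test: if the smallest eigenvalue of the scatter matrix of one point set is (near) zero, that set lies in a proper subspace and the subspace problem applies. A minimal two-dimensional sketch, assuming plain coordinate lists as input:

```python
import math

def scatter_2d(pts):
    # 2x2 scatter (covariance) matrix of centered 2-D points.
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    sxx = sum((p[0] - mx) ** 2 for p in pts) / n
    syy = sum((p[1] - my) ** 2 for p in pts) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in pts) / n
    return sxx, sxy, syy

def has_subspace_problem(pts, tol=1e-8):
    # Eigenvalues of a symmetric 2x2 matrix in closed form.  A smallest
    # eigenvalue near zero means the points are (almost) collinear,
    # i.e. they occupy a proper subspace of the plane and only
    # projections onto that line are recoverable for the other set.
    sxx, sxy, syy = scatter_2d(pts)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    lam_max, lam_min = (tr + disc) / 2.0, (tr - disc) / 2.0
    return lam_min <= tol * max(lam_max, tol)

collinear = [(float(t), 2.0 * t + 1.0) for t in range(5)]  # points on a line
planar = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]  # full-rank set
flag_line = has_subspace_problem(collinear)
flag_plane = has_subspace_problem(planar)
```

In more than two dimensions the same test would be run on the singular values of the centered coordinate matrix rather than on a closed-form 2x2 eigendecomposition.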