|Type:||Journal article|
|Title:||An estimation method for latent traits and population parameters in Nominal Response Model|
|Abstract:||The nominal response model (NRM) was proposed by Bock [Psychometrika 37 (1972) 29-51] to improve latent trait (ability) estimation in multiple choice tests with nominal items. When the item parameters are known, expectation a posteriori or maximum a posteriori methods are commonly employed to estimate the latent traits, assuming a standard normal distribution as the prior density of the latent traits. However, when this item set is presented to a new group of examinees, it is necessary to estimate not only their latent traits but also the population parameters of this group. This article has two main purposes: first, to develop a Markov chain Monte Carlo algorithm that estimates the latent traits and the population parameters concurrently. This algorithm is based on the Metropolis-Hastings within Gibbs sampling algorithm (MHWGS) proposed by Patz and Junker [Journal of Educational and Behavioral Statistics 24 (1999b) 346-366]. Second, to compare the performance of this method in latent trait recovery with three other methods: maximum likelihood, expectation a posteriori and maximum a posteriori. The comparisons were performed by varying the total number of items (NI), the number of categories, and the mean and variance of the latent trait distribution. The results showed that MHWGS outperforms the other methods in latent trait estimation and also recovers the population parameters properly. Furthermore, we found that NI accounts for the highest percentage of the variability in the accuracy of latent trait estimation.|
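|Note:||For readers unfamiliar with Bock's model, the following is a minimal sketch of how the NRM assigns category-response probabilities. The function name and the example parameter values are illustrative assumptions, not taken from the article; the model itself is the standard multinomial-logit form from Bock (1972).

```python
import math

def nrm_probabilities(theta, a, c):
    """Category probabilities under Bock's nominal response model.

    P_k(theta) = exp(a_k * theta + c_k) / sum_j exp(a_j * theta + c_j)

    theta : examinee latent trait (ability)
    a, c  : slope and intercept parameters, one pair per category
    """
    logits = [ak * theta + ck for ak, ck in zip(a, c)]
    m = max(logits)                      # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Illustrative (made-up) parameters for a three-category item
probs = nrm_probabilities(theta=0.5, a=[0.0, 1.0, -1.0], c=[0.0, 0.5, -0.5])
```

Estimation methods such as maximum likelihood, EAP, MAP, or the article's MHWGS sampler all work from these probabilities; they differ only in how the likelihood is combined with prior information about the latent trait distribution.|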
|Subject:||Nominal response model|
|Publisher:||Brazilian Statistical Association|
|Appears in Collections:||Artigos e Materiais de Revistas Científicas - Unicamp|