Abstract
Item analysis is a general term for the methods used in education to evaluate test items, typically for the purpose of test construction and revision. The construction and analysis of science examinations in the high school curriculum requires the integration of basic and integrated process skills, which help students develop rational and logical thinking. This exploratory study compared the results of item analyses conducted with Rasch Model Analysis and with SPSS. It examined the reliability, mean difficulty, mean ability, discrimination index, mean trend scores, infit and misfit statistics, and item-person and fit maps using the results of the Basic & Integrated Science Process and Skill Test (BISPST) administered to secondary science high school teachers in the Division of Nueva Vizcaya. Rasch Model Analysis and SPSS yielded similarly unsatisfactory values of internal consistency, with Cronbach’s alpha reliability coefficients of 0.66 and 0.67, and the same mean score (19.52) and standard deviation (4.53) for the BISPST. The mean item difficulty was set at 0 by definition (SD = 2.02) and the mean examinee ability was 1.56 (SD = 0.98), which means that the examinees answered 61% of the items correctly on average. Thirty-three items had difficulty estimates lower than the ability of the least able examinee. The per-item mean abilities from the Rasch Model Analysis showed that high performers answered the items correctly more often than the low-performing group, consistent with the mean trend scores of the upper, middle, and lower groups generated by SPSS. The choices in the BISPST were found to be effective distracters. Although most items need revision according to the values generated by both software packages, no items were found to misfit the model.
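As a minimal sketch of the classical (SPSS-style) statistics named above, the snippet below computes Cronbach's alpha and per-item difficulty (proportion correct) for a dichotomously scored test. The response matrix and the function name are hypothetical illustrations, not data or code from the BISPST study.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a persons-by-items matrix of 0/1 scores."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical response matrix: 6 examinees x 4 items (1 = correct).
X = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
])

alpha = cronbach_alpha(X)    # ≈ 0.67 for this toy matrix
difficulty = X.mean(axis=0)  # classical difficulty = proportion correct per item
```

In the Rasch framework, by contrast, item difficulties and person abilities are estimated on a common logit scale, which is why the mean item difficulty is fixed at 0 by definition rather than expressed as a proportion correct.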