Validation of Developed Instruments on Study Habits, Creativity, and Self Concept on College Students in Nigeria

Authors

  • Sulaiman Adamu Mayanchi, School of Education, Faculty of Social Science & Humanities, Universiti Teknologi Malaysia (UTM), Johor, Malaysia
  • Aqeel Khan, School of Education, Faculty of Social Science & Humanities, Universiti Teknologi Malaysia (UTM), Johor, Malaysia
  • Adibah Binti Abdul Latif, School of Education, Faculty of Social Science & Humanities, Universiti Teknologi Malaysia (UTM), Johor, Malaysia

DOI:

https://doi.org/10.61841/zf5aws72

Keywords:

Validation, Reliability, Rasch Measurement Approach, Study Habits, Creativity, Self-Concept

Abstract

The purpose of validating a questionnaire is to ensure that the instruments used in research settings are psychometrically sound, valid, and reliable, as well as efficient and effective. This article therefore reports the validation of existing instruments on study habits, creativity, and self-concept using the Rasch measurement approach. A pilot sample of 180 respondents was randomly selected from two colleges in Zamfara State. The Rasch-based Winsteps software produced the required parameter estimates automatically. Item reliability coefficients for the three subscales were 0.83, 0.96, and 0.87, with corresponding Cronbach's alpha values of 0.96, 0.91, and 0.94. Unidimensionality of the measures was supported by raw variance explained by measures of 48.3 percent, 53.8 percent, and 53.0 percent, closely matching the expected variance. Inspection of the item-person maps revealed that all items fell within the respondents' ability levels. Likewise, fit indices indicated that 5 items of the study habits subscale should be deleted, all 23 items of the creativity subscale showed reasonable fit, and 5 items of the self-concept subscale showed poor fit. The results support the accuracy of interpretations and inferences drawn from scores on the items and subscales of the instruments.
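
For illustration, statistics of the kind reported above can be reproduced outside Winsteps. The sketch below is a minimal numpy example, assuming a dichotomously scored persons-by-items response matrix and already-estimated person measures (theta) and item difficulties (delta); the simulated data and variable names are illustrative assumptions, not the authors' data or the Winsteps implementation.

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for a persons-by-items score matrix X."""
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()          # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)           # variance of total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

def rasch_item_fit(X, theta, delta):
    """Infit and outfit mean squares per item for dichotomous responses X,
    given Rasch person measures theta and item difficulties delta (in logits)."""
    p = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))  # model-expected scores
    w = p * (1.0 - p)                                             # model variance per response
    z2 = (X - p) ** 2 / w                                         # squared standardized residuals
    outfit = z2.mean(axis=0)                                      # unweighted (outlier-sensitive) fit
    infit = ((X - p) ** 2).sum(axis=0) / w.sum(axis=0)            # information-weighted fit
    return infit, outfit

# Illustrative data: 180 respondents x 23 items, generated from the Rasch model.
rng = np.random.default_rng(0)
theta = rng.normal(0.0, 1.0, 180)
delta = rng.normal(0.0, 1.0, 23)
prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - delta[None, :])))
X = (rng.random(prob.shape) < prob).astype(float)

print("Cronbach's alpha:", round(cronbach_alpha(X), 2))
infit, outfit = rasch_item_fit(X, theta, delta)
# Report 1-based item numbers whose infit mean square falls outside a common range.
print("Items with infit MNSQ outside 0.5-1.5:", np.where((infit < 0.5) | (infit > 1.5))[0] + 1)
```

Items whose mean-square fit statistics fall far from the expected value of 1 (a commonly used range is roughly 0.5 to 1.5) are candidates for deletion, which is the kind of decision reflected in the misfitting study habits and self-concept items reported above.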

Published

31.07.2020

How to Cite

Adamu Mayanchi, S., Khan, A., & Binti Abdul Latif, A. (2020). Validation of Developed Instruments on Study Habits, Creativity, and Self Concept on College Students in Nigeria. International Journal of Psychosocial Rehabilitation, 24(5), 5545-5553. https://doi.org/10.61841/zf5aws72