INVESTIGATION OF THE RESPONSE TIMES AND ATTENTION LEVELS OF STUDENTS TAKING THE COMPUTER AIDED MULTIPLE-CHOICE AND TWO-TIER CHEMISTRY TESTS

Abstract

In Turkey, many measurement and evaluation tools are used for different purposes; among the most common are multiple-choice tests. In the 1980s, two-tier tests were developed to retain the strengths of multiple-choice tests while minimizing their weaknesses. The purpose of these tests is to determine how much a student has learned. Learning is an internal process that involves sensory awareness, attention, recognition, transformation, and the acquisition and processing of information. For information to be processed, the process of acquiring it begins with attention. Attention is the mind taking a clear and vivid position toward an object or a train of thought; focus and concentration lie at its foundation. In testing, response time matters alongside attention. Being fast has been the first step in the struggle for survival since primitive times, but being effective is as necessary as being fast; therefore, the duration of a task is as important as the level of attention devoted to it. The aim of this study was to analyze students' attention levels and response times on a multiple-choice chemistry test and a two-tier chemistry diagnostic test (TTCDT) administered in a computer environment. For this purpose, the participants' response times and the attention data measured with a NeuroSky brain sensor during the multiple-choice and two-tier diagnostic tests were used. Data were collected on the basis of both correlational and causal research methods. The participants were pre-service science teachers studying at a state university in the fall semester of the 2021-2022 academic year. In this context, the peak values of the attention signals produced by the NeuroSky brain sensor and the test response times were examined for the students who took the multiple-choice chemistry test and for those who took the TTCDT.
Mann-Whitney U and Wilcoxon tests, both non-parametric methods, were used for the comparisons of students' response times and NeuroSky attention levels. No significant difference was found between the response times of the students who took the multiple-choice chemistry diagnostic test and those who took the TTCDT. When the response times of the TTCDT participants were compared across the first and second tiers, a significant difference was found: the students completed the second tier in less time. A moderate relationship was found between response times and attention frequencies in the second tier of the TTCDT.
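The analyses described above can be sketched as follows. This is a minimal illustration with SciPy, assuming made-up response times (seconds) and NeuroSky attention values (its 0-100 scale); none of these numbers are the study's actual data, and the variable names are hypothetical.

```python
# Illustrative sketch of the non-parametric analyses described above,
# using invented numbers; these are NOT the study's measurements.
from scipy.stats import mannwhitneyu, spearmanr, wilcoxon

# Hypothetical response times for the two independent groups
mc_times = [412, 388, 455, 430, 401, 470, 395, 420]     # multiple-choice test
ttcdt_times = [405, 460, 398, 442, 415, 433, 408, 450]  # two-tier test (total)

# Independent groups -> Mann-Whitney U test
u_stat, u_p = mannwhitneyu(mc_times, ttcdt_times, alternative="two-sided")

# Hypothetical paired first- and second-tier times for the same students
tier1 = [230, 245, 210, 260, 238, 225, 250, 242]
tier2 = [180, 200, 175, 208, 190, 186, 206, 196]

# Paired measurements on the same students -> Wilcoxon signed-rank test
w_stat, w_p = wilcoxon(tier1, tier2)

# Hypothetical attention levels during the second tier -> Spearman correlation
attention = [62, 55, 71, 48, 66, 59, 50, 64]
rho, rho_p = spearmanr(tier2, attention)

print(f"Mann-Whitney U: U={u_stat}, p={u_p:.4f}")
print(f"Wilcoxon signed-rank: W={w_stat}, p={w_p:.4f}")
print(f"Spearman rho={rho:.2f}, p={rho_p:.4f}")
```

The design choice mirrors the text: Mann-Whitney U for the two independent test groups, Wilcoxon signed-rank for the paired tier-1/tier-2 times of the same students, and a rank correlation for the response-time/attention relationship.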

Keywords: Computer-aided exam, attention, response time, multiple-choice test, two-tier test.



Published

2023-04-30