The High Potential of Computer-Based Reading Assessment

Authors

  • Pauline Auphan, Laboratoire EMC, Université Lyon 2 and LabEx CORTEX ANR-11-LABX-0042
  • Jean Ecalle, Laboratoire EMC, Université Lyon 2 and LabEx CORTEX ANR-11-LABX-0042
  • Annie Magnan, Laboratoire EMC, Université Lyon 2 and LabEx CORTEX ANR-11-LABX-0042, Université de Lyon; Institut Universitaire de France

DOI:

https://doi.org/10.21432/cjlt27847

Keywords:

computer-based assessment, word reading, reading comprehension, primary and secondary children, validity, reliability

Abstract

The aim of this study is to highlight the advantages provided by computerized tools when assessing reading ability. A new computer-based reading assessment evaluating both word reading and reading comprehension processes was administered to 687 children in primary (N=400) and secondary (N=287) schools. Accuracy (weighted scores) and speed of access (response times), recorded automatically by the software, were analyzed through developmental comparisons (ANOVAs), correlation matrices, structural equation modeling, and clinical interpretation. Results supported the validity and reliability of the tool. The discussion addresses the limitations of the present computer-based assessment and outlines perspectives for taking fuller advantage of computerized technologies.
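
As an illustration of the analyses summarized in the abstract, the short sketch below shows, in Python, how automatically recorded accuracy and response-time data might be compared across grade levels (ANOVA) and correlated with each other. It is not the authors' analysis code, and the file and column names (reading_scores.csv, grade, accuracy, rt_ms) are assumptions made for the example.

```python
# Minimal illustrative sketch (not the authors' code or data): analyzing
# accuracy and response-time data of the kind described in the abstract.
# The file name and column names ("grade", "accuracy", "rt_ms") are hypothetical.
import pandas as pd
from scipy import stats

# Per-child scores exported by the assessment software (hypothetical file).
df = pd.read_csv("reading_scores.csv")  # columns: grade, accuracy, rt_ms

# One-way ANOVA on accuracy across grade levels (developmental comparison).
groups = [g["accuracy"].to_numpy() for _, g in df.groupby("grade")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA on accuracy by grade: F = {f_stat:.2f}, p = {p_anova:.4f}")

# Correlation between accuracy (weighted scores) and response time (speed of access).
r, p_corr = stats.pearsonr(df["accuracy"], df["rt_ms"])
print(f"Accuracy-RT correlation: r = {r:.2f}, p = {p_corr:.4f}")
```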

Author Biographies

Pauline Auphan, Laboratoire EMC, Université Lyon 2 and LabEx CORTEX ANR-11-LABX-0042

Equipe Apprentissage, Développement et Troubles du Langage; PhD in cognitive psychology

Jean Ecalle, Laboratoire EMC, Université Lyon 2 and LabEx CORTEX ANR-11-LABX-0042

Equipe Apprentissage, Développement et Troubles du Langage; Professor

Annie Magnan, Laboratoire EMC, Université Lyon 2 and LabEx CORTEX ANR-11-LABX-0042, Université de Lyon; Institut Universitaire de France

Equipe Apprentissage, Développement et Troubles du Langage; Professor

Published

2020-09-11

Issue

Section

Articles