An analysis of readability metrics on English exam texts
DOI: https://doi.org/10.55492/dhasa.v3i01.3864
Keywords: English, readability metrics, text readability, high school exam texts
Abstract
Readability metrics provide information on how difficult a text is to read. This information is relevant, for instance, for identifying suitable texts for learner readers. Readability metrics have been developed for several languages, but no such metrics exist for the indigenous South African languages. One limitation in developing such metrics is the availability of texts in these languages whose readability is known. To address this issue, we consider texts used in the final-year high school exams of language subjects. We expect these texts to have consistent readability across years. Additionally, in South Africa, language subjects may be taught either as a home language or as a first additional language, and we expect differences in readability between the exam texts for these two subjects. To test these assumptions, in this article we compute readability scores using nine existing readability metrics on the final-year exam texts for English home language and English first additional language. The results show that the readability of the texts is indeed consistent over the years and significantly different between the two subjects. Generalizing from these results, we expect that final-year exam texts in other languages can be used to develop readability metrics for the indigenous South African languages in future work. An analysis of the performance of the readability metrics on the English texts serves as a starting point for identifying useful text properties for the development of readability metrics for the indigenous South African languages.
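To illustrate the kind of computation the abstract describes, below is a minimal sketch of one widely used readability formula, the Flesch Reading Ease score. The abstract does not list which nine metrics were used, so this example is only an assumption of the general approach; the syllable counter is a crude vowel-group heuristic, not the dictionary-based counting a production metric would use.

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels (including 'y') as syllables.
    # Real readability tools use pronunciation dictionaries or better rules.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words).
    Higher scores indicate easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Applied to a corpus of exam texts, scores like this one can then be compared across years (to check consistency) and between the home language and first additional language papers (to check for the expected difference).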
License
Copyright (c) 2022 Johannes Sibeko, Menno van Zaanen
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.