Faculty Perceptions and Preferences Toward Online Language Assessments: A Post-COVID Evaluation of Blackboard-Based Testing in Higher Education

Authors

  • Abdullah Alshayban

    Department of English Language and Literature, College of Languages and Humanities, Qassim University, Qassim 52571, Saudi Arabia

DOI:

https://doi.org/10.30564/fls.v7i10.11047
Received: 14 July 2025 | Revised: 18 July 2025 | Accepted: 7 August 2025 | Published Online: 13 October 2025

Abstract

This research investigated faculty perceptions and preferences regarding Blackboard-supported online assessments in higher education, with a focus on the effectiveness of post-pandemic assessments. The research had four objectives: (a) to measure faculty attitudes toward the fairness, safety, and feasibility of online assessments; (b) to compare those attitudes across demographic variables (such as age, gender, rank, and familiarity with Blackboard); (c) to measure faculty preferences for objective versus subjective assessment types; and (d) to compare those preferences across the same demographic characteristics. The study specifically explores online assessment in the context of language learning, examining how faculty perceive Blackboard-based formats for evaluating linguistic performance and communication skills, and which formats they prefer. A purposive sample of 40 faculty members of varying ranks and disciplines participated in a quantitative, cross-sectional survey. Data were collected with a semi-structured Likert-type questionnaire that included ranking items. Descriptive and inferential statistical analyses were conducted using SPSS. Results showed that although faculty appreciated the benefits of Blackboard-based language assessments, especially automatic grading and administrative convenience, significant concerns remained regarding academic integrity, cheating, and exam security. The findings add to the existing literature on digital language assessment practices in higher education and offer practical implications for sustaining assessment innovation beyond the pandemic period. The study adopts an exploratory design and provides preliminary insights specific to English-language faculty using the Blackboard LMS, rather than aiming for broad generalization.
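The analyses summarized above were conducted in SPSS, and the abstract does not name the individual tests. Purely as a minimal, loosely analogous sketch, the Python snippet below illustrates how descriptive statistics and demographic group comparisons on Likert-type survey data are commonly computed. The file name, column names (e.g., attitude_fairness, gender, rank), and the choice of Mann-Whitney U and Kruskal-Wallis tests are illustrative assumptions, not details taken from the study.

```python
# Illustrative sketch only (not the study's SPSS workflow): descriptive statistics
# and non-parametric group comparisons on Likert-type survey responses.
# All column names and the CSV file are hypothetical.
import pandas as pd
from scipy import stats

# Hypothetical survey export: one row per respondent, Likert items coded 1-5.
df = pd.read_csv("faculty_survey.csv")

# Descriptive statistics for the attitude items.
attitude_items = ["attitude_fairness", "attitude_security", "attitude_feasibility"]
print(df[attitude_items].describe())

# Two-group comparison across a demographic variable (e.g., gender):
# the Mann-Whitney U test is a common choice for ordinal Likert data.
male = df.loc[df["gender"] == "male", "attitude_fairness"].dropna()
female = df.loc[df["gender"] == "female", "attitude_fairness"].dropna()
u_stat, p_value = stats.mannwhitneyu(male, female, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.2f}, p = {p_value:.3f}")

# For variables with more than two groups (e.g., academic rank),
# a Kruskal-Wallis test serves the same purpose.
rank_groups = [g["attitude_fairness"].dropna() for _, g in df.groupby("rank")]
h_stat, p_value = stats.kruskal(*rank_groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
```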


Keywords:

Perception; Blackboard Assessments; Post-Pandemic Assessments; Online Exam Integrity; Faculty Preferences; English-Language Assessments

How to Cite

Alshayban, A. (2025). Faculty Perceptions and Preferences Toward Online Language Assessments: A Post-COVID Evaluation of Blackboard-Based Testing in Higher Education. Forum for Linguistic Studies, 7(10), 1016–1036. https://doi.org/10.30564/fls.v7i10.11047

Article Type

Article