Evaluating the usability of online learning environments in terms of design, use and operability
The CHAISE project pioneered the creation of an online course, available on the Thinkific MOOC platform, aimed at tackling blockchain skill mismatches. The course comprises 364 lessons, 33 hours of video content and 3 career specialisation pathways: Blockchain Developer, Architect and Manager.
Information and Communication Technologies (ICTs) have played a key role in sustaining teaching and learning at higher education institutions (HEIs) and in vocational education and training (VET), especially during the COVID-19 pandemic. But how do online learning environments account for Human-Computer Interaction (HCI)? And how can external reviewers, during an accreditation procedure, assess the online learning environment in a way that complements the ESG standards of the European Higher Education Area?
According to the Joint Research Centre of the European Commission (2010), there is an absence of a broader set of internationally comparable indicators. Such indicators could monitor progress in ICT uptake and reveal important information about use. Since frequency and purpose of use cannot easily be measured, comparable data and practices remain at the national level, making it difficult to put tools for benchmarking policies in place at EU level.
Many ICT systems are not designed to accommodate learners with special needs or attention disorders, a gap that needs further research and should be taken into account in the future. Online environments can also impair the efforts of shy or less socialised students to interact with peers or teachers, behaviour that would be identifiable in a physical setting. Assignments and exams have also been affected.
Compared with sit-down exams, ICT systems should be flexible enough to give students more time to complete assessments and teachers more time to grade them. Despite the challenges, the adoption of ICT methods equips students with the skills needed to integrate into society and professional life, where technology-related competencies are an integral part of 21st-century education.
Beyond EU frameworks, standards and indicators, external reviewers can draw on complementary tools when evaluating an online learning environment. Jakob Nielsen’s heuristics (Nielsen & Molich, 1990) provide ten general principles for human-computer interaction and serve as rules of thumb for evaluating the design of an online learning environment. Out of the ten, we have chosen the following five:
Heuristic 1: Visibility of system status
The system should always keep users informed about what is going on, through appropriate feedback within a reasonable time.
Heuristic 2: Match between system and the real world
The system should speak the users’ language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
Heuristic 3: User control and freedom
Users often choose system functions by mistake and will need a clearly marked “emergency exit” to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
Heuristic 7: Flexibility and efficiency of use
Accelerators — unseen by the novice user — may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
Heuristic 8: Aesthetic and minimalist design
Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
The scale below is used in the heuristic evaluation:
Severity scale
0 = I don’t agree that this is a usability problem at all
1 = Cosmetic problem only: need not be fixed unless extra time is available
2 = Minor usability problem: fixing this should be given low priority
3 = Major usability problem: important to fix, so should be given high priority
4 = Usability catastrophe: imperative to fix this as soon as possible
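As an illustrative sketch only (the ratings and evaluator counts below are hypothetical, not CHAISE data), the severity scores collected from several independent evaluators can be averaged per heuristic, with a mean at or above 3 flagging major problems for high-priority fixing:

```python
from statistics import mean

# Hypothetical scores: each evaluator rates the platform against a heuristic
# using Nielsen's 0-4 severity scale (0 = not a problem, 4 = catastrophe).
ratings = {
    "Visibility of system status":         [3, 4, 3],
    "Match between system and real world": [1, 0, 1],
    "User control and freedom":            [2, 3, 2],
    "Flexibility and efficiency of use":   [0, 1, 0],
    "Aesthetic and minimalist design":     [2, 2, 1],
}

def prioritise(ratings, threshold=3.0):
    """Return (heuristic, mean severity, high-priority flag) tuples,
    sorted with the most severe problems first."""
    report = [
        (heuristic, round(mean(scores), 2), mean(scores) >= threshold)
        for heuristic, scores in ratings.items()
    ]
    return sorted(report, key=lambda row: row[1], reverse=True)

for heuristic, avg, high_priority in prioritise(ratings):
    flag = "FIX FIRST" if high_priority else "backlog"
    print(f"{avg:.2f}  {flag:9}  {heuristic}")
```

Averaging across evaluators follows the usual heuristic-evaluation practice of combining several independent judgements, since individual severity ratings tend to be unreliable on their own.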
The heuristic evaluation can be complemented by usability research conducted by the HEIs/VET providers to gather feedback from learners and instructors, the end-users of the learning platform. In this way the platform can meet high standards of quality in terms of design, use and operability.
Lastly, privacy and data protection, together with the reliability and accessibility of local networks, are of utmost importance for ensuring trust and confidence in ICT systems.
References:
Nielsen, J., & Molich, R. (1990, March). Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 249-256).
Joint Research Centre. (2010). Assessing the Effects of ICT in Education: Indicators, Criteria and Benchmarks for International Comparisons. OECD Publishing.
EU Disclaimer:
The European Commission’s support for the production of this publication does not constitute an endorsement of the contents, which reflect the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.