I created a 20-question usability evaluation checklist for IT administrators and instructional designers to use when reviewing digital tools for higher education courses. The questions are based on Jakob Nielsen's 10 Usability Heuristics, a well-known framework in UX design.
The checklist is designed to be filled out after someone has some hands-on experience using a tool. Key aspects it examines include how the tool gives feedback, how consistent the interface is, how technical the language is, and how easy it is to recover from errors.
Scoring works on a 0 to 3 scale, where lower scores mean better usability. There is also a Not Applicable option in case a question is not relevant to the type of software being tested.
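To make the scoring process concrete, here is a minimal sketch of how checklist responses might be aggregated. The question labels and data structure are hypothetical, not part of the actual checklist; it simply assumes the 0-3 scale described above, with "N/A" responses excluded from the average.

```python
def summarize_scores(responses):
    """Average the scored responses on the 0-3 scale (lower = better usability).

    responses: dict mapping a question label to a score (0-3) or "N/A".
    Returns None if every question was marked Not Applicable.
    """
    scored = [v for v in responses.values() if v != "N/A"]
    if not scored:
        return None
    return sum(scored) / len(scored)


# Example with hypothetical question labels:
example = {
    "Q1 Feedback": 0,
    "Q2 Consistency": 1,
    "Q3 Language": "N/A",   # excluded from the average
    "Q4 Error recovery": 2,
}
print(summarize_scores(example))  # 1.0
```

A reviewer could compare these averages across tools in a portfolio, keeping in mind that a lower average indicates better usability under this scale.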
This was built during my OLC Instructional Design Professional Certification ID2 course. It has since been adapted into an internal tool I use with coworkers when evaluating our software portfolio.
Before recommending a tool to an instructor, we should know how user-friendly it is for their students. This checklist offers a way to vet tools before they get integrated into a course, so instructors are not setting students up with something that is confusing or hard to use.
This checklist was built to help IT staff and instructional designers make more informed decisions when evaluating tools. Some team members may not have a UX or instructional design background, so a structured scoring process makes it easier for anyone to evaluate a tool consistently. The framework has been adopted into my team's workflow: alongside our existing VPAT/ACR reviews and accessibility audits, we now conduct usability reviews as well.
I have a background in UX, so I wanted to create a checklist that anyone could pick up and use without having a usability background. Basing it on Nielsen's heuristics gave the checklist a solid foundation, and adding examples related to our campus tools made it feel more practical and relevant.