Despite the availability of numerous usability assessment methods, current software development practices often lack a user-centered design approach. However, early and continued implementation of usability methods in the software development process can yield significant return on investment by reducing the resources and manpower required to address usability issues, minimizing maintenance costs and training requirements, and increasing user efficiency and satisfaction (Rajanen, 2003; Svanes & Gulliksen, 2008).

Heuristic evaluation and user testing are two methods that can be used to assess the usability of a product or system. Heuristic evaluation involves assessing an interface against general usability standards (Nielsen & Molich, 1990). Because heuristic evaluation is typically performed by individuals with a human factors background, its results may not capture issues tied to the background and expertise of end users. User testing fills this gap by collecting feedback directly from end users as they complete representative scenario-based tasks using the interface under evaluation. These complementary methods can produce unique results and influence interface design from different perspectives, creating a more comprehensive evaluation (Tan, Liu, & Bishu, 2009), as seen in the prototype development of Workbench.

Currently in development, Workbench is a web-based interface that will be integrated with the Post Mission Assessment for Tactical Training and Trend Analysis (PMATT-TA) software suite. PMATT-TA supports anti-submarine warfare (ASW) training assessments by providing a central location for the collection, aggregation, and visualization of ASW measures of performance (MOPs). Workbench aims to establish a more efficient process for training instructors to update and create MOPs without the assistance of a software engineer.
This paper will compare the protocols and recommendations associated with heuristic evaluation and user testing as they apply to the Workbench case study. These results can be used to demonstrate and justify the need for standardized, iterative, user-centered software development processes when creating training systems.
Demonstrating the Need for Usability Assessment within Software Development Standards
Conference: I/ITSEC 2020
Track: Policy, Standards, Management, and Acquisition