Despite meta-analyses showing strong learning gains from adaptive learning, few domain areas are covered by adaptive systems. A key reason is a content bottleneck: adding new content currently requires highly trained computer scientists and educational specialists. To explore this issue, we are researching a pipeline of interactive tools that enables content managers with little or no specialized training to incorporate content into an adaptive learning ecosystem. This Rapid Adaptive Content Registry prototype consists of four components:
- Adaptive Module Registry for composing a set of learning resources and learning objectives (competencies) in an intuitive content-management UI;
- Rapid Content Analysis Service that uses machine learning to analyze web pages (static or dynamic), PDFs, or short videos and generate metadata tags for competencies, estimated duration, and complexity (a tagging sketch follows this list);
- Preview and Text Extraction interface to review, test, and manually extract text from resources; and
- Module Simulator to analyze how well the available content adapts to different simulated student patterns (e.g., a struggling learner or a learner starting with partial mastery).
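To make the tagging step concrete, the following is a minimal sketch of how the Rapid Content Analysis Service's competency tagging could work, assuming a Sentence-BERT-style embedding model loaded through the sentence-transformers library. The `tag_resource` function, the similarity threshold, and the competency descriptions are hypothetical placeholders, not the prototype's actual catalog or model.

```python
from sentence_transformers import SentenceTransformer, util

# Assumption: a general-purpose Sentence-BERT model, not tuned to any specific tag set.
model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical competency catalog: short descriptions keyed by competency ID.
competencies = {
    "rapport": "Establish rapport and trust with a client",
    "active-listening": "Demonstrate active listening techniques in a counseling session",
    "open-questions": "Use open-ended questions to elicit client concerns",
}

def tag_resource(resource_text: str, threshold: float = 0.4) -> list[str]:
    """Return competency IDs whose descriptions are semantically close to the resource text."""
    doc_emb = model.encode(resource_text, convert_to_tensor=True)
    comp_ids = list(competencies)
    comp_embs = model.encode(list(competencies.values()), convert_to_tensor=True)
    scores = util.cos_sim(doc_emb, comp_embs)[0]  # cosine similarity per competency
    return [cid for cid, s in zip(comp_ids, scores) if float(s) >= threshold]

# Example: tag text extracted from a web page, PDF, or video transcript.
print(tag_resource("This page introduces reflective listening and paraphrasing skills."))
```

Because the candidate competencies are matched by embedding similarity rather than by a classifier trained per tag, a sketch like this requires no labeled examples for a new competency, which is the sense in which a general model can provide cold-start labels.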
This paper outlines the design principles, machine learning performance, and formative usability testing process for this toolkit. For this research, the performance metrics are authoring time, metadata tag quality, deployment reliability (valid content), and personalized pathways (differentiation between different kinds of learners). A comparison of machine learning models using BERT-S to generate competency tags is presented, indicating that a general (rather than tag-specific) model is reasonable for cold-start labels. The tool's usefulness is evaluated by comparing an adaptive module for virtual counseling registered de novo through the tool with the same content composed and tagged by a team of specialists. Strategies and issues for integrating this toolkit into an enterprise ecosystem are also discussed. Initial testing indicates that such a tool has promising potential but also raises questions about how specialized authoring tools should integrate with more traditional content management systems.
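The personalized-pathways metric produced by the Module Simulator can likewise be pictured with a small sketch: hypothetical learner profiles are stepped through a toy module, and the sequence of resources each would visit is recorded. The module contents, profile parameters, and `simulate` function below are illustrative assumptions, not the prototype's actual simulation logic.

```python
import random

# Hypothetical module: each resource targets one competency and takes some time.
MODULE = [
    {"id": "intro-video", "competency": "rapport", "minutes": 5},
    {"id": "listening-article", "competency": "active-listening", "minutes": 10},
    {"id": "questions-practice", "competency": "open-questions", "minutes": 15},
]

# Simulated learner profiles: prior mastery and probability of passing a mastery check.
PROFILES = {
    "struggling":      {"prior_mastery": set(), "pass_prob": 0.4},
    "partial_mastery": {"prior_mastery": {"rapport"}, "pass_prob": 0.8},
}

def simulate(profile: dict, seed: int = 0) -> list[str]:
    """Return the sequence of resource IDs a simulated learner would visit."""
    rng = random.Random(seed)
    mastered = set(profile["prior_mastery"])
    path = []
    for resource in MODULE:
        if resource["competency"] in mastered:
            continue                      # adaptive skip: competency already mastered
        path.append(resource["id"])
        while rng.random() >= profile["pass_prob"]:
            path.append(resource["id"])   # remediation: repeat until the check is passed
        mastered.add(resource["competency"])
    return path

# Pathway differentiation: distinct profiles should produce visibly different paths.
for name, profile in PROFILES.items():
    print(name, simulate(profile))
```

In this toy setting, differentiation between learner types shows up as different path lengths and skipped resources; the prototype's simulator evaluates whether the registered content affords that kind of divergence at all.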
Keywords
AUTHORING TOOLS, COMPETENCY-BASED TRAINING, MACHINE LEARNING, PERSONALIZED TRAINING