
November 11, 2021 • 2 minute read

Iterative Improvement—A Commitment Through Research



Research papers often aim to be groundbreaking or to report significant positive findings. While such studies push forward what is known in the learning sciences, they are not the only studies that matter. It is also valuable to share what did not work and to learn from those examples.

In July 2021, we presented a paper [1] on three courses in which the adaptive activities in courseware did not produce the benefits to students found previously [2]. In these three non-STEM courses, the adaptive activities had a net negative impact on learning estimates. The goal of the paper was to investigate why. We found that these courses all had far fewer total adaptive questions than successful courses, had very low ratios of scaffolding questions to hard questions, and often had scaffolding questions written at the same difficulty level as the hard questions. From this investigation, we derived a set of best practices for creating adaptive activities designed for student success.

Analysis of natural student data is key because the data can show when a feature is not optimized to benefit students. In this paper, we found evidence that even courseware written by subject matter experts and designed according to learning science principles can be imperfect. Data illuminates those imperfections and points to solutions. In this case, the data revealed new insights into how to better design adaptive scaffolding. It also revealed human fallibility: questions written at low and medium difficulty were often just as difficult for students as the hard questions. This was not intentional; question difficulty is hard to gauge, and subject matter experts may overestimate the abilities of novices, especially those who are struggling. Data analysis can reveal when the intention behind questions does not align with reality.
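To make this kind of check concrete, here is a minimal sketch in Python of comparing each question's authored difficulty label with its empirical difficulty (the share of incorrect first attempts). The data layout, column names, and baseline are illustrative assumptions, not the analysis pipeline used in the paper.

```python
# Illustrative sketch only: columns and the flagging rule are assumptions,
# not the actual analysis from the study.
import pandas as pd

# Each row is one student's first attempt at a question.
attempts = pd.DataFrame({
    "question_id": ["q1", "q1", "q2", "q2", "q3", "q3"],
    "authored_difficulty": ["low", "low", "medium", "medium", "hard", "hard"],
    "correct": [0, 1, 0, 0, 1, 0],
})

# Empirical difficulty: proportion of incorrect first attempts per question.
empirical = (
    attempts.groupby(["question_id", "authored_difficulty"])["correct"]
    .mean()
    .rsub(1)  # 1 - proportion correct = proportion incorrect
    .rename("empirical_difficulty")
    .reset_index()
)

# Flag low/medium questions that students find as hard as the "hard" items.
hard_baseline = empirical.loc[
    empirical["authored_difficulty"] == "hard", "empirical_difficulty"
].mean()
flagged = empirical[
    (empirical["authored_difficulty"] != "hard")
    & (empirical["empirical_difficulty"] >= hard_baseline)
]
print(flagged)
```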

This process of analyzing data, discovering subpar results, investigating the circumstances behind those results, and identifying solutions is part of the learning engineering process we use every day in Research and Development. Carrying forward our origins at Carnegie Mellon University's Open Learning Initiative, the learning engineering process provides the structure to engage in iterative improvement in a meaningful way. Investigating subpar results and learning how to improve them is an act of learning engineering aimed at improving the learning experience for students.

Sharing research findings, even when they are not optimal, is part of an ethical commitment to transparency and accountability to the broader research and educational community. Positive findings provide validity and support for learning methods, but negative findings are also worth investigating so we can learn from what did not work. While we strive for effectiveness, we are also committed to learning from the data and making improvements because, at the end of the day, we must always do what is best for the learner.

 
1. Van Campenhout, R., Jerome, B., Dittel, J. S., & Johnson, B. G. (2021). Investigating adaptive activity effectiveness across domains: Insights into design best practices. In: Sottilare, R. A., & Schwarz, J. (eds.) Adaptive Instructional Systems. Design and Evaluation. HCII 2021. Lecture Notes in Computer Science, vol 12792, pp. 321–333. Springer, Cham. https://doi.org/10.1007/978-3-030-77857-6_22

2. Van Campenhout, R., Jerome, B., & Johnson, B. G. (2020). The impact of adaptive activities in Acrobatiq courseware: Investigating the efficacy of formative adaptive activities on learning estimates and summative assessment scores. In: Sottilare, R. A., & Schwarz, J. (eds.) Adaptive Instructional Systems. HCII 2020. Lecture Notes in Computer Science, vol 12214, pp. 543–554. Springer, Cham. https://doi.org/10.1007/978-3-030-50788-6_40

 
