Firecracker for Continuous Quality Improvement

Posted by Ben Williams on Jun 28, 2017 6:10:34 PM

Firecracker's evolution from student study aid to programmatic assessment system for health science schools.

Firecracker started with a focus on improving student learning by combining proven learning and memory science techniques like spaced repetition, active recall, pretesting, and interleaving with Big Data and Artificial Intelligence. In short, we look at what works best for similar students and recommend what the student should study or review, when they should study or review it, and what format it should be in (e.g. easy recall questions versus third-order questions with complex clinical vignettes). Student adoption, engagement, and outcomes have been very compelling. Firecracker now serves nearly 1 in 4 medical students in the United States, who collectively answer an average of 8,000,000 questions each month on our mobile applications and website. Students score nearly a standard deviation higher than their peers who don't use Firecracker -- and that's not because Firecracker users are studying more or are better test takers.
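We can't publish the details of our recommendation engine here, but for readers curious about the underlying memory science, here is a minimal, purely illustrative sketch of the spaced-repetition idea in Python. It uses the textbook SM-2-style interval update -- the names and constants below come from that public algorithm, not from Firecracker's system:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Illustrative only: a textbook SM-2-style scheduler, not Firecracker's engine.

@dataclass
class CardState:
    ease: float = 2.5          # multiplier reflecting how easy the card has proven
    interval_days: int = 1     # current gap between reviews
    due: date = field(default_factory=date.today)

def review(state: CardState, quality: int) -> CardState:
    """Update a card after a review graded 0 (forgot) through 5 (perfect recall)."""
    if quality < 3:
        # Failed recall: bring the card back quickly.
        state.interval_days = 1
    else:
        # Successful recall: grow the interval multiplicatively (the spacing
        # effect), then nudge ease up or down using the SM-2 update rule.
        state.interval_days = max(1, round(state.interval_days * state.ease))
        state.ease = max(1.3, state.ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    state.due = date.today() + timedelta(days=state.interval_days)
    return state
```

The key property is that every successful recall pushes the next review further out, while a lapse resets the card to a short interval -- which is what lets review time concentrate on the material a given student is actually at risk of forgetting.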

As faculty and administrator awareness of Firecracker grew, medical schools asked if they could get Firecracker for their entire student body. Over time, schools asked if we could customize Firecracker to support their curriculum and assessment strategy. Specifically, schools wanted us to align our content (topic summaries, easy review/retention questions, course questions of intermediate difficulty/integration, and licensing-exam-style second- and third-order clinical vignette questions) with what they taught and when they taught it -- class by class. They also wanted us to provide questions they could use for weekly formative quizzes, summative tests, and progress tests (i.e. multiple practice USMLE and COMLEX exams, starting the first year of school).

Their most frequently stated goals:

  1. Increase attendance and satisfaction by showing students that what they are learning in class is relevant to their licensing exams;
  2. Give Faculty the ability to recommend exactly when students should study material they aren’t covering in class, but that students need to know for their licensing exams;  
  3. Save Faculty time for high-value learning activities and student remediation by reducing the amount of time Faculty spend authoring and updating questions, question explanations, and course packs (Firecracker Topics can supplement, if not replace, course packs). Faculty will never be able to cover everything that licensing exams like the USMLE or COMLEX say is fair game, but they can help students learn more efficiently by recommending when students should learn this material on their own. Currently, students’ self-study is rarely aligned with what Faculty teach, when they teach it. Aligning what Faculty teach, and when, with a list of recommended topics for self-study is easy, and we believe doing so supports LCME’s self-directed learning requirement (Element 6.3). Our partners think about it as "Guided Self-Directed Learning.";
  4. Provide Faculty, Deans, tutors/TAs, and learning specialists with an easy way to see how students are doing as they progress through the curriculum, using a variety of measures (e.g. self-assessment effort, self-reported confidence, objective mastery, and the ability to apply knowledge to clinical scenarios and USMLE-style questions). Deans especially like the ability to compare students to both their classmates and peers across the country, since it helps them in discussions with Trustees, Board members, Alumni, and accreditors; and
  5. Help Faculty, curriculum review committees, Course Directors, Clerkship Directors, and Deans make proactive and informed decisions about what topics Faculty need to teach, and what students can learn on their own.

Since decision makers at schools are typically Deans -- and one of a Dean’s primary responsibilities is to make sure all their hard work on behalf of students and faculty is documented and that improvement processes are clearly articulated to accreditors like the LCME and COCA -- we are receiving a lot of interest in how our platform (and the comparative performance data its use produces) supports accreditation. This interest started when Deans realized that the data they were getting from student use of Firecracker and from our curriculum and assessment alignment process could be used to support not only formative quizzing and frequent student feedback that leaves enough time for remediation (e.g. LCME Element 9.7, “Formative Assessment and Feedback”), but also continuous quality improvement processes in general (LCME Element 1.1).

We’ve learned that a school’s ideal solution is to build a world-class “programmatic assessment” program. Tellingly, a multiple-regression analysis of factors leading to Severe Action Decisions (SADs) by the LCME found that Element 8.3 in Section 8 was the element most highly correlated with a SAD. Here is what the 2017 LCME guidelines say about Element 8.3 (“Curricular Design, Review, Revision/Content Monitoring”):

The faculty of a medical school are responsible for the detailed development, design, and implementation of all components of the medical education program, including the medical education program objectives, the learning objectives for each required curricular segment, instructional and assessment methods appropriate for the achievement of those objectives, content and content sequencing, ongoing review and updating of content, and evaluation of course, clerkship, and teacher quality. These medical education program objectives, learning objectives, content, and instructional and assessment methods are subject to ongoing monitoring, review, and revision by the faculty to ensure that the curriculum functions effectively as a whole to achieve medical education program objectives.

Van der Vleuten et al. (2015) define programmatic assessment as “an integral approach to the design of an assessment program with the intent to optimise its learning function, its decision-making function and its curriculum quality-assurance function."

Most schools’ current solution is to manually tie together data from curriculum mapping products (e.g. One45, LCMS+), data from school-administered assessments (ExamSoft, NBME CAS, NBME subject exams), and data from self-directed learning resources (e.g. Firecracker, USMLE World, MedU). None of these solutions by itself can deliver programmatic assessment and, moreover, the data derived from these sources is not normalized, which makes it statistically difficult to compare, integrate, and make decisions from. An integrated, technology-driven solution has the potential to save time and money, as well as improve student outcomes and ongoing data-driven decision making.
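To make the comparison problem concrete: one standard statistical remedy is to z-score each instrument against its own cohort distribution before integrating the results. The sketch below is illustrative only -- the source names and numbers are hypothetical:

```python
from statistics import mean, stdev

# Hypothetical data: one student's score on three differently scaled
# instruments, alongside the cohort's scores on each instrument.
sources = {
    "examsoft_block_exam": ([72, 80, 65, 90, 78], 84),        # percent correct
    "nbme_subject_exam":   ([61, 75, 70, 82, 68], 79),        # scaled score
    "firecracker_mastery": ([0.55, 0.70, 0.62, 0.81, 0.66], 0.74),  # 0-1 mastery
}

def z_score(cohort: list[float], score: float) -> float:
    """Express a score as standard deviations above/below the cohort mean."""
    return (score - mean(cohort)) / stdev(cohort)

# Once each instrument is on a common scale, scores can be compared
# side by side or combined into a single longitudinal progress measure.
for name, (cohort, score) in sources.items():
    print(f"{name}: z = {z_score(cohort, score):+.2f}")
```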

Firecracker’s goal is to make programmatic assessment easy by combining a proven learning system for students -- one with proven utility across the entire lifecycle of learning, starting on day one of class -- with a course/clerkship curriculum and assessment support system (institution-wide assessments can also be easily incorporated, as previously mentioned in the context of cumulative summative testing and progress testing). The end result is a programmatic assessment system capable of efficiently driving continuous curriculum quality assurance and improvement over time, and of producing a wealth of data analysis and reports that can be used internally and shared externally with accreditors if desired.

Moreover, Firecracker is a platform -- not merely a stand-alone learning, course support, or accreditation tool. Schools need not worry that working with Firecracker means not working with others; our goal is to integrate with anything that provides value to students, faculty, or administrators. Tellingly, we have both integrated student resources (e.g. we cite First Aid and Pathoma so students and faculty can use those resources alongside Firecracker) and pushed and pulled data and content to and from Faculty and Administrator tools (Learning Management Systems like Canvas, curriculum mapping tools like LCMS+, assessment distribution systems like ExamSoft, etc.).
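As a concrete (and deliberately simplified) illustration of the kind of pull integration described above, here is a minimal sketch against Canvas’s documented REST API. The domain and token are placeholders you would replace with your institution’s own, and this is a generic example rather than Firecracker’s actual integration code:

```python
import requests

# Placeholders: substitute your institution's Canvas domain and an API token
# issued by your Canvas administrator.
CANVAS_BASE = "https://your-school.instructure.com"
TOKEN = "YOUR_CANVAS_API_TOKEN"

def list_courses() -> list[dict]:
    """Pull the caller's course list via Canvas's documented REST API.

    Note: Canvas paginates results in practice; a production integration
    would follow the Link headers to fetch every page.
    """
    resp = requests.get(
        f"{CANVAS_BASE}/api/v1/courses",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # list of course objects (id, name, course_code, ...)

if __name__ == "__main__":
    for course in list_courses():
        print(course["id"], course.get("name"))
```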

Many schools are in the process of revamping their curricula, and Firecracker is well positioned to support these changes in a data-driven manner. Since we have historical data on student performance that we can share once we’re a school affiliate (a FERPA requirement), faculty and deans can look at student performance on their curricula to determine the best way to revamp them. Since our content is already aligned with USMLE, COMLEX, and other standards, when we align our content to a school’s curriculum, we help them determine when they cover which national learning objectives, whether there are any gaps, whether there are topics students are learning well on their own but perhaps not at the ideal time in the curriculum, and whether there are any redundancies. Redundancies can be good, especially for topic areas that students struggle with.
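As a toy illustration of the gap-and-redundancy analysis just described -- with entirely hypothetical topics and courses -- a first pass can be as simple as set arithmetic over a tagged curriculum map:

```python
from collections import Counter

# Hypothetical tagging: a set of national learning objectives, and a map of
# which topics each course in the school's curriculum actually covers.
national_objectives = {"acid-base", "renal physiology", "biostatistics", "pharmacokinetics"}
curriculum = {
    "MS1 Foundations":  {"acid-base", "renal physiology"},
    "MS1 Renal Block":  {"renal physiology"},
    "MS2 Pharmacology": {"pharmacokinetics"},
}

# Count how many courses touch each topic.
taught = Counter(topic for topics in curriculum.values() for topic in topics)

# Gaps: objectives no course covers (candidates for guided self-study).
gaps = national_objectives - set(taught)

# Redundancies: topics covered more than once. These may be intentional,
# e.g. deliberate reinforcement of topics students struggle with.
redundancies = {topic for topic, n in taught.items() if n > 1}

print("gaps:", gaps)                  # {'biostatistics'}
print("redundancies:", redundancies)  # {'renal physiology'}
```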

We have a robust dashboard with data reporting, exporting, mining, and analysis capabilities and will always do custom queries, data integrations, and analysis for our partners. That's how we've been building our next-generation health science learning platform -- by rolling up our sleeves and doing it manually at first and learning how partners use what we provide them. We think of ourselves as a service-enabled technology and content company and we wrote this post to start the conversation about what is possible in 2017.

We would love to hear what you think. Please don’t be shy, especially if you think we have something horribly wrong!

Looking forward to your thoughts,

 

- Ben Williams

CEO & Co-founder

Firecracker Inc.

about.me/bcwill

Topics: Accreditation
