
By Sarah Hansen, OCW Educator Project Manager
Assessing students’ learning is one of the most important things we do as educators. It’s also one of the most complicated. There’s a lot to consider:
- When will assessment happen? (Along the way? At the end of the course?)
- How will we collect useful information about student learning? (Through writing samples? Surveys? Online reading questions? Student self-assessments? Performance assessments? Something else?)
- How will we assess work that doesn’t have right and wrong answers, like creative writing or digital media projects?
- How will we assess work students complete in teams? (It’s hard enough to assess students individually! But we know collaboration is an essential skill—so how do we measure it in a way that’s fair to individuals?)
- How will we effectively communicate feedback to students? (Via rubrics? Written comments? Oral exams that function as educative conversations?)
- How will we use assessment to improve our own teaching? (When should curricular iteration occur?)
For every group of students, there’s a different combination of productive approaches to assessment that instructors need to configure. It’s a shape-shifting puzzle that can be exciting, exhausting, and downright addictive. If you’re an educator and you’re intrigued by “the assessment challenge,” you’re not alone. MIT instructors are thinking hard about measuring student learning, providing feedback, and improving their teaching based on what they learn through assessments. In the following short videos, six MIT instructors candidly share the assessment strategies they’ve been trying in their own classrooms:
- In 16.842 Fundamentals of Systems Engineering, students in a Small Private Online Course (SPOC) worked in teams to participate in an international imitation satellite design competition. Aero/Astro Professor Olivier de Weck shares how he assessed work students completed as teams, how he conducted online written and oral exams, and how he made use of students’ personal reflective memos to understand what they learned in the course.
- Elizabeth Choe gets into the nitty gritty of how she approached assessment and feedback in the creative context of 20.219 Becoming the Next Bill Nye, a course in which students conceptualized and produced educational videos (no multiple choice tests here!).
- Takako Aikawa discusses how she used a daily grading system and interview tests to provide students with feedback about their language learning in 21G.503 Japanese III. (You can view this video in Japanese, too.)
- In CMS.611 Creating Video Games, students worked in teams to develop games for a real client: The Red Cross/Red Crescent Climate Center. Sara Verrilli shares how instructors assessed these projects, emphasizing that students’ processes and project management skills were more important than the final products.
- Professor Joe Schindall opens up about grading in ESD.051J Engineering Innovation and Design, noting that students’ “passion of engagement” and their willingness to try new things were factors the instructors considered when assessing student learning in this Engineering Systems Division course.
- Professor Catherine Drennan shares how she uses clicker competitions to engage students and formatively assess learning in 5.111SC Principles of Chemical Science. (Spoiler Alert: Things get heated.)
Want more MIT instructor insights about assessment? Head over to our OCW Educator portal and click “Assessment.” Then filter your results by topic, such as feedback, formative assessment, performance assessment, student self-assessment, and more.
If you find a strategy on our site that helps you solve (or inspires you to think differently about) your assessment puzzle, we want to hear from you! We’ll share some of the trickiest puzzles with the most creative solutions on our Facebook page. Go!
via Solving the Assessment Puzzle — Open Matters