Another vital component of the project is the “flipped classroom,” a teaching tool already widely used on campus. The practice reverses the traditional structure, in which lectures happen in the classroom and problem-solving happens in the library or dorm room. Steven records his lectures and makes them available to students online, where they can be watched, paused, and revisited anytime. Class time is now spent working individually and in groups, with Steven on hand to answer questions.
“I’ve found that watching students solve problems right in front of me, and sharing suggestions or strategies in real time, is a more effective use of my skills,” he says. “It also helps with students who get it and have advanced questions.” This way, he’s also able to field any questions or issues that come up with the new Polar CGI component of his curriculum.
To explore the combined impact of polar data, CGI, and flipped classrooms on student learning, Steven and Lea need to evaluate the project’s results. “All the components of this project have been established as sound pedagogy, but the combination hasn’t been tested,” Steven explains. “So we’re aiming to evaluate how this works: Do students learn differently in such an environment? Does the method attract students who may not otherwise have been attracted to the sciences?”
The team knew they didn’t want to write an exam that simply tested knowledge. That wasn’t the point, since these modules are designed to get students to engage with the data and analyze it critically, to think. What kind of evaluation tool could show that a chemistry, economics, or graphics student is actually engaging with the information unique to their class?
Aedin and Max brought a valuable student perspective to the evaluation process. “We really wanted to focus on getting students to figure out how to ask questions,” Aedin says. “Doing real science at advanced levels is more creative and less rigid, and we wanted the questions to reflect that that’s how science actually happens.”
To that end, the team developed pre-assessment and post-assessment surveys. In the pre-assessment, students were asked to look at the polar data specific to the subject they were studying and pose, not answer, questions about it. Then they worked through the Polar CGI module created for their class. In the post-assessment, students responded to the same prompts again, doing as scientists do: asking questions and forming hypotheses both before and after grappling with data.
How does the team know whether it has achieved its goal of getting students to think critically, like scientists? “We’re looking for greater complexity in their questions,” Lea says.
The first test run of the evaluation process was a success. “It looked like, over time, the students’ answers became more sophisticated,” Aedin says, an indication that students were truly engaging with the information. Aedin called the method “really inclusive,” since it gets students to ask their own questions rather than answer a professor’s.
Aedin plans to continue working on the design of this process as part of her senior thesis. “I’ll be evaluating the surveys and figuring out how to code the answers,” she says. “I’ll code as many as I get back and hope to be able to quantify the success of the modules.” The external evaluator for the project, Candiya Mann from the University of Washington, will also be auditing the assessment results.
The NSF grant will allow the team to continue developing the Polar CGI modules for two years. They hope to double the number of participating professors in the next academic year, so that they can try it out with a wider range of curricula. The students are certainly hungry for it. “Climate change is something we care about,” says Max, “since we are poised to feel the effects of it within our lifetimes.”