Scientific Modeling Rubric Development Process
Dimension Development
We developed the first iterations of a modeling rubric that incorporated most of the aspects included in the literature frameworks discussed below. This led us to the following:
Dimensions of "use of models in science"
- Models are used to explain and predict
- Models are evaluated for their ability to represent and explain phenomena
- Phenomena can be represented by multiple models
- Distinguishing between a model and the actual phenomena that it represents
- Models are revised to take into account additional information related to the phenomena
Ultimately, we removed the dimensions “models are used to explain and predict” and “models are revised…” to focus on the more challenging aspects of models on which students may need more feedback. Model revision is also challenging, but in the context of a lecture course we did not envision many opportunities to revise models unless a class structure such as that of Bierema et al. (2017) was implemented.
Quality Definitions / Learner Progression
Based on our initial literature review, our rubric initially took a form that would allow us to see what students could tell us about using models, rather than what they could do when actually using models. For example, we iterated on the “sufficient understanding” criterion for the “Phenomena can be represented by multiple models” dimension as follows:
| Pre-iteration | Post-iteration |
| --- | --- |
| Indication that models change to depict different object properties or structural features | Multiple models of the same phenomenon are used to depict different object properties or structural features |
This type of change was enabled by thinking through the types of tasks we would want students to be doing. We wanted to assess students' ability to use models to explain phenomena, not to tell us how they would use a model to explain. This distinction was abstract for us at times, but we were able to clarify it by drawing on our own examples of using models to explain in authentic research settings, which also provided examples of nuanced engagement with and understanding of the practice. For example:
| Dimensions of “Use of Models in Science” | Authentic engagement in research context |
| --- | --- |
| Models are evaluated for their ability to represent and explain phenomena | “This model helps us understand receptor structural dynamics of agonist binding to a GPCR. The model is able to differentiate receptor dynamics in apo state and in presence of validated agonist” |
| Phenomena can be represented by multiple models | “This model is consistent with others of biased receptor signaling and agonist-bound static crystal structures” |
| Distinguishing between a model and the actual phenomena that it represents | “Model accounts for receptor dynamics in solution state assuming that gas-phase ionization retains HDX markers. Does not have amino-acid resolution and does not account for known co-receptor activation. Model uses biomimetic membrane to solubilize the receptor” |
We arrived at our final iteration before piloting with students:
| Dimension of “Use of Models in Science” | 1 Early Understanding | 2 Intermediate Understanding | 3 Nuanced Understanding |
| --- | --- | --- | --- |
| 1. Models are evaluated for their ability to represent and explain phenomena (Explanatory Power) | No connection is made between model features and the phenomenon of interest to evaluate the model’s ability to explain; the phenomenon is discussed only generally. | The original phenomenon is directly compared to the model of interest as evidence of explanatory power. | Connections are made between the model and specific aspects of the phenomenon that it is supposed to explain. |
| 2. Phenomena can be represented by multiple models (Multiple models are valid) | Only a single model is used to explain a phenomenon despite other useful models being available. | Multiple models of the same phenomenon are used to depict different object properties or structural features. | The relationships between multiple models are used to explain (bolster the explanation of) a phenomenon, acknowledging that they may all be valid at the same time. |
| 3. Distinguishing between a model and the actual phenomena that it represents (Assumptions and limitations) | The model is described as an exact replication of the phenomenon. | Salient features, traits, or properties of the model are described. | The model’s relationship to the phenomenon, including its limitations, is justified in order to contextualize an explanation; the model is acknowledged to be a short-hand representation, with limitations, that emphasizes the aspects of the phenomenon in question. |
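As a minimal sketch of how the final rubric might be applied, assuming one wanted to record ratings programmatically, the dimensions and quality levels can be encoded as a simple data structure. The condensed level descriptions and the score_response helper below are our own illustrative shorthand, not part of the rubric itself.

```python
# Illustrative sketch only: the rubric dimensions and quality levels encoded as a
# plain data structure. The short labels and the score_response helper are
# hypothetical conveniences, not part of the published rubric.

RUBRIC = {
    "Explanatory Power": {
        1: "No connection made between model features and the phenomenon of interest",
        2: "Original phenomenon directly compared to the model as evidence of explanatory power",
        3: "Connections made between the model and specific aspects of the phenomenon it explains",
    },
    "Multiple models are valid": {
        1: "Only a single model is used despite other useful models being available",
        2: "Multiple models of the same phenomenon depict different properties or features",
        3: "Relationships between multiple models bolster the explanation; all may be valid at once",
    },
    "Assumptions and limitations": {
        1: "The model is described as an exact replication of the phenomenon",
        2: "Salient features, traits, or properties of the model are described",
        3: "The model's relationship to the phenomenon, including limitations, is justified",
    },
}

LEVEL_NAMES = {1: "Early", 2: "Intermediate", 3: "Nuanced"}


def score_response(scores):
    """Map a {dimension: level} rating to readable feedback lines."""
    return [
        f"{dim} ({LEVEL_NAMES[level]}): {RUBRIC[dim][level]}"
        for dim, level in scores.items()
    ]


# Example: a response rated at the intermediate level on every dimension.
for line in score_response({
    "Explanatory Power": 2,
    "Multiple models are valid": 2,
    "Assumptions and limitations": 2,
}):
    print(line)
```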
Literature Search
Oh, P.S., and Oh, S.J. (2011). What Teachers of Science Need to Know about Models: An overview. International Journal of Science Education 33, 1109–1130.
This overview of the nature of models and their uses in science classrooms is an excellent review of the literature that breaks modeling down into subtopics. These subtopics were influential starting points for our rubric dimensions: meanings of a model, purposes of modelling, multiplicity of scientific models, change in scientific models, and uses of models in the science classroom. The paper also offers a useful working definition: a model is a “representation of a target and serves as a ‘bridge’ connecting a theory and a phenomenon.”
Grünkorn, J., zu Belzen, A.U., and Krüger, D. (2014). Assessing Students’ Understandings of Biological Models and their Use in Science to Evaluate a Theoretical Framework. International Journal of Science Education 36, 1651–1684.
This study empirically evaluated a theoretical framework of scientific modeling comprising five aspects, each divided into three levels: nature of models, multiple models, purpose of models, testing models, and changing models. The study posed questions related to each of these aspects to over one thousand students and reports exemplar student responses for each level of each aspect, including categories that emerged from the analysis. We drew heavily on this work because it provided both components of scientific modeling and levels of understanding, which informed our rubric dimensions and quality definitions. Additionally, the examples of student responses allowed us to anticipate how students could respond to our assessments and to revise the rubric accordingly.
Gogolin, S., and Krüger, D. (2018). Students’ understanding of the nature and purpose of models. Journal of Research in Science Teaching 55, 1313–1338.
The researchers in this study reported diagnostic information about students’ conceptions of models and modeling by asking students to what extent models represent a phenomenon and for what purposes particular models may be used. These questions related to two of the five aspects of a theoretical framework of model competence: “nature of models” and “purpose of models.”
Other research that influenced our rubric development included:
Krell, M., Reinisch, B., and Krüger, D. (2015). Analyzing Students’ Understanding of Models and Modeling Referring to the Disciplines Biology, Chemistry, and Physics. Research in Science Education 45, 367–393.
Schwarz, C.V., Reiser, B.J., Davis, E.A., Kenyon, L., Achér, A., Fortus, D., Shwartz, Y., Hug, B., and Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching 46, 632–654.
Bierema, A.M.-K., Schwarz, C.V., Stoltzfus, J.R., and Pelaez, N. (2017). Engaging Undergraduate Biology Students in Scientific Modeling: Analysis of Group Interactions, Sense-Making, and Justification. CBE-Life Sciences Education 16, ar68.