Sridhar, Navin

Navin Sridhar is a PhD student in the Department of Astronomy at Columbia University. His current research focuses on the plasma physics around black holes and neutron stars, which he studies by developing, testing, and applying established theories in plasma simulations on supercomputers, and through direct observations with satellite and ground-based telescopes. Navin's intent in attending the PDP was not only to learn effective teaching methods, but also to learn how these skills can be transferred effectively to the non-academic sector. He believes that the skills he developed through his PDP experience will be a valuable asset when designing equitable, inquiry-driven courses and curricula.


navin.sridhar@columbia.edu


Teaching Activity Summary

Name of Teaching Activity: "To err is human - Measurement uncertainties and significant measurements"

Teaching Venue: UCLA Physics/Astronomy PREP

Teaching Date: September 25, 2019

Learners: 20 undergraduate students.

Reflection on teaching and assessing the practices of science or engineering:

The STEM practice we concentrated on was 'constructing explanations'. In physics and astronomy, we construct explanations to account for our observations and experiments all the time, so it is an authentic STEM practice. Being able to construct an explanation from data, and to use reasoning to connect that explanation to a broader claim, is a critical component of any experimental science. Requiring students to construct scientific explanations to support their claims (and to develop arguments defending those claims with data) has been shown to enhance students' understanding of scientific content (Zohar et al., 2002). In addition, constructing and arguing for explanations is a skill that is not merely content specific: it applies in a wide range of circumstances and transfers to non-STEM fields (Kuhn et al., 1993; Yeh et al., 2001).

To encourage the critical thinking required to construct explanations, our teaching activity let the students come up with their own questions about a few scenarios/datasets. Each question was followed by a claim/hypothesis that the students made based on a first look at their data. The students then examined their claims with logical and statistical tools, and this analysis later became part of their explanation of why the claim was scientifically sound or flawed.

Important as this practice is to many facets of critical thinking, learners commonly encounter difficulties while working through the process of constructing logical explanations. McNeill et al. (2006) provide a useful summary of the difficulties students typically face when constructing explanations:
● Not being able to use appropriate data to support a claim, having trouble identifying which data are useful and which are not, or being unsure of what constitutes "sufficient evidence".
● Confirmation bias: students are more likely to discard data that do not match their theories/expectations (Chinn et al., 1991; Koslowski, 1996).
● Verbal discussions are often dominated by the claim itself, rather than by building up evidence that supports the claim (Jiménez-Aleixandre et al., 2000).
● In astrophysics, constructing explanations often requires mathematical calculations as a basis, and students may be uncomfortable with them.
● Understanding calculational and statistical principles does not automatically mean students can apply them to construct correct explanations; a certain physical intuition is also required.

We tried to address some of these difficulties by conducting a pre-activity survey designed to gauge students' prior knowledge (e.g., mathematical/statistical background), which allowed us to plan our facilitation strategy. The three chief dimensions of practice (our rubric) that we concentrated on during the activity were as follows:
● Claim: A claim is proposed that answers the original question.
● Evidence: Relevant data are used to support the claim.
● Reasoning: Reasoning links the evidence to the claim.
We checked whether relevant data or observations were used to support each claim. For example, students were required to justify, using confidence intervals, a claim about whether their data contained a significant detection (say, of an atomic/molecular emission line). Here the data in hand formed the evidence, and the statistical analysis (weighted average, standard deviation, confidence level, etc.) formed the reasoning required to construct the explanation. Our content and practice goals were thereby designed to go hand in hand.
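To make the evidence-to-reasoning link concrete, here is a minimal Python sketch of the kind of calculation involved: an inverse-variance weighted average of repeated measurements and the resulting detection significance. The flux values, uncertainties, and continuum level below are hypothetical, chosen purely for illustration (they are not data from the activity), and Gaussian errors are assumed.

```python
import numpy as np

# Hypothetical repeated flux measurements (arbitrary units) at the
# wavelength of a candidate emission line; values are illustrative only.
flux = np.array([5.2, 4.8, 5.5, 5.1, 4.9])   # measured fluxes
sigma = np.array([0.4, 0.5, 0.3, 0.4, 0.6])  # per-measurement 1-sigma errors
continuum = 4.0                              # assumed baseline (continuum) level

# Inverse-variance weighted average and its standard error
weights = 1.0 / sigma**2
mean = np.sum(weights * flux) / np.sum(weights)
mean_err = np.sqrt(1.0 / np.sum(weights))

# Detection significance: excess above the continuum in units of sigma
significance = (mean - continuum) / mean_err

print(f"weighted mean flux = {mean:.2f} +/- {mean_err:.2f}")
print(f"excess above continuum = {significance:.1f} sigma")

# Under Gaussian errors, ~2 sigma corresponds to ~95% confidence and
# ~3 sigma to ~99.7%, a common threshold for claiming a detection.
```

A calculation along these lines lets a student state a claim ("the line is detected"), point to the evidence (the measurements and their uncertainties), and supply the reasoning (the weighted mean lies several standard errors above the continuum).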

Our activity's Culminating Assessment Task (CAT) was also designed to mimic authentic STEM practice: individual paper writing followed by anonymized 'peer review', and the dissemination of science to a general audience through a poster presentation. We handed the practice rubric, with the three dimensions mentioned above, to the students before their individual paper-writing segment. In addition, we shared our expectations of what qualities characterize a nuanced, good, fair, or poor understanding of the concepts. Students used this practice rubric for their CATs and learned about their mistakes from their peers through the peer review (in addition to facilitator interactions). They received this feedback before the poster jigsaw presentation, which gave them an opportunity to improve. In this way, we also avoided a hostile learning environment by placing our emphasis not on 'grading' but on 'reviewing', and on learning from other teams' strengths and weaknesses. A 'no one is being judged' policy was adopted.