Hosek, Matt

Matt Hosek is a postdoc in the astronomy department at UCLA. He is part of the UCLA Galactic Center Group and studies star formation close to the supermassive black hole at the center of the Milky Way. As a three-time participant in the PDP program, Matt is excited to use the inquiry-based learning concepts he has learned in future teaching opportunities, and apply the group management and leadership skills he has gained to future projects. 

mwhosek@gmail.com


Teaching Activity Summary

Name of Teaching Activity: "To Err is Human"

Teaching Venue: UCLA Physics/Astronomy PREP, September 25, 2019

Learners: 22 undergraduate students.

Reflection on teaching and assessing the core science or engineering concept:

Our content goal was for learners to understand the quantitative relationship between error bars and confidence intervals in order to quantify the significance of an event, while weighting individual data points by their uncertainties. This is a critical concept: all scientific data contain measurement uncertainties, so scientists must properly account for them when interpreting data, no matter what field they pursue. Further, while measurement uncertainty has been identified as a core concept in physics education, students' exposure to it in traditional physics curricula is often limited and lacking (Wilson et al. 2010; http://openjournals.library.usyd.edu.au/index.php/IISME/article/viewFile/4686/5474).

Our activity addressed this content by presenting students with simulated data sets from a variety of experiments (exoplanet transits, galaxy spectra, and blood pressure monitoring) and asking them to draw a conclusion based on an "event" they observe in the data (e.g., a transit dip, a spectral line, a blood pressure drop). The data sets were designed so that each data point had an associated measurement uncertainty, the size of which varied across the data set. The students needed to take these uncertainties into account when measuring their "event" and quantify the statistical significance of their detection to justify their conclusion. This task encompassed the three elements of our content rubric: 1) interpreting error bars mathematically as confidence intervals (assuming Gaussian errors), 2) using error-based weights when calculating the average baseline and signal depth from the individual data points, and 3) justifying a detection claim using statistical significance.
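To illustrate the calculation the rubric asks for, here is a minimal sketch of an inverse-variance weighted mean applied to a simulated "transit" data set. All variable names and numbers below are hypothetical stand-ins, not the actual activity data, and the 1/uncertainty^2 weighting is one concrete choice of the inverse error weighting discussed above.

```python
import numpy as np

# Hypothetical simulated flux measurements with per-point uncertainties
# (illustrative values only; not the actual activity data sets)
baseline_flux = np.array([1.00, 0.98, 1.02, 1.01])
baseline_err  = np.array([0.02, 0.05, 0.03, 0.02])
dip_flux      = np.array([0.90, 0.92, 0.89])
dip_err       = np.array([0.03, 0.04, 0.03])

def weighted_mean(values, errors):
    """Inverse-variance weighted mean and its uncertainty.

    Points with smaller error bars get larger weights (w = 1/sigma^2),
    and the uncertainty of the mean is sqrt(1 / sum of weights).
    """
    weights = 1.0 / errors**2
    mean = np.sum(weights * values) / np.sum(weights)
    mean_err = np.sqrt(1.0 / np.sum(weights))
    return mean, mean_err

# Weighted baseline level and weighted event ("dip") level
base, base_err = weighted_mean(baseline_flux, baseline_err)
dip, dip_err_tot = weighted_mean(dip_flux, dip_err)
print(f"baseline = {base:.3f} ± {base_err:.3f}")
print(f"dip      = {dip:.3f} ± {dip_err_tot:.3f}")
```

A noisy point with a large error bar barely shifts the weighted mean, which is exactly the behavior the activity is trying to surface.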

We anticipated several difficulties during this activity, since many of the students had never dealt with measurement uncertainties before. We identified two large conceptual leaps that we addressed via thinking tools during the activity: 1) the concept of a confidence interval and its mathematical description assuming a Gaussian distribution, and 2) propagation of errors (not part of our content goal, but needed when calculating the statistical significance while accounting for the uncertainty in both the baseline and the event detection itself). Other difficult spots we anticipated addressing during facilitation were: failing to consider that different data points have different levels of uncertainty when calculating a trend line (i.e., the need for a weighted average); how to weight a calculation mathematically (we wanted students to understand that the weights needed to be inversely related to the errors, but did not insist on 1/uncertainty^2 specifically); and claiming that a detection is significant simply because the difference is larger than the error bars (i.e., without quantifying the significance in terms of confidence).
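The error-propagation step can be sketched in a few lines. The numbers below are hypothetical, and the quadrature sum assumes uncorrelated Gaussian uncertainties in the baseline and event levels, consistent with the Gaussian assumption above.

```python
import numpy as np

# Hypothetical weighted-average baseline and event levels, each with
# its own uncertainty (illustrative numbers only)
baseline, baseline_err = 1.00, 0.01
event, event_err = 0.90, 0.02

# Propagation of errors for a difference: uncorrelated Gaussian
# uncertainties add in quadrature
depth = baseline - event
depth_err = np.sqrt(baseline_err**2 + event_err**2)

# Statistical significance of the detection, in units of sigma
sigma = depth / depth_err
print(f"depth = {depth:.3f} ± {depth_err:.3f} ({sigma:.1f} sigma)")
```

This makes the last difficult spot concrete: a dip larger than any single error bar is not automatically significant; the claim rests on the size of the depth relative to its propagated uncertainty.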

In the culminating assessment task, we found that students achieved two of our three content goals, namely understanding the connection between error bars and confidence intervals and quantifying the statistical significance of their event. However, many groups did not quite get to error weighting without heavy-handed facilitation. We feel this was due to the very limited time we had for the activity (only ~100 minutes of investigation time, due to scheduling constraints). In the future, I think this activity needs at least 30-45 more minutes of investigation time for the students to absorb this aspect of the activity.