Inquiry Framework and Indicators

Project team members

Lisa Hunter, Anne Metevier, Scott Seagroves, Barry Kluger-Bell

Project Description

“Inquiry” is one of ISEE’s three main themes and is central to nearly all of ISEE’s work. From ISEE’s very first workshop in 2001 (when ISEE was the Center for Adaptive Optics (CfAO) Education & Human Resources program), understanding inquiry teaching and learning and designing effective inquiry learning experiences have been an unwavering focus. Inquiry is a shared goal that has endured through many cohorts of participants and many changes to the PDP, and most would say it is the essence of ISEE. In 2001, ISEE collaborated with the Exploratorium Institute for Inquiry (IfI), which had extensive experience with the K-12 community, to bring an inquiry professional development workshop to a new audience: graduate students and postdocs. ISEE continues to draw from the IfI model for inquiry to this day, and has developed an extensive set of new tools, workshops, and strategies that make up ISEE’s infrastructure. Along the way, we have gained great insight into what inquiry is, what it looks like across a wide range of STEM disciplines, how to assess learning in an inquiry environment, and why it is included in essentially every recent recommendation for improving science education[i],[ii],[iii],[iv],[v]. We have included engineering in our definition of inquiry, which has been a productive and challenging endeavor that has spun out a suite of other ISEE projects. We provide our argument for the importance of inquiry in our publications[vi],[vii],[viii], and here include a summary of the development of our own inquiry framework.

Six Elements of Inquiry

In ISEE’s earlier years, we strove to articulate what a well-designed inquiry learning experience is, in a way that our community could put into practice. For example, we realized that our participants were designing activities in which learners were essentially going through the motions of inquiry practices, rather than gaining competency that was more nuanced and part of a developmental progression. A number of definitions and frameworks developed by others had elements that resonated well with our community, but we were unable to find a tool that supported our expansive work and our emphasis on higher education. As ten years of work through the NSF CfAO (AST-9876783) concluded, we wished to disseminate what we had learned, and new projects prompted us to turn our implicit working definition of inquiry into one that could be used more broadly in practice. An NSF Course, Curriculum and Laboratory Improvement award (DUE-0816754) expanded our work to support course development. A second award from NSF (AST-0836053), funded through the American Recovery and Reinvestment Act, included a major role for ISEE in supporting the development of a new inquiry-intensive engineering technology degree program. The outcome was our own inquiry framework, which includes six elements that we look for in all ISEE-developed learning activities:

  • The learning activity is focused on developing learners’ competency with cognitive STEM practices
  • The learning activity is focused on core concepts
  • Students’ use of cognitive STEM practices and gaining conceptual understanding are intertwined
  • The learning activity mirrors authentic science and/or engineering
  • Students have ownership over their learning and gaining competency
  • Students use evidence to explain their work throughout the activity

These six elements are used to frame ISEE’s Professional Development Program (PDP), to guide PDP participants in designing their own inquiry activities, and to evaluate the success of the PDP. For example, within the PDP, an inquiry activity is intended to teach both conceptual understanding of core content and cognitive STEM practices. Cognitive STEM practices include reasoning processes, such as developing explanations or justifying solutions, which are highly valued but, in our conception of inquiry, need to be learned in the context of relevant scientific content, as they are practiced in authentic science or engineering. When a member of our community begins designing an activity that appears to be lacking in either scientific content or practices, referring to the six elements quickly makes this weakness apparent and gives our instructors a way to provide concrete feedback, or the activity designer a way to self-assess and revise.

Inquiry Indicators

With support from NSF grants (DUE-0816754, AST-0836053), we expanded upon our framework to develop a new tool to objectively measure PDP outcomes, and help PDP participants design inquiry activities. We designed and piloted a protocol (involving “inquiry indicators”) to measure how participants’ inquiry activities aligned with our inquiry framework. The inquiry indicators comprise a checklist of elements one could observe in participant-generated artifacts, such as lesson plans, post-teaching reports, instructional materials, and other forms of documentation. Examples include: a clearly articulated learning outcome focused on a core concept, multiple ways that learners can investigate a question, and a culminating assessment task that prompts learners to use evidence to explain their findings.
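As a rough illustration of the checklist style of this protocol, the sketch below scores a design’s artifacts against a set of indicators. The indicator names are hypothetical placeholders, not ISEE’s actual list, and the percentage score is only meant to show how a checklist yields a proficiency measure.

```python
# Illustrative sketch of checklist-style scoring of an activity design.
# The indicator names below are hypothetical examples, not ISEE's actual list.

INDICATORS = [
    "learning outcome focused on a core concept",
    "multiple ways learners can investigate a question",
    "culminating task prompts evidence-based explanation",
    "activity mirrors authentic science or engineering",
    "learners have ownership over investigation choices",
]

def indicator_score(observed: set) -> float:
    """Percentage of indicators observed in a design's artifacts."""
    hits = sum(1 for indicator in INDICATORS if indicator in observed)
    return 100.0 * hits / len(INDICATORS)

# A hypothetical design whose artifacts show four of the five indicators:
observed_in_artifacts = set(INDICATORS[:4])
score = indicator_score(observed_in_artifacts)
print(f"score = {score:.0f}%")       # 4 of 5 indicators -> 80%
print("proficient:", score >= 70.0)  # 70% proficiency threshold
```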

A main goal of the inquiry indicators was to create an objective way to evaluate the effectiveness of our professional development that did not require direct observation of teaching. We reviewed other protocols, but they did not meet our specific needs and constraints, nor did they have the granularity to apply at the activity design level. Most protocols are aimed at inquiry in a K-12 classroom and require classroom observation. We needed a protocol that could be used with geographically distributed teaching teams, and that assessed inquiry in higher education STEM with the subtlety with which we have defined it. For example, we push our participants to go beyond an intended inquiry learning outcome such as “generating explanations” to something like “generating explanations that integrate and account for results from multiple experiments.” Our experience has taught us that clearly articulating an assessable inquiry learning outcome is essential, and teams that accomplish this generally produce strong designs. Thus, participant design artifacts are data that reveal important outcomes, and they can be reviewed without resource-intensive classroom observations.

To pilot our inquiry indicators, we reviewed and coded participant artifacts for a cohort of PDP participants. We consider an activity design to demonstrate proficiency in inquiry if it scores at least 70% on the inquiry indicator analysis. In the cohort we analyzed, nine of fourteen activity designs scored at least 70%, a close match with PDP instructors’ anecdotal observations. In addition, we ran correlations between the inquiry indicator score, participants’ self-reported perceptions of how their designs met the six inquiry elements, and team leaders’ year in the PDP. Our dataset was small (n=14), and only weak to moderate correlations were found, but there was a moderate positive correlation between the score and the leaders’ year in the PDP (r = 0.46, p < 0.10). This matches what we have observed for many years: it is usually in the second cycle of the PDP that participants gain a solid understanding of how to design an inquiry activity. Interestingly, participants’ self-reported assessments of the designs were weakly anti-correlated with this, bolstering the finding that an objective measurement is needed to assess an instructor’s implementation of new teaching practices. Others studying the effectiveness of professional development programs have also found that participants’ self-reported teaching practices do not necessarily match what they actually do.[ix]

Current and Future Work

We are currently revising our inquiry indicators to adapt them to analysis of new and revised artifacts produced by our participants. For example, we now require that participants fill out a design and lesson plan (a document that outlines each component of the activity, along with rationales for design choices and in-the-moment teaching moves), which is an excellent source of information about participant designs and has stimulated the identification of new indicators. In addition, our piloting provided us with new ideas for prompts that would more consistently elicit the indicators from our participants.

The inquiry framework is now being used to help mentors develop productive projects for student interns.

A revised set of indicators is in development and will be used in 2014.

Publications and products

Inquiry framework originally described in: Cultivating Scientist- and Engineer-Educators 2010: The Evolving Professional Development Program

Current version of inquiry framework



[i] AAAS Project 2061, Rutherford, F.J., and Ahlgren, A., 1989. Science For All Americans, Oxford University Press.

[ii] National Research Council, National Committee on Science Education Standards and Assessment, 1996. National Science Education Standards, National Academies Press.

[iii] National Research Council, Olson, S., and Loucks-Horsley, S., eds., Committee on the Development of an Addendum to the National Science Education Standards on Scientific Inquiry, 2000. Inquiry and the National Science Education Standards: A Guide for Teaching and Learning, National Academies Press.

[iv] President’s Council of Advisors on Science and Technology, Feb. 2012. “Engage to Excel: Producing One Million Additional College Graduates With Degrees in Science, Technology, Engineering, and Mathematics.”

[v] American Association for the Advancement of Science, 2011. Vision and Change: A Call to Action, Washington, DC.

[vi] Hunter, L., Metevier, A.J., Seagroves, S., Porter, J., Raschke, L.M., Kluger-Bell, B., Brown, C., Jonsson, P., and Ash, D., 2008. “Cultivating Scientist- and Engineer-Educators: The CfAO Professional Development Program”.

[vii] Hunter, L., Metevier, A.J., Seagroves, S., Kluger-Bell, B., Porter, J., Raschke, L.M., Jonsson, P., Shaw, J., Quan, T.K., Montgomery, R., 2010. “Cultivating Scientist- and Engineer-Educators 2010: The Evolving Professional Development Program” in Learning from Inquiry in Practice, L. Hunter & A.J. Metevier, eds. ASP Conference Series 436: 3.

[viii] Ball, T., and Hunter, L., 2010. “Using Inquiry to Develop Reasoning Skills and to Prepare Students to Take Initiative in a Research Setting: Practical Implications from Research” in Learning from Inquiry in Practice, L. Hunter & A.J. Metevier, eds. ASP Conference Series 436: 490.

[ix] Ebert-May, D., Derting, T.L., Hodder, J., Momsen, J.L., Long, T.M., and Jardeleza, S.E., 2011. “What We Say is Not What We Do: Effective Evaluation of Faculty Professional Development Programs,” BioScience, 61(7): 550-558.