Volume 21 Issue 8 (August 2011)

GSA Today

Article, pp. 26–28

GROUNDWORK:

Revisiting the Geoscience Concept Inventory: A call to the community

J.C. Libarkin1, E.M.G. Ward1, S.W. Anderson2, G. Kortemeyer3, S.P. Raeburn4

1 Dept. of Geological Sciences, Geocognition Research Lab, Michigan State University, 206 Natural Science, East Lansing, Michigan 48824, USA
2 MAST Institute, University of Northern Colorado, 1210 Ross Hall, Campus Box 123, Greeley, Colorado 80639, USA
3 Lyman Briggs College, Michigan State University, East Lansing, Michigan 48825, USA
4 Division of Science & Mathematics Education, Michigan State University, 103 N Kedzie Lab, East Lansing, Michigan 48824, USA

Abstract

The use of concept inventories in science and engineering has fundamentally changed the nature of instructional assessment. Nearly a decade ago, we set out to establish a baseline for widespread and integrated assessment of entry-level geoscience courses. The result was the first Geoscience Concept Inventory (GCI v.1.0). We are now retiring GCI v.1.0 and rebuilding the GCI as a more community-based, comprehensive, and effective instrument. We are doing this in the hope that GCI users, many of whom have expressed a need for a revised and expanded instrument, and the geoscience community at large will view it as a springboard for collaborative action and engagement. If we work together, the geosciences have the potential to evaluate learning across our community and over time.

Manuscript received 12 Oct. 2010; accepted 29 Mar. 2011

doi: 10.1130/G110GW.1

Introduction

The Geoscience Concept Inventory (GCI; Fig. 1) was developed to diagnose conceptual understanding and assess learning in entry-level geoscience courses. The GCI has become a staple in many classroom-based research studies, is being revised for use in pre-college settings, and has been shown to discriminate between experts and novices. Although a valuable research tool, the GCI is in need of an expansion that can only be accomplished by a community of geoscientists and educators working together. This paper is a call for that collaboration.

Figure 1. Development cycle for the Geoscience Concept Inventory (from Ward et al., 2010) and an exemplar GCI question.

The GCI holds a unique place in the concept inventory world for several reasons. First, the GCI is the only concept inventory for higher-education science to offer a bank of correlated questions (Libarkin and Anderson, 2006). Through this correlation, users of the GCI can create course-specific subtests rather than being tied to a single set of questions.
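
To illustrate what this correlation provides in practice, the sketch below shows how items calibrated to a common Rasch scale allow two different subtests to yield ability estimates on the same scale. It is a minimal, hypothetical example: the item difficulties, student responses, and estimation routine are invented for illustration and are not part of the GCI.

    # A minimal sketch of the idea behind a Rasch-calibrated question bank: items share
    # a common difficulty scale, so ability estimates from different course-specific
    # subtests remain comparable. All difficulties and responses here are hypothetical.
    import math

    def rasch_prob(theta, b):
        """Probability of a correct response under the one-parameter (Rasch) model."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def estimate_ability(responses, difficulties, iterations=25):
        """Maximum-likelihood estimate of ability theta via Newton-Raphson.

        responses    -- 0/1 scores for the items a student actually took
        difficulties -- Rasch difficulties of those items on the common bank scale
        """
        theta = 0.0
        for _ in range(iterations):
            probs = [rasch_prob(theta, b) for b in difficulties]
            gradient = sum(x - p for x, p in zip(responses, probs))   # d logL / d theta
            curvature = -sum(p * (1.0 - p) for p in probs)            # d2 logL / d theta2
            theta -= gradient / curvature
        return theta

    # Two hypothetical subtests drawn from the same calibrated bank:
    subtest_a = {"difficulties": [-1.2, -0.3, 0.4, 1.1], "responses": [1, 1, 1, 0]}
    subtest_b = {"difficulties": [-0.8, 0.0, 0.9, 1.6, 2.0], "responses": [1, 1, 0, 0, 0]}
    for name, sub in (("A", subtest_a), ("B", subtest_b)):
        theta = estimate_ability(sub["responses"], sub["difficulties"])
        print(f"Subtest {name}: estimated ability = {theta:+.2f} on the common scale")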

Second, the GCI contains single-response, two-tier, and multiple-response multiple-choice questions. Two-tier questions offer added insight into student thinking by requesting an explanation for each response (Treagust, 1988). Multiple-response questions, essentially a set of true/false items, are generally more difficult than typical single-response items and are cognitively similar to free-response questions, offering deeper insight into cognition (Kubinger and Gottschall, 2007).
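
The sketch below illustrates how these three formats can be scored. The rules shown (all-or-nothing credit for a two-tier item, fractional credit for a multiple-response item treated as a set of true/false judgments) are one common convention rather than the GCI's actual scoring scheme, and the answer keys are invented.

    # Illustrative scoring rules for the three question formats described above.
    # These are a sketch of common practice, not the GCI's scoring scheme; keys are hypothetical.

    def score_single_response(chosen, key):
        """Conventional multiple choice: full credit only if the chosen option matches the key."""
        return 1.0 if chosen == key else 0.0

    def score_two_tier(answer, reason, answer_key, reason_key):
        """Two-tier item: credit only when both the answer and its explanation are correct."""
        return 1.0 if (answer == answer_key and reason == reason_key) else 0.0

    def score_multiple_response(chosen, key, n_options):
        """Multiple-response item as a set of true/false judgments:
        each option is judged right or wrong, and the item score is the fraction correct."""
        correct_judgments = sum(
            1 for option in range(n_options)
            if (option in chosen) == (option in key)
        )
        return correct_judgments / n_options

    print(score_single_response("B", "B"))                # 1.0
    print(score_two_tier("B", "iii", "B", "ii"))          # 0.0 -- right answer, wrong reason
    print(score_multiple_response({0, 2}, {0, 2, 3}, 5))  # 0.8 -- four of five judgments correct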

Third, GCI questions were developed from ideas that both experts and novices found important for entry-level geoscience courses. A review of textbooks provided initial ideas about important concepts for inclusion on the GCI, while open-ended interviews with students provided additional topics (Libarkin and Anderson, 2005). For example, in-depth interviews suggest that students conflate gravity and magnetism and overestimate the influence of magnetic fields on the movement of large objects. Addressing this conflation and mis-scaling is important for student understanding of geomagnetism and its effects, a discovery that only became apparent after considering the student perspective.

The Need to Revise and Expand the GCI as a Community

The original GCI questions were piloted with up to 5,000 students enrolled at >40 institutions nationwide, and the current version is in use by >200 faculty and researchers. The GCI has been used to estimate learning in geoscience courses, including evaluation of specific instructional approaches (e.g., Kortz et al., 2008) and analysis of learning (e.g., Petcovic and Ruhf, 2008). In ongoing work, GCI scores have been shown to correlate strongly with geological mapping ability, suggesting that the GCI, a measure of foundational knowledge, can be used as a skills measure to predict performance on an expert task. While we are encouraged that GCI v.1.0 was useful in these studies, we acknowledge that the instrument embeds our own biases and limitations. As many of our colleagues have stated, the GCI is both an effective instrument for gauging learning in entry-level geoscience courses and a test in need of revision.
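
For context, learning in such studies is often summarized with a pre/post comparison. The sketch below computes a normalized gain, one widely used convention for concept inventory data; it is not necessarily the analysis used in the cited studies, and the class averages are invented.

    # Normalized gain from pre/post concept inventory scores (a common convention,
    # shown here only as a sketch); the class averages below are hypothetical.

    def normalized_gain(pre_percent, post_percent):
        """Fraction of the possible improvement actually realized between pre- and post-test."""
        if pre_percent >= 100.0:
            return 0.0  # no room left to improve
        return (post_percent - pre_percent) / (100.0 - pre_percent)

    pre, post = 42.0, 63.0  # hypothetical class-average percent correct on a GCI subtest
    print(f"normalized gain = {normalized_gain(pre, post):.2f}")  # 0.36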

The diversity of geoscience courses at all levels should be reflected in the assessment instruments used to evaluate learning nationwide. Expansion to more complex, wider-ranging questions will allow replicable assessment in advanced courses and across geoscience programs. Meeting the critical need for questions targeted toward upper-level courses requires community effort. Experts who know where students struggle with complex ideas, such as feedback in global systems, are needed to write, review, and test new questions.

The current effort to revise and expand the GCI is a community endeavor. This interdisciplinary and collaborative approach addresses the limitations that are otherwise inherent in any tool generated for an entire field by a single development team: (1) education technology specialists with expertise in online assessment, together with geocognition researchers, oversee question dissemination, community feedback and question submission, and online data collection; (2) self-selected geologists, science educators, and instrument developers participate as reviewers and authors of new questions; and (3) the GCI development team analyzes student response and interview data to establish instrument validity and reliability.

We have been collecting comments from users and reevaluating the GCI against existing standards for instrument design (e.g., Moreno et al., 2006). Based on this examination, we have generated a revised version of the GCI. This version, GCI v.2.1, is available through the GCI WebCenter at gci.lite.msu.edu. We invite the community to contribute to its ongoing use and development through:

  1. Reviewing GCI questions. Reviews and comments on existing questions and those proposed for inclusion are needed.
  2. Proposing new areas for GCI development. The existing GCI covers only limited topics, and inclusion of questions from atmospheric sciences, geophysics, planetary science, and other fields is needed.
  3. Becoming authors of the GCI. Contributors become co-authors of the instrument. Guidelines are available at the GCI WebCenter (gci.lite.msu.edu; Libarkin and Ward, 2011). Revisions initiate expert review and statistical analysis of student responses. The development of new GCI questions takes at least six months from submission to validation, with continual change in response to community needs.
  4. Assessing student learning. Online testing can generate a national sense of student learning, as well as link learning to instruction. The GCI is being migrated to a new system (LectureTools) that will offer automated feedback of results, analysis of overall course performance, and summaries of student conceptual difficulties. Anonymous data collected from courses across our community will also be accessible; a sketch of the kind of per-item summary such a system could produce follows this list.
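
As a concrete, though hypothetical, illustration of point 4 above, the sketch below computes a per-item proportion correct from anonymized responses and orders items from most to least difficult. The item identifiers and responses are invented and do not represent actual GCI results.

    # Hypothetical sketch of a course-level summary of student conceptual difficulty:
    # per-item proportion correct from anonymized responses, hardest items listed first.
    from collections import defaultdict

    # Each record: (item_id, 1 if answered correctly, else 0); data are invented.
    responses = [
        ("GCI-07", 1), ("GCI-07", 0), ("GCI-07", 1),
        ("GCI-16", 0), ("GCI-16", 0), ("GCI-16", 1),
        ("GCI-23", 1), ("GCI-23", 1), ("GCI-23", 1),
    ]

    totals = defaultdict(lambda: [0, 0])  # item_id -> [number correct, attempts]
    for item, correct in responses:
        totals[item][0] += correct
        totals[item][1] += 1

    print("item     % correct")
    for item, (n_correct, attempts) in sorted(totals.items(),
                                              key=lambda kv: kv[1][0] / kv[1][1]):
        print(f"{item}   {100 * n_correct / attempts:5.1f}")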

We encourage anyone with a stake in teaching and learning to become involved with the GCI as a reviewer, developer, or user. This involvement is vital for any assessment initiative that serves an entire community. Careful consideration of evidence, such as that offered by the GCI, is a first step to answering calls for overall improvement of instruction (e.g., COSEPUP, 2006). The quality of assessment can only rise to the level of the tools used, and everyone has a stake in ensuring that assessment instruments continually improve. We invite our community to join us as co-authors of the GCI.

Acknowledgments

We thank all GCI users and colleagues who graciously provided constructive feedback, encouraged us in this new initiative, and are poised to become co-authors on a community GCI. The GCI was funded by the U.S. National Science Foundation (NSF) through grants DUE-0127765, DUE-0350395, DGE-9906479, DUE-0717790, and DUE-0717589. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the NSF.

References Cited

  1. COSEPUP, 2006, Rising above the Gathering Storm: Energizing and Employing America for a Brighter Economic Future: National Academy of Sciences, National Academy of Engineering, Institute of Medicine, 524 p.
  2. Kortz, K.M., Smay, J.J., and Murray, D.P., 2008, Increasing learning in introductory geoscience courses using lecture tutorials: Journal of Geoscience Education, v. 56, p. 280–290.
  3. Kubinger, K.D., and Gottschall, C.H., 2007, Item difficulty of multiple choice tests dependant on different item response formats—An experiment in fundamental research on psychological assessment: Psychology Science, v. 49, p. 361–374.
  4. Libarkin, J.C., and Anderson, S.W., 2005, Assessment of learning in entry-level geoscience courses: Results from the Geoscience Concept Inventory: Journal of Geoscience Education, v. 53, p. 394–401.
  5. Libarkin, J.C., and Anderson, S.W., 2006, The Geoscience Concept Inventory: Application of Rasch analysis to concept inventory development in higher education, in Liu, X., and Boone, W.J., eds., Applications of Rasch Measurement in Science Education: Fort Dodge, Iowa, JAM Publishers, p. 45–73.
  6. Libarkin, J.C., and Ward, E.M.G., 2011, The qualitative underpinnings of quantitative concept inventory questions, in Feig, A.P., and Stokes, A., eds., Qualitative Inquiry in Geoscience Education Research: Geological Society of America Special Paper 474, p. 37–48.
  7. Moreno, R., Martinez, R.J., and Muniz, J., 2006, New guidelines for developing multiple-choice items: Methodology, v. 2, p. 65–72.
  8. Petcovic, H.L., and Ruhf, R.R., 2008, Geoscience conceptual knowledge of preservice elementary teachers: Results from the Geoscience Concept Inventory: Journal of Geoscience Education, v. 56, p. 251–260.
  9. Treagust, D.F., 1988, Development and use of diagnostic tests to evaluate students’ misconceptions in science: International Journal of Science Education, v. 10, p. 159–169.
  10. Ward, E.M.G., Libarkin, J.C., Raeburn, S., and Kortemeyer, G., 2010, The Geoscience Concept Inventory WebCenter provides new means for student assessment: eLearningPapers, no. 20, www.elearningpapers.eu (last accessed 7 Oct. 2010).
