Volume 33 Issue 12 (December 2023)

GSA Today

Article, p. 8–9

 

GROUNDWORK:

Our Rock and Mineral Exams Could Be Better

David R. Cordie*

Edgewood College, Division of Physical, Computational, and Mathematical Sciences, Madison, Wisconsin 53711, USA

Elizabeth G. Ceperley

Wisconsin Geological and Natural History Survey, Division of Extension, University of Wisconsin–Madison, Madison, Wisconsin 53705, USA

 

Traditional Methods of Assessment

The method by which we assess our students’ learning is just as important as the way in which we deliver content. With regard to content delivery, workshops and conference sessions often demonstrate the value of updated pedagogical approaches that include more active learning, discussion, and hands-on practice. Yet the assessment method for one of the most hands-on units—rocks and minerals—remains the same: the standard “rock exam.” In our personal experience, these rock exams most often take the form of an instructor providing 20–30 hand samples for students to identify in a single, one-shot exam. This format dates to the early 1800s, when the United States was expanding and qualified land surveyors were needed (Johnson, 1977). Combined with European traditions grounded in categorizing Earth’s materials, this history produced a standard curriculum, focused on the memorization and identification of rocks and minerals, that is still used in university classrooms today.

While this method is standard, it may not be ideal. For one, this format can resemble a test that even we as instructors could sometimes fail. The scenario is all too familiar: a student proudly presents a rock from their personal collection, and it is difficult to identify on the spot. When a sample is beaten up, dirty, weathered, rounded, and devoid of context, it is not hard to imagine a granite being mistaken for a diorite. Yet presenting a single specimen as the only opportunity to identify a rock is exactly what these exams ask our students to do. Furthermore, a one-shot exam promotes unease in a population in which 20%–40% of students self-report some form of test anxiety (Maier et al., 2021). Because our introductory classrooms are typically filled with students fulfilling general education requirements—as many as 76.5%, according to Gilbert et al. (2012)—it is worth considering a new method of assessing our students, one that motivates them to study and learn (Lukes and McConnell, 2014).

 

An Alternative Method

While exams have their downsides, it may not be necessary to discard them entirely. Unlike more subjective projects or group work, they provide an objective assessment of an individual student’s understanding of course material. If we want to properly assess our students, what qualities should exams have to give students a better opportunity to show us what they know? Ideally, an assessment of any type will (1) provide feedback, (2) have a clearly defined pathway to success, and (3) be iterative. Since fall 2021, we have used the following method, grounded in the concepts of mastery grading (see Farah, 2021, for background), for rock and mineral identification exams in an introductory-level geology course.

The course starts with lessons on the nature of science as well as broad topics in geology (e.g., plate tectonics, structure of Earth, rock cycle) to provide some grounding in the field. Around week five, we start lessons on minerals and then the three rock types. The purpose of this sequence is to provide students with context prior to starting the more detailed work of mineral and rock identification. Starting with the mineral lessons, the first half of class features a mixture of lectures and activities designed to show students why a geologist might want to know what minerals are in a sample. The second half of class is recognizable to many instructors as a standard geology lab with hand samples and guided practice on their identification. However, the final 30 minutes of class are reserved for work on their identification exam.

For these exams, students rotate through stations composed of samples in numbered trays. On their exam sheet, they are asked to identify which numbered tray corresponds to each sample. At the end of class, the exam sheets are turned in and graded (Fig. 1). This process is repeated for four consecutive classes, with new samples, arranged in a new order, for every attempt. A student receives credit for a sample only after correctly identifying it twice across their four attempts (the two correct identifications need not be on sequential attempts). If a student correctly identifies a sample twice before their final attempt, they are not required to identify that sample again and are instead free to focus on the samples they still need. In this manner, a student can work on troublesome samples without being forced to repeatedly identify those they already know, as can happen in exam formats that take the highest score from multiple attempts. At the end of the exam, the total number of completed samples determines the student’s final grade.

Figure 1.

Hypothetical exam sheet with grading (green rectangle for correct, yellow circle for incorrect). Xs at the bottom of attempt 1 indicate that those samples were not presented on the first attempt. Zidane was an early high scorer: they got everything correct on the first attempt and missed only two samples on the second. On the final two attempts, they only had to identify the samples not yet correctly identified twice to complete the assignment. Tidus started poorly, but the flexibility of this format allowed them to make up for a slow start and still do well on the exam rather than being punished for early struggles.
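To make the bookkeeping concrete, the short Python sketch below tallies credit under these rules (two correct identifications per sample, in any order, across the attempts). It is only an illustration of the scoring logic, not the authors’ actual gradebook; the function name, its parameters, and the sample records are hypothetical.

from collections import Counter

def grade_exam(attempts, required_correct=2, require_consecutive=False):
    """Tally which samples a student has completed across multiple attempts.

    attempts: a list with one dict per attempt, mapping a sample name to True
    (identified correctly) or False (missed). A sample absent from an attempt
    was either not presented that day (the Xs in Fig. 1) or already completed.
    """
    completed = set()
    streaks = Counter()  # consecutive correct identifications per sample
    totals = Counter()   # cumulative correct identifications per sample
    for attempt in attempts:
        for sample, correct in attempt.items():
            if sample in completed:
                continue
            if require_consecutive:
                streaks[sample] = streaks[sample] + 1 if correct else 0
                if streaks[sample] >= required_correct:
                    completed.add(sample)
            elif correct:
                totals[sample] += 1
                if totals[sample] >= required_correct:
                    completed.add(sample)
    return completed

# Hypothetical record loosely mirroring Tidus in Figure 1: a weak first
# attempt followed by recovery on later attempts.
tidus = [
    {"granite": False, "basalt": False, "quartz": True},
    {"granite": True, "basalt": True, "quartz": True},   # quartz now complete
    {"granite": True, "basalt": True},                   # granite and basalt complete
]
print(grade_exam(tidus))           # all three samples earn credit
print(len(grade_exam(tidus)) / 3)  # fraction completed feeds the final grade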

For the instructor, there is a lot of flexibility in this method. For example, an instructor may choose to assess only a portion of the required samples on the first attempt. To make the exam more challenging, one could add more samples, require more correct identifications, or require that correct identifications be made on consecutive attempts. Another option is to allow one of the attempts to be completed in groups, knowing that each student will still need to identify all the samples on their own later. This method also means that a single poor sample will not spoil an exam for a student, since new samples are provided in the next class. It likewise eliminates make-up exams; if a student misses a class, they still have multiple attempts on subsequent days.
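In terms of the hypothetical grader sketched above, several of these variations amount to nothing more than parameter changes, for example:

# Harder variant of the hypothetical grader: require three correct
# identifications, and require that they come on consecutive attempts.
strict = grade_exam(tidus, required_correct=3, require_consecutive=True)
print(strict)  # empty set: no sample was identified correctly on three attempts in a row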

 

Results and Discussion

As an assessment tool, there are numerous advantages to this method. First, it provides more frequent feedback. Each class, a student knows which samples they misidentified and can ask questions prior to the next attempt. Second, a student knows how many more samples they must identify each day, giving them a well-defined goal to work toward. And third, the iterative process not only eliminates the stress associated with a single-day exam, but also forces students to repeatedly show their understanding over the course of a few weeks. Students can no longer “cram” the night before, since they must identify the samples again on another occasion.

Since we implemented this system in 2021, the median exam score has increased by 5.3 percentage points (p-value = 0.03); however, statistical power is low because fewer than 50 students were compared. Anecdotally, students appear to be less stressed in this environment than under the traditional format, an observation backed up by research (Branco, 2021). Exam sheets are graded on the spot after they are turned in, and students will often look over the samples they got wrong and take notes in preparation for the next attempt. This is a sign that they are taking in feedback and planning for their future success. Additionally, by the third or fourth attempt, many students have gained confidence in their identifications and rarely take more than 15 minutes to complete the exam. In one conversation, a student mentioned that they could not simply memorize the appearance of samples from pictures; they actually had to learn the properties of the materials, since they knew that multiple samples would be used throughout the exam. With some reimagining, a rock exam can be a valuable tool for assessing student learning.
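For readers who want to run a similar comparison on their own course data, the sketch below applies a Mann–Whitney U test to two made-up score lists. The article does not report its raw scores or name the test it used, so both the data and the choice of test here are assumptions for illustration only.

from scipy.stats import mannwhitneyu

# Hypothetical exam scores (percent) for two cohorts; these are NOT the data
# behind the 5.3-point result reported above.
scores_traditional = [62, 68, 71, 74, 75, 78, 80, 83, 85, 90]
scores_iterative = [70, 74, 77, 80, 82, 84, 86, 88, 91, 95]

# A common nonparametric test for comparing two independent groups of scores.
stat, p_value = mannwhitneyu(scores_traditional, scores_iterative, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")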

 

References Cited

  1. Branco, R.C., 2021, A semester without exams: Approaches in a small and large course: Journal of Undergraduate Neuroscience Education, v. 20, no. 1, p. A58–A72.
  2. Farah, G., 2021, How to set up mastery-based grading in your classroom: Cult of Pedagogy: https://www.cultofpedagogy.com/mastery-based-grading/ (accessed July 2023).
  3. Gilbert, L.A., Stempien, J., McConnell, D.A., Budd, D.A., van der Hoeven Kraft, K.J., Bykerk-Kauffman, A., Jones, M.H., Knight, C.C., Matheney, R.K., Perkins, D., and Wirth, K.R., 2012, Not just “rocks for jocks”: Who are introductory geology students and why are they here?: Journal of Geoscience Education, v. 60, no. 4, p. 360–371, https://doi.org/10.5408/12-287.1.
  4. Johnson, M.E., 1977, Geology in American education: 1825–1860: Geological Society of America Bulletin, v. 88, no. 8, p. 1192–1198, https://doi.org/10.1130/0016-7606(1977)88<1192:GIAE>2.0.CO;2.
  5. Lukes, L.A., and McConnell, D.A., 2014, What motivates introductory geology students to study for an exam?: Journal of Geoscience Education, v. 62, no. 4, p. 725–735, https://doi.org/10.5408/13-110.1.
  6. Maier, A., Schaitz, C., Kröner, J., Berger, A., Keller, F., Beschoner, P., Connemann, B., and Sosic-Vasic, Z., 2021, The association between test anxiety, self-efficacy, and mental images among university students: Results from an online survey: Frontiers in Psychiatry, v. 12, https://doi.org/10.3389/fpsyt.2021.618108.

*dcordie@edgewood.edu
CITATION: Cordie, D.R., and Ceperley, E.G., 2023, Our rock and mineral exams could be better: GSA Today, v. 33, p. 8–9, https://doi.org/10.1130/GSATG578GW.1. © 2023 The Authors. Gold Open Access: This paper is published under the terms of the CC-BY-NC license. Printed in the USA.

Manuscript received 20 July 2023. Revised manuscript received 8 Sept. 2023. Manuscript accepted 6 Oct. 2023. Posted 7 Nov. 2023.
