Qualitative and Quantitative Measures for Assessing Student Acquisition of Knowledge and Skills in a Forensic Anthropology Field School

Podium presentation in the invited panel session “Teaching and Learning among Anthropologists’ Pupils: Student Assessment Methods and Research” at the 113th annual meeting of the American Anthropological Association, Washington, D.C.

Adam Kolatorowicz1, Alexis R. Dzubak1, Amanda K. Ponomarenko1, Monica S. Hunter2
1 Department of Anthropology, The Ohio State University
2 The PAST Foundation

Abstract
Anthropological field schools offer a unique opportunity to gain practical experience in applying
the methods and techniques used by professional anthropologists in the field and the laboratory.
Assessing student performance in field-based courses poses different sets of challenges
compared to a traditional classroom setting. An innovative model for student evaluation,
combining rapid assessment processes with content analysis, can be effectively applied to
address the experiential nature of these programs. Forensic anthropology, archeology, and
osteology field schools may not have a formal evaluation process other than students devoting
sufficient time to complete required activities. This time factor is then translated into a letter
grade or certificate of completion. Still, participants may not have acquired essential skills by
completing the program despite in-depth, practical study. This newly offered method focuses on
quantitative data and statistical hypothesis testing to formally determine if students increased
their knowledge base and skill set as a result of the field school learning experience. Qualitative
work produced by the students through hands-on, team and problem-based activities can then be
transformed into binary variables to allow for quantitative analysis. This method demonstrates
that students exhibit significant differences in multiple knowledge and skill areas gained upon
completion of the field course. Instructors can use this information to verify that students have
walked away from the program as better-trained anthropologists. This method can also be used
to further refine course design and instructional strategies, as well as gain new insights to
understand how students engage with instructors, classmates, and Anthropology.
Introduction The field school is an integral component of training in the anthropological sciences. The
field serves as the laboratory where anthropologists of any subdiscipline collect data to test their
hypotheses regarding the human condition. Before entering the field to test specific hypotheses,
anthropologists-in-training complete coursework to achieve competency in the theoretical
foundations of their research focus and an introduction to the methods used by scholars.
Fieldwork serves as a supplement to classroom learning and complements training as an
anthropologist in myriad settings. As an example, a small-scale ethnography produced by data
collection employing shadow observation may be assigned in an introductory cultural
anthropology class. At the other end of the spectrum is an intensive, eight-week archaeological
field program in which participants live in tents and field experiences serve as work ancillary to
the classroom or laboratory.
Traditionally, the field school is considered a rite of passage or milestone in the transition
from student to professional anthropologist (Baxter, 2009). Completion of a field school
carrying academic credit hours may be required coursework for students in order to be granted
an undergraduate or graduate degree in anthropology. By being more fully immersed in the
principles, theories, methods, and techniques discussed in the classroom and practiced in the lab,
participants who successfully complete an intensive field school are thusly transformed into
increasingly competent individuals. Field experiences challenge individuals to apply what they
know about the human condition to new contexts. The centrality of experiential learning and
that learning occurs in a real-world research context are two primary factors that make field
schools unique and valuable (Mytum, 2012). An additional function of field schools is to serve
as an artificially selective force that culls some would-be anthropologists from the
professional gene pool, guiding them to other subdisciplines or fields that better suit their
interests. For example, students discover rather quickly that they do not want to pursue a career
in archeology after spending a month in a remote location cut off from family and technology
while lacking the ability to take a hot shower.
The transformative experience of anthropological field schools is unlike what is
encountered in the classroom. In the field, focus is commonly placed upon hands-on, problem-
based learning. The problem manifests itself as a research question or an artificial scenario that
students must solve by applying prior training or experience in a new context. Thus, assessment
tools developed for evaluating work in the classroom (e.g. essays, research papers, exams,
quizzes, or presentations) may not be appropriate for evaluating work performed in field
scenarios (Mytum, 2012). The pedagogical value of field schools requires different methods for
describing and understanding the unique experiences encountered in a field setting. There has
been little to no formal review of the assessment tools used in anthropological field schools;
although, some work has been accomplished to outline minimum competencies. In the subfield
of archeology, the Register of Professional Archaeologists (RPA) established a set of guidelines
and standards for archaeological field schools based upon a 1974 resolution of the Society for
American Archaeology. The RPA certifies field schools that uphold rigorous professional and
ethical standards ensuring that students will learn the skills necessary to perform work as a
professional archeologist; however, there are no recommendations for how this training might be
assessed. We recognize that standardized assessment of all programs may not be practical due to
the unpredictable nature of fieldwork. Instead we suggest that this is an opportunity to critically
reflect on how students are evaluated in field programs and provide empirical evidence to show
that students have been transformed in an experiential learning environment.
Formal assessment of field schools should take a more centralized role, not only in
evaluating student performance but also in verifying that what instructors say they do in the field
actually corresponds to learning achievements by their students. Arriving in the field and performing the
required work without formal assessment is not adequate to legitimize a grade or attainment of a
certificate of completion. Here, we define “learning outcomes” as an increase in one’s
knowledge base, development of a skill set, and ability to apply one’s knowledge and skills in
new contexts. This explicit epistemological perspective has been particularly lacking in
anthropological field schools with foci in the forensic sciences. There are a number of rigorous
forensic anthropology short courses and field schools across the country that produce students
who later go on to be productive scholars and scientists.1 That, in and of itself, is a measure of
the effectiveness of the programs. In regards to field courses in forensic anthropology and
bioarcheology it has been noted that, “Ideally, students leave (the) course with an understanding
of the broad scope of bioarchaeology and forensic anthropology” (Bauer-Clapp et al., 2012:28),
with the operative word being “ideally.” Let us move beyond ideal scenarios to provide
evidence that students have gained a broader understanding of anthropology and acquired a
newly-developed skill set as well as identifying gaps in said skills and knowledge. A more
formal evaluation will help to document training ensuring it aligns with discipline-specific
standards for professional development. We have developed a novel method for measuring
outcomes in a field school setting by assessing field experiences from an anthropological
perspective using data collected via content analysis. This method focuses on testing the
hypothesis that there is a change in the knowledge base and skill set for those individuals
completing a field school.
1 Mercyhurst University, the University of Tennessee - Knoxville, Texas State University, and Western Carolina University offer advanced training for undergraduates, graduate students, and medicolegal professionals.
First, we will describe the cultural system of education in forensic anthropology within
the context of a changing educational landscape as it relates to developing transdisciplinary skill
sets, gaining experience in applied jobs, and a greater emphasis in pedagogy on measurable
learning outcomes. Second, the Forensic Science and Anthropology Field School offered at The
Ohio State University will be presented as a case study to apply our method and examine whether
students meet learning objectives and course outcomes. This program was carefully constructed
using backmapped course design while adhering to the principles of instructional design theory
applied to a problem-based teaching and learning environment. Third, we will review the myriad
assessment tools used in the field school to document changes in knowledge and skills which
provided data, via a version of content analysis, and statistical testing of hypothesized learning
outcomes. Finally, we close with critical reflection on the assessment process and course design
as instructor-ethnographers. A call to action will be made to incorporate formal assessment in all
anthropological field schools.
A Cultural System of Education in Forensic Anthropology
The cultural system of education and training in forensic anthropology has traditionally
centered on an apprenticeship process. A student is closely supervised by a professor, typically a
Diplomate of the American Board of Forensic Anthropology, who transfers their knowledge
base, stemming from years of practical and research experience, to the student through close
mentoring. It is perceived that a level of competency cannot be reached without study under a
master. The National Academy of Sciences report Strengthening Forensic Science in the United
States: A Path Forward (2009) recognized the arcane master-apprentice relationship of many
forensic science disciplines and called for an increase of standardization in the forensic sciences
regarding analytical techniques as well as the minimum qualifications, education, and training of
practicing forensic scientists.
The educational system in forensic anthropology is formally outlined in the published
documents of the Scientific Working Group for Forensic Anthropology (SWGANTH).
SWGANTH is cosponsored by the Federal Bureau of Investigation and Department of Defense
Central Identification Laboratory, created prior to the publication of the 2009 NAS
report, and serves the dual purposes of 1) “to develop consensus best-practice guidelines and
establishing minimum standards for the Forensic Anthropology discipline” and 2) “to
disseminate SWGANTH guidelines, studies, and other findings that may be of benefit to the
forensic community” (SWGANTH, 2013). The SWGANTH guidance documents describe the
required Education and Training (2013) and Qualifications - Minimum Standards (2014) for
forensic anthropologists. Instead of naming individuals as “master” or “apprentice,”
SWGANTH uses the designations “Forensic Anthropologist I” (apprentice), “Forensic
Anthropologist II” (journeyman), and “Forensic Anthropologist III” (master). Our method
aligns with the NAS report’s call for standardization in training by documenting
outcomes. The aforementioned masters are located at a handful of institutions across the country
to which apprentices flock to receive advanced training under the tutelage of the masters. This
migration is fueled by popular knowledge about forensic science and anthropology as viewed
through television programming such as CSI or Bones. It has created unrealistic expectations for
the capabilities of the forensic sciences, also known as the ‘CSI Effect’, and has changed the
manner in which legal counselors present their case to jurors and how forensic scientists explain
their findings (Shelton, 2008). The CSI Effect has been coupled with an expectation by students
that the latest technology will be incorporated in class design (Oblinger and Oblinger, 2005).
Market demand has provided impetus for the creation of unique field programs that not only
provide advanced training in biological anthropology and archeology, but for acquiring broader
skill sets that can be applied to multiple contexts.
The broader context of a changing educational landscape includes an increased emphasis on
measurable learning outcomes and on fostering transdisciplinary skill sets. Metrics are put in place
to verify that students are meeting course goals and learning objectives (Duque and Weeks,
2010). On an even larger scale, as it relates to public policy, there is a need to document
outcomes to justify public spending on education. Educators and administrators of post-
secondary institutions are accountable to stakeholders of the education system (Frye, 1999) who
include the students attending the school (desiring a degree to attain employment), their parents
(desiring their children to earn a degree to help with employment), and society writ large
(providing education for its members). To satisfy the demands of stakeholders, assessment tools
and instructional methods that align with desired learning outcomes must be developed to
provide the required metrics (Biggs, 2003). One such learning outcome is the development of a
transdisciplinary skill set required for the highly integrated world. Recent trends in the demands
of employers have moved away from seeking hyperspecialized employees to those possessing
broader skill sets that can serve many purposes (Nugent and Kulkarni, 2013). This issue has
been specifically addressed in archeological field school programs wherein skills can be
transferred to most disciplines (Cobb and Croucher, 2012). Again, it is here that field schools
can uniquely contribute experience in applied settings not found in the classroom.
Anthropology, Death, and the Law
The Forensic Science and Anthropology Field School offered at The Ohio State
University Department of Anthropology is an example of a program born out of market demand
and the need to adequately train students in practical, transferable skills not easily acquired in the
traditional classroom. It was developed for undergraduate students with little to no prior
experience in anthropology or forensic science. Applicants typically have a general interest in
forensic science or are deciding whether it would be an appropriate career choice. The
field school draws students from across the United States and from a wide variety of disciplines.
The majority of participants are junior and senior Anthropology majors; other majors include
Biology, Chemistry, Criminology, Sociology, International Studies, pre-Medicine, and pre-
Dentistry. The program is an intensive four-week course based on the resolution of a mock
medicolegal death investigation in which students participate in all stages of the case from crime
scene discovery to courtroom testimony. From a pedagogical perspective, it is situated between
a short course that offers advanced training in a specific set of methods, such as a week-long
bomb blast investigation course, and a traditional archeological field program in which
participants live the life of an archeologist for two months, as in excavation of the material
remains of prehistoric peoples. A problem-based teaching and learning environment was chosen
to foster the development of knowledge base and skill set. Students receive training in multiple
disciplines and work directly with actual forensic scientists, law enforcement agents, and legal
professionals as they attempt to answer the question “Whodunit?”
Backmapped course design (Wiggins and McTighe, 2005) was adopted to identify
curricular priorities and build the program, beginning with the creation of five course goals that
revolve around the anthropological perspective and scientific methodology. Upon completion of
the program students are able to:
1. Improve interpersonal professional and public presentation skills.
2. Work cooperatively in a group and develop team skills.
3. Employ the scientific method to answer questions related to crime scene reconstruction.
4. Appreciate the role of an Anthropologist in medicolegal death investigations.
5. Distinguish between forensic science as portrayed in the popular media and the reality of forensic science as practiced by professionals.
The superstructure of the program rested on the foundation of these goals, and the subsequently
developed learning modules, objectives, activities, and assessments relate directly to these goals.
These features were backmapped onto the course goals: although course design typically takes
place in a backward manner, in which activities and assessments are identified before goals are
created, here design progressed in a forward manner from the goals. Instructional design theory was
implemented by including clear information, thoughtful practice, informative feedback, and
strong motivation in instructional components of the course to inform the backmapped approach
(Reigeluth, 1999). This theoretical framework guided the overall process of developing a
distinctive program in which participants leave with a unique skill set applicable to a diverse
array of fields including academia, education, business, industry, engineering, government, or
medicine.
The program is divided into three broader units with the first unit providing training to
develop the requisite knowledge and skill base to carry out the mock investigation. In the second
unit students investigate the crimes and examine the evidence. The third unit focuses on
synthesizing multiple lines of evidence to reconstruct the crime and then testifying in moot court
as a scientific expert witness. Across these units lie several individual modules, taking one half
day to three days to complete, highlighting different forensic science disciplines or requisite skill
sets, each with its own set of learning objectives. A course manual produced specifically for this
program, mimicking the standard operating procedures of a forensic science laboratory or a law
enforcement field guide, directs participants through the problem-based experience of the
program.
During the eight years this program has been offered, specific learning modules have
included team building, orienteering, the legal system, crime scene investigation, crime scene
photography, photogrammetry, human and non-human osteology, developing a biological
profile, taphonomic profiles, bone histology, forensic odontology, forensic entomology, mapping
coordinate systems, processing evidence, forensic archeology, geophysical remote sensing,
canine search and recovery, facial approximation, individual identification, fingerprint analysis,
blood spatter analysis, writing case reports, and courtroom testimony. During the Crime Scene
Processing module, for example, participants are charged with the task of handling a mock
indoor crime scene where a murder is suspected to have occurred and an outdoor crime scene
where the victims were allegedly buried. The indoor scene processing takes place over one day
and the outdoor scene investigation takes place over three days during which time students apply
the skills developed earlier in the program to meet ten specific learning objectives:
1. Efficiently process a crime scene in indoor and outdoor settings by following standard operating procedures.
2. Identify physical security issues related to crime scene management.
3. Work effectively in a team.
4. Apply photography skills to comprehensively record information from a crime scene.
5. Thoroughly document all evidence through accurate notes and related evidentiary forms.
6. Begin and maintain the chain of custody.
7. Produce an accurate representation of the scene through mapping.
8. Recognize the contextual associations between the evidence, scene, and individuals involved in criminal activity.
9. Apply basic photographic skills to capture images necessary for 3D modeling of specific pieces of evidence and buried human remains.
10. Produce and manipulate 3D models of specific pieces of evidence using Agisoft PhotoScan and MeshLab software programs.
Following backmapped design principles, these learning objectives were established first, and then a
specific activity was created revolving around staged mock crime scenes. The work students
perform during the Crime Scene Processing module and others requires unique assessment tools
to determine their success in meeting specific learning objectives and course goals. A formal,
compound assessment tool focusing on the cognitive domain (Bloom et al., 1956) was developed
to measure learning outcomes and determine whether they aligned with the five course goals.
Additional tools assessing affective and psychomotor domains were incorporated at both
informal and formal levels of evaluation and will be presented as anecdotal evidence. The tool
was first piloted in 2013 and then applied in refined form in 2014.
Methods
To assess whether students were achieving course goals in the manner desired, formal
hypothesis testing was used. The hypothesis is that students acquire discipline-specific
knowledge through completing the field school. Data come from the 2013 and 2014 field school
cohorts, each consisting of 19 students, and were gathered through multiple assessments,
including a case narrative, hypothetical scenario, and survey, forming a compound assessment
tool.
Student Demographic
To help students meet course goals, it is important to understand the demographic
composition of the student body. A demographic profile is useful for understanding who is drawn to
the field school and how their interests can be addressed throughout the course. The students
provided an anonymous self-description using a checklist of experiences and classes taken,
including hobbies and unique skills. This checklist was administered to students as a survey on
SurveyMethods (http://www.surveymethods.com). Responses could also be written in as free-text
replies. Included in the pre-course survey were self-assessments reflecting on
the student’s comfort level working in teams and giving public presentations. All data from the
2013 and 2014 programs were analyzed separately because we recognize that each cohort is a
unique group of students with different abilities, interests, and backgrounds.
Compound Assessment Tool
Our hypothesis was addressed using three assessment tools: a survey, case hypothetical
scenario, and case narrative. Each student completed a survey and scenario before and after the
field school with both the pre- and post-assessments containing the same prompts. Students were
instructed to not use any external references and simply answer both the scenario and survey to
the best of their ability. The case narrative was a free-response assignment that asked students to
integrate all evidence found during the mock field school scenario to produce a meaningful
narrative. Each assessment had a rubric designed for it to ensure that each tool was assessing the
overall course goals. To analyze data, an a priori thematic framework (Krippendorff, 1980) was
applied to each assessment device and was subsequently broken down into components or
desired themes that were individually scored as ‘1-present’ or ‘0-absent’ (Table 1). All desired
information must have been included for the component to be scored as present.
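As a simplified illustration of this binary coding, a response can be reduced to present/absent scores per theme. In the study, coding was performed by human readers; the keyword matching, theme names, and required elements below are hypothetical stand-ins, not the actual rubric:

```python
# Illustrative sketch of the binary coding scheme: each a priori theme is
# scored 1 (present) or 0 (absent), and a theme scores as present only if
# every required element appears in the response. Keyword matching here is
# a stand-in for the human coding used in the study, and the theme names
# and elements are hypothetical.

def code_response(text, theme_elements):
    """Score each theme as 1 (present) or 0 (absent) for one response."""
    lowered = text.lower()
    return {
        theme: int(all(element in lowered for element in elements))
        for theme, elements in theme_elements.items()
    }

rubric = {  # hypothetical themes and required elements
    "chain_of_custody": ["chain of custody"],
    "documentation": ["photograph", "notes"],
}

scores = code_response(
    "We photographed the scene, took notes, and began the chain of custody.",
    rubric,
)
print(scores)  # {'chain_of_custody': 1, 'documentation': 1}
```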
Each student’s response was analyzed textually by searching for each of the a priori
themes. A focus on manifest content, “that which is on the surface and easily
observable such as the appearance of a particular word in a written text” (Potter and Levine-
Donnerstein, 1999:259), was explicitly chosen to align with quantitative measures of
performance in professional development. Also, there were clearly defined, desired outcomes
that students were expected to reach. Themes were identified by reflecting upon how each
assessment could test for the achievement of course goals. The five goals of the forensic field
school were designed to reflect the skill-set and knowledge base of a competent forensic
scientist. The survey prompted students to provide a definition for science, forensic science, and
anthropology and answer two open-ended questions about the roles of science and
biological/forensic anthropology in a crime scene investigation. The hypothetical scenario was
scored based upon five themes required in the student’s response: control of scene,
documentation, chain of custody, identification, and clarity of response. The case narrative was
assessed using four main themes: evidence presentation, explanation of behavior at crime scenes,
synthesis of data, and the ability to use non-technical writing. Prior to textual analysis and
coding, all responses had identifying information removed; each response was coded separately
by three researchers. Results were analyzed using the most frequent value scored by the three
researchers, two of whom were field school instructors and one external researcher who was
unassociated with the field school program. Both instructors are in a doctoral program in
anthropology while the external researcher is in an undergraduate program.
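The consensus rule, taking the most frequent value among the three coders, amounts to a majority vote for binary scores. A minimal sketch with illustrative (not actual) scores:

```python
from collections import Counter

# Consensus rule from the text: for each component, the final score is the
# most frequent value among the three coders. With 0/1 scores and three
# coders, this is a simple majority vote.

def consensus(coder_scores):
    """Return the most frequent value among the coders' scores."""
    return Counter(coder_scores).most_common(1)[0][0]

# Three coders' 0/1 scores for three components (illustrative data)
component_scores = [[1, 1, 0], [0, 0, 0], [1, 0, 1]]
final_scores = [consensus(s) for s in component_scores]
print(final_scores)  # [1, 0, 1]
```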
Chi-squared analysis was performed in Microsoft Excel 2010 to determine whether
students met desired course goals in terms of each assessment tool: hypothetical scenario, case
narrative, and survey. Chi-squared was chosen to test whether a difference in knowledge did in
fact exist after completion of the forensic field school. Responses from all 19 students of the
2013 field school were used. The hypothesis is that students performed better (i.e., achieved more
components) than expected by random chance. We expect half of the students to meet the
components by chance.
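This test can be sketched as a chi-squared goodness of fit against the expected even split of 9.5 students per cell. The observed count of 17 of 19 below is illustrative, not a result from the study:

```python
# Goodness-of-fit sketch: under the null hypothesis, each of the 19
# students meets a component by chance with probability one half, so the
# expected counts are 9.5 achieved and 9.5 not achieved.

def chi_square_stat(observed, expected):
    """Pearson chi-squared statistic: sum of (O - E)^2 / E over all cells."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

n_students = 19
achieved = 17  # illustrative count, not an actual result
observed = [achieved, n_students - achieved]
expected = [n_students / 2, n_students / 2]

stat = chi_square_stat(observed, expected)
critical = 3.841  # chi-squared critical value for df = 1, alpha = 0.05
print(f"chi2 = {stat:.3f}; significant at 0.05: {stat > critical}")
```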
Precautions were taken to ensure that the coding scheme could be reliably applied by
different observers. Fleiss’ Kappa (1971), a measure of agreement among observers over what
would be expected by chance, was calculated to identify the level of reliability in coding among
all three researchers. Kappa has a maximum value of 1: the higher the value, the greater the
agreement among observers, with values near or below zero indicating potential issues with the
coding scheme. Fleiss’ Kappa scores were averaged for each tool to more broadly assess the
coding method.
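Fleiss' Kappa for three coders assigning binary scores can be computed directly from its definition; the ratings below are illustrative, not study data:

```python
# Fleiss' kappa: agreement among a fixed number of raters beyond what
# would be expected by chance. Each row of `ratings` holds the raters'
# category assignments (here 0/1) for one item.

def fleiss_kappa(ratings, categories=(0, 1)):
    n_items = len(ratings)
    n_raters = len(ratings[0])
    # n_ij: how many raters assigned item i to category j
    counts = [[row.count(c) for c in categories] for row in ratings]
    # Observed agreement: mean per-item agreement P_i
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items
    # Chance agreement: sum of squared marginal category proportions
    p_e = sum(
        (sum(row[j] for row in counts) / (n_items * n_raters)) ** 2
        for j in range(len(categories))
    )
    return (p_bar - p_e) / (1 - p_e)

# Three coders' 0/1 scores for five component-response pairs (illustrative)
ratings = [[1, 1, 1], [1, 1, 0], [0, 0, 0], [1, 0, 1], [0, 0, 1]]
print(round(fleiss_kappa(ratings), 3))  # 0.196
```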
Results
In regards to inter-rater agreement, the case narrative, hypothetical scenario, and survey
had Kappa values of 0.56, 0.59, and 0.56, respectively, which are considered “moderate
agreement” as noted by Landis and Koch (1977). The scale they proposed provides a qualitative
description of ranges within possible Kappa values: < 0 = Poor Agreement, 0.01 – 0.20 = Slight
Agreement, 0.21 – 0.40 = Fair Agreement, 0.41 – 0.60 = Moderate Agreement, 0.61 – 0.80 =
Substantial Agreement, and 0.81 – 1.00 = Almost Perfect Agreement.
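This scale reduces to a simple lookup; a minimal sketch (treating a value of exactly 0 as Poor Agreement is an assumption about the boundary):

```python
# Map a kappa value to the Landis and Koch (1977) qualitative labels
# quoted above. Boundary handling at exactly 0 is an assumption.

def landis_koch_label(kappa):
    if kappa <= 0:
        return "Poor Agreement"
    bands = [(0.20, "Slight Agreement"), (0.40, "Fair Agreement"),
             (0.60, "Moderate Agreement"), (0.80, "Substantial Agreement"),
             (1.00, "Almost Perfect Agreement")]
    for upper, label in bands:
        if kappa <= upper:
            return label

print(landis_koch_label(0.56))  # Moderate Agreement
```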
Chi-square was performed on the observed counts from the pre-assessment tools, the
hypothetical scenario and pre-course survey. No significance was found for either year except
for the clarity of writing (p<0.01) in the hypothetical scenario (Table 2). All students’ responses
were clear and succinct. Similarly, the post-course assessment tools for the 2013 course were not
significant, except for all 19 students achieving the clarity and documentation
components (p=0.01) of the hypothetical scenario response as well as the writing component of
the case narrative (Table 3). In contrast, the 2014 course had significance for both the evidence
and writing of the case narrative (p<0.01) and the communication component approached
significance (p=0.07). All of the post-course hypothetical scenario components excluding
identification were significant, while the post-course survey approached significance for the
definition of forensic science (p=0.07).
The student population has a well-rounded general education with a focus in the natural
or social sciences. However, few students have completed any advanced coursework in
anthropology or forensic science prior to their entry in the field school. In addition to the lack of
advanced coursework, students lacked experience in the field or laboratory. As part of the
course goals, we want students to develop their teamwork and presentation skills.
Most students felt comfortable working in teams prior to entry in the field school but did not feel
comfortable giving formal presentations. The field school seeks to address the paucity of formal
presentation experience through multiple opportunities for presentation, culminating in the final
“expert” testimony. To see if students had extracurricular experiences that might affect their
knowledge acquisition, students were asked in the pre-course survey to list any that they thought
would be useful to the course. Students participated in multiple extracurricular artistic endeavors
such as drawing, writing, or music in addition to various physical activities such as intramural
sports or individual recreational exercise.
Discussion
Thematic analysis in the compound assessment allowed for interpretation of each
student’s work and progress as a result of the forensic field school. This method served as a
quantifiable approach to assess whether students were achieving course goals and a reflection
upon how knowledge was achieved in the field school. To demonstrate that knowledge was
gained through the field school, it was expected that more students would achieve these
components, and a chi-squared analysis was used to understand if any changes were observable
in the data. In areas where knowledge was not being achieved, it is important to recognize any
deficiency in order to change the manner of instruction or emphasis placed upon certain topics.
The goal of the course assessment is thus to modify the course to adapt to changes in the student
population.
In general, the coding scheme worked well to understand how knowledge changed as a
result of the field school and was also useful to understand the differences in student populations
over time. The inter-rater agreement between all researchers was moderate suggesting that
responses could be coded in a reliable manner despite differences in education, experience, or
relationship of the observers relative to the field school. It should be noted that Fleiss’ Kappa
value can be affected by changing the number of observers or the number of observations. It is
possible that coding fatigue could have affected how individual researchers coded each student
(Potter and Levine-Donnerstein, 1999), but it is unclear what this relationship would be at this
time. Content analysis lends itself to potential subjective interpretation and shifting focus away
from thematic elements and onto interpretations of the receivers of content (Potter and Levine-
Donnerstein, 1999). Our focus on manifest content was chosen to provide more objective
interpretation of student work. Although students appeared to be achieving desired learning
outcomes, they produced work that answered the question or solved a problem in a manner
other than what was expected. This suggests that the coding scheme should be refined or made
less strict to capture the change in their knowledge base that was easily identifiable, just not with
the a priori framework.
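Fleiss' Kappa, used above to gauge inter-rater agreement, can be computed directly from a table of per-subject rater counts. The following Python sketch is a generic implementation with invented ratings, not the study's data or code:

```python
def fleiss_kappa(counts):
    """Fleiss' Kappa for a table where counts[i][j] is the number of
    raters assigning subject i to category j (each row sums to the
    number of raters n)."""
    N = len(counts)                  # number of subjects rated
    n = sum(counts[0])               # number of raters per subject
    k = len(counts[0])               # number of categories
    total = N * n
    # Observed agreement per subject, averaged over subjects.
    p_bar = sum((sum(c * c for c in row) - n) / (n * (n - 1))
                for row in counts) / N
    # Expected agreement from marginal category proportions.
    p_j = [sum(row[j] for row in counts) / total for j in range(k)]
    p_e = sum(p * p for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 raters code 4 student responses as
# achieved (column 0) or not achieved (column 1).
ratings = [[3, 0], [0, 3], [3, 0], [1, 2]]
kappa = fleiss_kappa(ratings)   # ~0.657, "substantial" on the
                                # Landis and Koch (1977) scale
```

As noted above, the resulting value is sensitive to the number of observers and observations, so comparisons across rating exercises should hold both constant where possible.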
Case Narrative
The case narrative was a measure of how well students could summarize and integrate all
the evidence found in the mock scenes into narrative form. In the 2013 pilot study data, only the
writing component was achieved at a significant level, while the 2014 data showed significance
in three components. Both cohorts used a journalistic style in their writing. In both years, the
responses that did not achieve components lacked an explanation of the data. Most students could
present the information and recite the evidence found, but a deeper understanding and integration
of all evidence into a story explaining what happened at the crime scenes was more difficult to
achieve. Perhaps this lack of integration speaks to a greater need to emphasize critical thinking
during the field school. Nonetheless, the difference between cohorts suggests that each student
population faces different challenges during a course, requiring instructors to adapt their
techniques to the current student demographic.
Hypothetical Scenario
Although students could present their ideas in a clear manner for the pre-course
hypothetical scenario, a significant number of students did not achieve the control,
documentation, chain of custody, or identification components. The 2013 pilot field school
assessments achieved significance only in the documentation and clarity components, while the
2014 cohort achieved significance in all components except identification. Responses from
both years typically did not include information regarding how, or through what means, a
positive identification of an individual can be made when only skeletal remains are present.
Most student responses did include information on biological profile construction, a task
predominantly associated with forensic anthropology. Each component is necessary to resolve
the situation presented in the hypothetical scenario prompt. The results imply that a stronger
emphasis needs to be placed upon the means of positive identification, a task usually outside the
purview of forensic anthropologists.
Survey
The pre-course survey assessment similarly had no significant values in either cohort
(Table 2). For the post-course survey, none of the 2013 cohort components had significant
values, while the 2014 cohort had a significant number of responses that met the definition of
forensic science. Neither cohort achieved the basic science, anthropology, or scientific method
descriptions. Does this suggest that students are losing basic information as they study more
applied concepts? It seems unlikely, as applied concepts build upon foundational information.
It is possible, however, that students answered the basic questions superficially while focusing
on the seemingly more important applied concepts of the field school. In the biological
anthropology description, most students failed to state the importance of forensic anthropology
in cases of decomposing remains, yet clearly stated the other components. This particular deficit
will be addressed in future field schools; it is important to revisit these areas in future courses to
ensure that students retain basic concepts and their application throughout their careers.
The difference in cohorts can be attributed to either a change in the emphasis of course
content or an inherent difference between the two cohorts, serving as evidence that student
populations are dynamic. The lead instructors did not change during this time period, but the
supporting instructors were different each year. Despite these staffing differences, neither the
course content nor the mode of instruction was significantly altered.
Changes in Student Profile
Upon completion of the forensic field school, most students felt “very comfortable” with
teamwork and “comfortable” with class presentations. The students themselves wrote that
“[they] became more comfortable with [public speaking] as the course progressed.” Another
student mentioned that while they still get nervous, they “have definitely grown in [comfort]
with public speaking and teamwork.” Even the students’ perception of their ability to speak at
professional meetings changed over the course. One student stated their experience in the course
has given them “more opportunities to practice, especially with the courtroom testimony.” As a
direct result of the field school, all students expressed a desire to continue coursework or seek
out additional field schools or internship experience. Follow-up questionnaires will be
distributed to see where the program has taken them academically and professionally. There are
opportunities to expand this work and investigate other aspects of the transformative experience
these students went through.
Future Directions
Further assessments will be added to the compound tool to understand how knowledge
and skills change on a daily basis. A more formal rapid assessment process will be introduced
by interviewing staff and students at the end of each module. Ancillary assessments, some
formally a part of students’ grades and others an informal part of the evaluation procedure, have
been built into the course to evaluate changes in the affective and psychomotor domains, including a
written case report, fieldwork performance, prepared oral presentations, on-the-spot lab/field
presentations, courtroom testimony, and daily journal entries. Students produced a final case
report detailing their involvement in the mock death investigation and the findings of their
analyses. This report is similar to what a forensic scientist would submit to the crime laboratory
and what lawyers would use to build a case. In the lab and field, instructors provided immediate
constructive criticism of students' technique while the students learned proper methods and
procedures. Each student was required to give a five-minute oral presentation reviewing a
learning module from the prior day. Visitors to the field school, including local newspaper and
television outlets, were treated to impromptu presentations by students that described the
activities being performed and how the case was progressing. Students achieve a deeper
understanding of the principles, theories, methods, and techniques of the discipline under study
when they have to articulate those concepts to other novices (Marvell, 2008). Finally, the
capstone for the field school was a mock trial in which each student testified on the stand, under
oath, for 15 minutes as to their involvement in the case and their findings. These activities and
assignments were used to give students practice speaking in front of audiences large and small
while having to synthesize great quantities of information and communicate complex concepts at
a level that anyone could understand.
One of the more potent methods for understanding the learning experiences was through
a daily journal. Writing journal entries has been recognized as an effective way to evaluate
changes in the affective domain (Boyle et al., 2007). Each student was asked to be reflexive
about their work in the field school by submitting a daily journal entry in which they were
prompted to discuss which activities were challenging for them, which activities were easy to
complete, new skills they learned, how their perspectives had changed, and what they looked
forward to in upcoming days. Over the course of the program, one or two of each student's entries
were posted on the field school blog. Inclusion of such introspective work answers the call for
emotional reflexivity in teaching and learning anthropology (Spencer, 2011). The
aforementioned tools will be formally incorporated in a more complete ethnographic report of
the program that details the transformative experience of the field school for both students and
instructors alike.
Conclusions
Anthropological methods used to understand complex cultural systems can be applied to
an introspective examination of how teaching and learning occur in anthropology. The unique,
changing educational landscape of forensic anthropology can serve as a test case for describing
the transformative experience of field schools, whereby qualitative work produced by students
during a field school is transformed into quantitative data so that hypotheses
regarding changes in knowledge and skills may be tested statistically. Assessment tools
deployed to gather such data provide a metric that can be used to demonstrate that field schools
align with standards for professional and academic growth and development. Assessment is
made easier when field schools are constructed using backmapped course design principles
because a more direct connection is made between course goals, activities, and desired learning
outcomes. This novel method of an anthropological assessment of an anthropological field
school helps to address broader questions of how we define and produce competent
anthropologists as well as construct courses to reach desired learning outcomes. We recommend
that formal evaluation of students, by qualitative or quantitative means, be incorporated in all
field schools.
References Cited

Bauer-Clapp, Heidi J., Ventura R. Perez, Tiffany L. Parisi, and Robin Wineinger
2012 Low Stakes, High Impact Learning: A Pedagogical Model for Bioarchaeology and Forensic Anthropology Field Schools. The SAA Archaeological Record May:24-28.

Baxter, Jane E.
2009 Archaeological Field Schools: A Guide for Teaching in the Field. Walnut Creek: Left Coast Press.

Biggs, John
2003 Aligning Teaching and Assessing to Course Objectives. Teaching and Learning in Higher Education: New Trends and Innovations 2:13-17.

Bloom, B. S., M. D. Engelhart, E. J. Furst, W. Hill, and D. Krathwohl
1956 Taxonomy of Educational Objectives. Volume I: The Cognitive Domain. New York: McKay.

Boyle, Alan, Sarah Maguire, Adrian Martin, Clare Milsom, Rhu Nash, Steve Rawlinson, Andrew Turner, Sheena Wurthmann, and Stacey Conchie
2007 Fieldwork Is Good: The Student Perception and the Affective Domain. Journal of Geography in Higher Education 31(2):299-317.

Cobb, Hannah, and Karina Croucher
2012 Field Schools, Transferable Skills and Enhancing Employability. In Global Perspectives on Archaeological Field Schools: Constructions of Knowledge. Harold Mytum, ed. Pp. 25-40. New York: Springer.

Cohen, Jacob
1960 A Coefficient of Agreement for Nominal Scales. Educational and Psychological Measurement 20(1):37-46.

Duque, Lola C., and John R. Weeks
2010 Towards a Model and Methodology for Assessing Student Learning Outcomes and Satisfaction. Quality Assurance in Education 18(2):84-105.

Fleiss, Joseph L.
1971 Measuring Nominal Scale Agreement among Many Raters. Psychological Bulletin 76(5):378-382.

Frye, Richard
1999 Assessment, Accountability, and Student Learning Outcomes. Dialogue 2:1-12.

Krippendorff, Klaus
1980 Content Analysis: An Introduction to Its Methodology. Newbury Park, CA: Sage.

Landis, J. Richard, and Gary G. Koch
1977 The Measurement of Observer Agreement for Categorical Data. Biometrics 33:159-174.

Marvell, Alan
2008 Student-Led Presentations in Situ: The Challenges to Presenting on the Edge of a Volcano. Journal of Geography in Higher Education 32(2):321-335.

Mytum, Harold
2012 The Pedagogic Value of Field Schools: Some Frameworks. In Global Perspectives on Archaeological Field Schools: Constructions of Knowledge. Harold Mytum, ed. Pp. 9-24. New York: Springer.

National Academy of Sciences
2009 Strengthening Forensic Science in the United States: A Path Forward. Washington, D.C.: National Academies Press.

Nugent, Kathy L., and Avi Kulkarni
2013 An Interdisciplinary Shift in Demand for Talent within the Biotech Industry. Nature Biotechnology 31(9):853-855.

Oblinger, Diana G., and James L. Oblinger, eds.
2005 Educating the Net Generation. Boulder, CO: EDUCAUSE.

Potter, W. James, and Deborah Levine-Donnerstein
1999 Rethinking Validity and Reliability in Content Analysis. Journal of Applied Communication Research 27:258-284.

Reigeluth, Charles M.
1999 What Is Instructional-Design Theory and How Is It Changing? In Instructional-Design Theories and Models, Volume II: A New Paradigm of Instructional Theory. Charles M. Reigeluth, ed. Pp. 5-29. Mahwah, NJ: Lawrence Erlbaum Associates.

Register of Professional Archaeologists
N.d. Guidelines and Standards for Archaeological Field Schools. http://rpanet.site-ym.com/?FieldschoolGuides, accessed November 4, 2014.

Scientific Working Group for Forensic Anthropology
2013 Education and Training. Revision 0. http://swganth.startlogic.com/Education%20and%20Training%20Rev0.pdf, accessed October 30, 2014.

Scientific Working Group for Forensic Anthropology
2014 Qualifications - Minimum Standards. Revision 0. http://swganth.startlogic.com/Qualifications%20-%20Minimum%20Standards.pdf, accessed October 30, 2014.

Shelton, Donald J.
2008 The 'CSI Effect': Does It Really Exist? National Institute of Justice Journal 259:1-6.

Spencer, Dimitrina
2011 Emotions and the Transformative Potential of Fieldwork: Some Implications for Teaching and Learning Anthropology. Teaching Anthropology 1(2):68-97.

Wiggins, Grant, and Jay McTighe
2005 Understanding by Design. 2nd edition. Alexandria, VA: Association for Supervision and Curriculum Development.
Table 1. Rubric for coding assessment tools.

Case Narrative
  Evidence
    • Evidence from indoor and outdoor scenes
    • Background information on suspect and victims
    • Describes identification
  Writing
    • Limited use of jargon
    • Journalistic
  Communication
    • Clear communication of all case aspects
    • Able to understand what happened
    • Made connections between people, places, and things
  Explanation
    • Synthesis of evidence
    • Fills in details
    • Explains how victims were killed

Hypothetical Scenario
  Control
    • Set up perimeter and barrier to restrict access to scene
    • Include who is involved in investigation/scene
  Documentation
    • Photos taken
    • Notes, logs, maps
    • Entrance/exit logs
  Chain of Custody
    • Explain chain of custody
  Identification
    • Explain how positive ID was made
    • Biological profile - narrow missing persons list
    • How or by whom bones were identified as human
  Clarity
    • Within one page limit

Survey
  Science
    • Knowledge
    • Question/hypothesis
    • Experiment/test/method/approach/process
    • Natural world/physical/social
  Forensic Science
    • Law/legal system
    • Applied/application
    • Crime/criminal
  Anthropology
    • Study of humans
    • Non-humans/relatives/ancestors
    • Biology/physical/culture/social/behavior OR holistic
    • Evolution/time/change/variation
  Briefly describe how a biological anthropologist trained in osteology and archeology can aid in a forensic investigation.
    • Search
    • Recovery
    • Identification
    • Remains/skeletonized remains/bones/skeleton
    • Decomposed/decomposing
  Explain how the scientific method can be used to answer questions related to crime scene reconstruction.
    • Evidence/data/observations
    • Behavior/what happened
    • Hypothesis
    • Explanation
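As an illustration of how rubric judgments like those in Table 1 can be transformed into the binary variables used for quantitative analysis, consider the following minimal Python sketch; the component names are taken from the rubric, but the coded responses, counts, and helper function are invented for illustration and are not the authors' actual procedure:

```python
# Illustrative sketch: converting a coder's rubric judgments into the
# binary variables used for quantitative analysis. Component names are
# drawn from Table 1; the student responses are invented.

COMPONENTS = ["Control", "Documentation", "Chain of Custody",
              "Identification", "Clarity"]

def to_binary(achieved):
    """Map the set of components a coder marked as achieved onto a
    0/1 vector ordered by COMPONENTS."""
    return [1 if c in achieved else 0 for c in COMPONENTS]

# One coder's judgment of a single pre-course response.
coded = to_binary({"Documentation", "Clarity"})

# Column sums across a cohort's responses would then feed the
# per-component chi-squared comparisons reported in Tables 2-3.
cohort = [to_binary({"Clarity"}), to_binary({"Documentation", "Clarity"})]
achieved_counts = [sum(col) for col in zip(*cohort)]
```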
Table 2. Chi-Squared Analysis of Pre-Assessment Tools

Tool                    Component           Chi-Squared         P-value
                                            2013     2014       2013     2014
Hypothetical Scenario   Control             3.18     0.66       0.93     0.41
                        Documentation       5.92     3.18       0.99     0.92
                        Chain of Custody    2.13     3.18       0.87     0.92
                        Identification      0.02     1.29       0.86     0.99
                        Clarity             9.50     9.50       <0.01    <0.01
Survey                  Science             5.92     1.00       0.99     0.31
                        Forensic Science    5.92     8.00       0.99     >0.99
                        Anthropology        7.61     1.00       >0.99    0.63
                        Bio Anth            7.61     1.00       >0.99    >0.99
                        Scientific Method   0.23     6.00       0.63     0.25
Table 3. Chi-Squared Analysis of Post-Assessment Tools

Tool                    Component           Chi-Squared         P-value
                                            2013     2014       2013     2014
Case Narrative          Evidence            1.29     4.44       0.26     <0.01
                        Writing             9.50     9.50       <0.01    <0.01
                        Communication       0.66     3.18       0.42     0.07
                        Explanation         2.13     2.13       0.14     0.14
Hypothetical Scenario   Control             0.23     0.23       0.63     <0.01
                        Documentation       5.92     3.18       0.01     <0.01
                        Chain of Custody    1.29     3.18       0.26     <0.01
                        Identification      0.24     0.65       0.63     0.99
                        Clarity             9.50     7.60       <0.01    <0.01
Survey                  Science             9.50     9.50       >0.99    >0.99
                        Forensic Science    5.92     3.18       0.99     0.07
                        Anthropology        7.60     7.60       >0.99    0.31
                        Bio Anth            9.50     9.50       >0.99    >0.99
                        Scientific Method   0.66     2.13       0.42     0.97