Graduate Assistant Institute Facilitators' Manual

Kristen Smith
James Madison University
9/9/2013


Contents

Graduate Assistant Institute Objectives
Objectives Map
Today we will (Day One)
Agenda At-a-Glance (Day One)
Homework (to be done prior to Day One)
Day One: Introduction to GAI (10:00-10:15)
Day One: Welcome to CARS (10:15-10:45)
Day One: Introduction to the Learning & Assessment Cycle (10:45-12:00)
Day One: Boxed Lunches at CARS (12:00-1:00)
Day One: Roles and Responsibilities of CARS (1:00-2:00)
Day One: Faculty Panel: Using Results for Improvement (2:00-3:00)
Today we will (Day Two)
Agenda At-a-Glance (Day Two)
Day Two: The Role of Assessment in Higher Education (10:00-11:00)
Day Two: Navigating CARS (11:00-11:25)
Day Two: Qualtrics Training (11:45-12:45)
Day Two: Lunch (1:00-2:00)
Day Two: Data Security at CARS (2:00-2:30)
Day Two: Roles & Responsibilities of GAs & Student Culture at CARS (2:30-3:30)
Day Two: Chrome Book Assessment & Wrap Up (3:30-4:00)
Assessment Description & Results
Summary of Assessment Results
    8 Cognitive Check-All-That-Apply and Ordering Questions
    5 Short Answer Cycle Questions
    Sense of Belonging Questions
Recommended Programming Changes Based on Assessment Results
Recommended Programming Changes Not Based on Assessment Results
Appendix A
    GAI Pretest Survey
    GAI Posttest Survey


Graduate Assistant Institute Objectives:

Graduate Assistants and Faculty completing the Graduate Assistant Institute, offered by the Center for Assessment & Research Studies at James Madison University, will:

Objective 1: Describe the roles and responsibilities of CARS
Objective 2: Describe the roles and responsibilities of Graduate Assistants at CARS
Objective 3: Identify the basic components of the learning and assessment cycle
Objective 4: Articulate the purpose of assessment
Objective 5: Explain the role of assessment in higher education
Objective 6: Cultivate new relationships with incoming and current Graduate Assistants
Objective 7: Develop a sense of belonging at CARS

Objectives Map

Knowledge objectives: OBJ 1-5. Affective objectives: OBJ 6-7.

Day 1
  Introduction to GAI: (none)
  Donna welcome presentation: OBJ 1, OBJ 2, OBJ 7
  Intro to Learning and Assessment Cycle: OBJ 3, OBJ 4
  Lunch @ CARS: OBJ 6, OBJ 7
  CARS Roles & Responsibilities presentation: OBJ 1
  Using Results Faculty Panel: OBJ 4, OBJ 7

Day 2
  Role of Assessment in Higher Ed presentation: OBJ 4, OBJ 5
  Navigating CARS presentation: OBJ 7
  Qualtrics Training: OBJ 2
  Lunch @ CARS: OBJ 6, OBJ 7
  Data Security presentation: OBJ 2
  GA Responsibilities/Student Culture presentation: OBJ 2, OBJ 7

Total by Individual Objective: OBJ 1 = 2, OBJ 2 = 4, OBJ 3 = 1, OBJ 4 = 3, OBJ 5 = 1, OBJ 6 = 2, OBJ 7 = 6
Total by Type of Objective: Knowledge = 11, Affective = 8


Day One – Facilitators’ Manual

Today we will:

• Introduce Graduate Assistants (GAs) and faculty to the Graduate Assistant Institute (GAI) and the Center for Assessment and Research Studies (CARS);

• Introduce and describe the learning and assessment cycle used in CARS;
• Discuss the roles and responsibilities of CARS;
• Get to know several CARS faculty members through an invited panel;
• Listen to faculty members describe examples of programs that are using assessment results to improve student learning, consider barriers to using results for improvement, and explain strategies for promoting the use of results.

Agenda At-a-Glance

Time | Session | Presenter
10:00-10:15 | Introduction to GAI | Kristen Smith
10:15-10:45 | Welcome to CARS | Donna Sundre
10:45-12:00 | Introduction to the Learning & Assessment Cycle | Kristen Smith
12:00-1:00 | Boxed Lunch @ CARS |
1:00-2:00 | Roles & Responsibilities of CARS | Mandalyn Swanson
2:00-3:00 | Faculty Panel on Using Results for Improvement | Christine DeMars, Sara Finney, Jeanne Horst, & Donna Sundre

Homework (to be done prior to Day One)

Participants should have:

Completed all pretest assessment items via a Qualtrics survey that was emailed to them one week prior to the GAI.

Read Chapter 1 of Suskie, L. (2009). Assessing student learning: A common sense guide (2nd ed.). Bolton, MA: Anker Publishing Company, Inc.

Read Kuh, G., & Ikenberry, S. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Read Montgomery, M. (2013). The best professional resource: A mentor. National Council on Measurement in Education Newsletter, 21(2), 2-5.

Participants also have the option to have:

Read Blaich, C., & Wise, K. (2011). From gathering to using assessment results: Lessons from the Wabash National Study (NILOA Occasional Paper No. 8). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.


Day One: Introduction to GAI (10:00-10:15)
Facilitator: Kristen Smith
Setting: Anthony-Seeger Room 9
Materials needed: GAI binders

Objectives Covered: None

The facilitator will lead this introduction:

• Get to know the participants:
  o Tell us your name and what your Graduate Assistantship in CARS will be this year (faculty members can tell the group what their main assessment assignment is for the year).
  o What is one thing you hope to learn from the GAI?
• Tell the audience what the Graduate Assistant Institute (GAI) is:
  o What is the purpose of the GAI?
  o What should participants take away from the GAI?
  o Introduce the GAI Student Learning Outcomes.
• Walk participants through the GAI binders.

Parallel process: This activity represents the first formal presentation of the GAI objectives to the participants. It allows participants to start identifying with their GA assignments and it facilitates a discussion about what we will be learning at the GAI. Also, this activity promotes interaction and communication among participants.

Day One: Welcome to CARS (10:15-10:45)
Facilitator: Donna Sundre
Setting: Anthony-Seeger Room 9
Materials needed: GAI Binder [hyperlink to folder that contains binder materials]

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 1\Sundre Welcome to CARS GAI 0813.ppt

Objectives Covered:
Objective 1: Describe the roles and responsibilities of CARS
Objective 2: Describe the roles and responsibilities of Graduate Assistants at CARS
Objective 7: Develop a sense of belonging at CARS

The presentation will begin by explaining to participants which objective(s) the session addresses. Donna will welcome new GAs and faculty members to the GAI and provide a brief history of the center. Following that, she will describe CARS' mission, vision, and areas of service. Donna then introduces the three levels of assessment policy:

  Assessment
  Accreditation
  Accountability

and the four levels of policy makers:

  JMU
  State
  Regional
  National

This leads into a description of the specific policy makers that exist at each level (e.g., the General Education Council at JMU, SCHEV at the state level, SACS COC at the regional level). Donna will go into more detail about each level of assessment policy, beginning with JMU's General Education Council and Assessment Advisory Council, then transitioning into JMU's Assessment Progress Template (APT) and Academic Program Review (APR) practices. Next, participants will learn about Virginia's policy-making body, SCHEV, and our regional accrediting body, SACS COC.

After learning about assessment policy, Donna transitions to more specific information about CARS, including why CARS engages in learning assessment. This includes a brief overview of the three assessment areas of CARS (General Education, Academic Program, and Student Affairs), which leads nicely into Donna's next point about JMU's culture of assessment. To wrap up the session, Donna conveys the role GAs play in accomplishing JMU's assessment goals and keeping CARS running. In conclusion, she offers words of wisdom and well wishes for each GA.

Parallel process: This session serves three purposes. It helps students feel more acclimated to CARS and increases their understanding of what CARS does. It also introduces how the work done at CARS fits into the larger assessment picture (i.e., at the university, state, and regional levels). Participants are challenged to think about how assessment work at CARS fits into the different levels of assessment policy. Furthermore, the presentation helps participants think about the importance of their assessment work and how it ties into the bigger picture at JMU and beyond.

Day One: Introduction to the Learning & Assessment Cycle (10:45-12:00)
Facilitator: Kristen Smith
Setting: Anthony-Seeger Room 9
Materials needed:
  Kuh & Ikenberry (2009) prep reading
  Suskie (2009) prep reading
  GAI Binder
  Blaich & Wise (2011) optional prep reading

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 1\Intro_Assessment_Cycle.pptx

Objectives Covered:
Objective 3: Identify the basic components of the learning and assessment cycle
Objective 4: Articulate the purpose of assessment

The presentation will begin by explaining to participants which objective(s) the session addresses. Kristen will introduce assessment, what it is and what it is not. This involves showing participants the learning and assessment cycle for the first time. This same learning and assessment cycle will be shown throughout the GAI.


After presenting the learning and assessment cycle, Kristen discusses the reasons we engage in assessment. This will be a review of what was discussed in the previous session. The point of reviewing this material is to emphasize that the purpose of assessment should be to improve student learning. This concept is then related back to the Kuh & Ikenberry (2009) prep reading that students were expected to complete prior to attending the GAI. The "purpose of assessment" is introduced first in order to facilitate subsequent discussions about each stage of the learning and assessment cycle. If attendees understand, in big-picture terms, why we engage in assessment, this will facilitate their understanding of the specific stages of the cycle. Again, understanding the why should help students better comprehend the how.

Following that, each stage of the cycle is addressed by anywhere from 2 to 7 slides. Several sections also have "Check In" questions (see below). The "Analyze/Maintain Information" section includes a handout provided by Jeanne Horst. The handout is used as part of a group activity about thinking critically and analyzing data. This activity asks participants to think through causal relationships and the three conditions needed to establish causation. Specifically, participants are asked to describe the results displayed in a bar graph, analyze the extent to which the graph supports a causal claim, consider alternative explanations for the results displayed in the graph, and finally provide examples of supplemental evidence that would be required for making a causal statement. This activity should be completed in groups of 5 or fewer participants. This activity should take ~15 minutes: 10 minutes to answer the questions in small groups and 5 minutes for discussion with the larger group.

After briefly explaining each stage of the learning and assessment cycle, Kristen will conclude the presentation by emphasizing using results to improve student learning, which is the concept that was introduced at the beginning of the presentation to explain why we engage in assessment. During this section, participants are asked to discuss what they learned from one of the prep readings, Kuh & Ikenberry (2009). Finally, the presentation ends by asking participants to recall the steps in the learning and assessment cycle, and then Kristen summarizes what was covered during the presentation (i.e., linking the presentation back to the objectives it was created to address).

To facilitate processing and identify gaps in understanding during the presentation, "Check In" slides are incorporated throughout. Some slides include fill-in-the-blank and/or multiple-choice questions, while others include discussion-based questions. Discussion-based questions can be addressed in groups, individually, or both.

“Check In” Questions and Answers:

Question: Jon Snow is in the English department and teaches poetry. He says, "Poetry is a mystical subject that transforms the mind in a spiritual way. It cannot be assessed." How would you respond? How can you convince him that you can assess poetry? What would be the first step he should take to assess poetry in the English department?
Response Option: Discussion
Answer: Think and talk in terms of things students should "know, think, or do" as a result of English department classes; articulate objectives that are specific, observable, and measurable.

Question: Sansa Stark has written her first objective! However, it's not quite ready to go. Can you point out any flaws and help improve this objective? "The faculty will teach the kids to understand science."
Response Option: Discussion
Answer: Avoid words like "understand" or "teach" because they aren't clear; use more specific language; what is meant by "kids"? what is meant by "science"?

Question: Jaime Lannister teaches in the Philosophy department at JMU. He needs to find a measure to assess one of the program's learning objectives, and his program is on a budget so they can't afford to pay for a test. Which type of measure should he look for?
A. A commercial measure that assesses a non-cognitive construct
B. A non-commercial measure that uses a Likert scale
C. A non-commercial, cognitive measure
D. A commercial measure of self-reported attitudes
Response Option: Multiple Choice
Answer: C. A non-commercial, cognitive measure

Question: Tywin Lannister told his grandson, Joffrey Baratheon, that collecting implementation fidelity information can help him make more informed, appropriate decisions about the programming they use to teach his followers about the history of King's Landing. Joffrey, per usual, offensively disagrees with Tywin. How would you convince Joffrey that implementation fidelity data are essential to making informed decisions about the history programming for King's Landing?
Response Option: Discussion
Answer: Unlocking the black box; planned versus implemented programming; program on paper versus delivered program; allows you to make the most appropriate conclusions about programming.

Question: For some time now, Ned Stark has been assessing the people of Winterfell's ability to identify the ruling families of the North. One day he asks you, "What is the true purpose of all this assessing? Why must we assess?"
A. Because the Lord of Light commands it
B. To satisfy the regional accrediting bodies of Westeros
C. To help our people learn more about the rulers of the North
D. To determine which people receive food rations in times of war
Response Option: Multiple Choice
Answer: C. To help our people learn more about the rulers of the North

Question: What did you learn from Wabash?
Response Option: Discussion
Answer: Programs are getting better at assessment but still not so good at using results to improve student learning; quality data and good assessment methodology aren't the biggest problem; assessment is necessary but not sufficient for improving student learning.

Parallel process: This presentation asks participants to integrate knowledge they learned during the “Introduction to CARS” presentation with new concepts. Participants should be using information they have already learned to help them acquire new knowledge. Another aspect of this presentation asks participants to integrate information from a prep reading that they were asked to do prior to attending the GAI (Kuh & Ikenberry, 2009). This kind of integration is common at CARS and in graduate school coursework and research, thus it is beneficial to model this in the GAI programming. Participants will also participate in a group activity that requires critical thinking skills and data analytic skills. It’s very important that participants can work in teams and think both collectively and independently at the same time. This activity helps to facilitate analytical thinking at the group and individual levels. Additionally, “Check Ins” are used throughout the presentation as a means of formative assessment. This should allow facilitators to identify and address gaps in understanding during the presentation.

Day One: Boxed Lunches at CARS (12:00-1:00)
Facilitator: None
Setting: Anthony-Seeger Room 9
Materials needed: None

Objectives Covered:
Objective 6: Cultivate new relationships with incoming and current Graduate Assistants and new faculty members
Objective 7: Develop a sense of belonging at CARS

Boxed lunches are provided by Aramark. Other members of CARS should also be invited to lunch. Before lunch, participants should be instructed to meet at least one faculty member or graduate assistant during lunch that they have never met before.

Day One: Roles and Responsibilities of CARS (1:00-2:00)
Facilitator: Mandy Swanson
Setting: Anthony-Seeger Room 9
Materials needed: GAI Binder

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 1\CARS_Roles_Responsibilities_updated_7.25.13.pptx

Objectives Covered: Objective 1: Describe the roles and responsibilities of CARS

The presentation begins by explaining to participants which objective(s) the session addresses. Mandy will begin by explaining the difference between evaluation and assessment; this should situate what CARS does (student learning and development outcomes assessment) within the grander scheme of the entire evaluation process. Here, Mandy should review each of the different kinds of assessment that make up the evaluation process (e.g., cost, dropout, student satisfaction) so that participants can better understand the differences between each part of the evaluation process and distinguish "Student Learning and Development Assessment" from the other components of evaluation. This leads nicely into a conversation about what CARS does and does not do. For instance, CARS does one part of the evaluation process, "Learning and Development outcomes assessment." Participants are asked to think about and offer their ideas about why CARS engages in Learning and Development outcomes assessment but no other parts of the evaluation process.

Next, the presentation addresses the three things that CARS does. This "3 Things" framework will come up in other presentations as well. It is important that participants understand the three things CARS does because in later presentations they will be asked to recall this information and use this framework to think about GA assignments. Also, the remainder of the slides in the presentation are structured according to the "3 Things CARS does" framework.

The first thing CARS does is Academic Program assessment. The presentation describes the role of the PASS office and the APT process. The next thing CARS does is General Education assessment. Data from the university-wide assessment day are often used to assess General Education programming. During this part of the presentation, participants are shown the five General Education clusters for the first time. It is important that participants know there are five clusters and which faculty member is the CARS liaison for each cluster. Next, participants are asked to think about and articulate why CARS assesses General Education programming. For instance, CARS assesses General Education programming because it makes up about one-third of every JMU graduate's degree coursework. Also, it constitutes the liberal education component of every JMU degree. Most importantly, CARS assesses General Education programming to provide information that can be used to improve student learning in the five different clusters of General Education. Here, Mandy should point out that the purpose of any assessment endeavor should be to use results to improve student learning. The third and final duty of CARS is to assess Student Affairs programming.

The presentation ends by describing the university-wide assessment day organized and orchestrated by CARS GAs. This highlights the role of GAs in the assessment day process and describes how the data collected on assessment day are used. Also, a few assessment day tasks are discussed. To conclude this session, participants are presented with the session objectives (i.e., linking the presentation back to the objectives it was created to address), and they are asked to name one of the roles of CARS.

To facilitate processing and identify gaps in understanding during the presentation, "Check In" slides are incorporated throughout. Some slides include fill-in-the-blank and/or multiple-choice questions, while others include discussion-based questions. Discussion-based questions can be addressed in groups, individually, or both.

“Check In” Questions and Answers:

Question: Which of the following is true?
A. Cersei Lannister does not have to submit an APT because she is the coordinator of a certificate program, not an academic degree program.
B. Margaery Tyrell's program only has one student learning objective, and her program didn't want to collect assessment data this year, so they are not required to submit an APT.
C. Melisandre's program should describe objectives, mapping of objectives to assessment method, methods of assessment, assessment results, and how results were used to improve learning, in detail, on their APT.
D. Theon Greyjoy's program received low ratings on its APT last year, thus they had to terminate three teaching positions within their department.
Response Option: Multiple Choice
Answer: C. Melisandre's program should describe objectives, mapping of objectives to assessment method, methods of assessment, assessment results, and how results were used to improve learning, in detail, on their APT.

Question: In your own words, describe what an APT is.
Response Option: Discussion

Question: What kinds of information should academic and degree programs include in their APT?
Response Option: Discussion

Question: What are some reasons why programs have to submit an APT?
Response Option: Discussion

Question: JMU conducts a university-wide assessment day (A-day) ________ times per year!
Response Option: Fill in the blank
Answer: Two

Question: On A-day, cognitive tests are administered to measure learning in the five clusters of _________ __________ programming.
Response Option: Fill in the blank
Answer: General Education

Question: On A-day, non-cognitive tests are administered to measure attitudes and behaviors related to _________ _________ programming.
Response Option: Fill in the blank
Answer: Student Affairs

Question: Data are collected for A-day using a longitudinal design; this means each student will take the same test ________ times.
Response Option: Fill in the blank
Answer: Two

Question: A-day cannot happen without the help of ________ ________!
Response Option: Fill in the blank
Answer: CARS GAs

Parallel process: The presentation teaches participants to think about CARS responsibilities as the “3 Things” CARS does. This “3 Things” framework is used in other presentations as well. Participants should then use this framework to later recall the roles of CARS. Moreover, in later presentations, participants are encouraged to think about CARS GA assignments through this same framework. The goal is to give participants an organizing tool that they can use to understand what CARS does and (after the presentation during day 2) what GAs at CARS do. Additionally, “Check Ins” are used throughout the presentation as a means of formative assessment. This should allow facilitators to identify and address gaps in understanding during the presentation.

Day One: Faculty Panel: Using Results for Improvement (2:00-3:00)
Facilitators: Kristen Smith, Sara Finney, Dena Pastor, Donna Sundre, & Christine DeMars
Setting: Anthony-Seeger Room 9
Materials needed:
  Kuh & Ikenberry (2009) prep reading
  Blaich & Wise (2011) optional prep reading

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 1\GAI Faculty Panel.docx

Objectives Covered:
Objective 4: Articulate the purpose of assessment
Objective 7: Develop a sense of belonging to CARS

To begin, Kristen will explain to participants which objective(s) the panel addresses. This invited panel includes assessment and measurement experts who work in CARS. To begin the panel, each faculty member introduces themselves, briefly describes the assessment work they do in CARS, and identifies the new GAs or faculty that they will be working with during the upcoming year. During the panel, each faculty member is asked to respond to the following topics:

1) What are some examples of how programs you have worked with are using results to improve student learning (i.e., assessing, using results to make program/curriculum/pedagogy changes, re-assessing to see if those changes actually improved student learning)?
2) What is the most common challenge or barrier to "using results to improve student learning" that you have encountered either here at JMU or elsewhere? From your experiences, why do you think more programs aren't using results?
3) How can students and faculty members explain, promote, and facilitate the process of using results for improvement to the clients they will be working with? Any strategies that have worked well for you in the past?
4) If any GAs or faculty members are interested in learning more about using results to improve student learning, are there any resources you've come across that you would recommend (i.e., articles, book chapters, websites of other programs that are using results, webpages, presentations, online modules, etc.)?

The theme of the panel revolves around using results to improve student learning, and the barriers that often prevent programs from using results, making curricular changes, and re-assessing those changes. Panel members will also discuss strategies that GAs and faculty members can use to talk about using results to improve student learning and how they can facilitate this in their assistantships and other assessment work. Finally, panel members will provide resources that GAs and faculty members can consult to learn more about using results for improvement. Also, participants should be encouraged to ask questions throughout for clarification. Faculty members’ responses can be found on the N-drive on the document that lists the faculty panel questions. Parallel process: This activity introduces participants to some of the faculty members that teach and work in CARS. It gives participants a sense of the kinds of assessment work done at CARS. The panel is a great way to have participants and faculty engage in a conversation about an important and relevant issue in higher education, using results to actually improve student learning. Furthermore, it models the collaborative environment of CARS; an environment in which faculty and students work together on assessment projects and other scholarly research.


Day Two – Facilitators’ Manual

Today we will:

• Discuss the role of assessment in higher education policy and practice;
• Explore CARS to become more comfortable with the center and to discover the location of important resources;
• Be introduced to Qualtrics Survey software through a presentation by the Center for Instructional Technology (CIT);
• Describe how to keep CARS data and sensitive information secure;
• Discuss the roles and responsibilities of GAs that work in CARS;
• Complete posttest assessment items using Google Chrome Books;
• Debrief and wrap up.

Agenda At-a-Glance

Friday, August 9th

Time | Session | Presenter
10:00-11:00 | The Role of Assessment in Higher Education | Donna Sundre
11:00-11:25 | Navigating CARS | Sharon Sipe & Paula Love
11:30-11:45 | Transition to Carrier |
11:45-12:45 | Qualtrics Training | E. David Stoops
12:45-1:00 | Transition back to CARS |
1:00-2:00 | Pizza Lunch @ CARS |
2:00-2:30 | Data Security | Dena Pastor
2:30-3:30 | Roles & Responsibilities of GAs and Student Culture | Devon Hopkins
3:30-4:00 | Chrome Book Assessment & Wrap Up | Kristen Smith

Day Two: The Role of Assessment in Higher Education (10:00-11:00)
Facilitator: Donna Sundre
Setting: Anthony-Seeger Room 9
Materials needed: None

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 2\The Role of Assessment in Higher Education GAI 0813.ppt


Objectives Covered:
Objective 4: Articulate the purpose of assessment
Objective 5: Explain the role of assessment in higher education
Objective 7: Develop a sense of belonging to CARS

The presentation will begin by explaining to participants which objective(s) the session addresses. Donna will remind participants what assessment is and why we engage in it. This information was presented in two different presentations during day one, so participants should already feel comfortable with it. Higher education institutions typically engage in assessment for two reasons: 1) improving effectiveness and 2) providing accountability. However, assessment endeavors driven by improvement and assessments driven by accountability often look very different. Donna describes "the tension" that is created between assessing for improvement and assessing for accountability purposes. For instance, assessment for improvement is often intrinsically motivated, guided by faculty, and committed to improving student learning. Assessment for accountability is typically externally dictated, requires faculty compliance, and feels like an audit. Donna makes the important point that "the tension" she just described is unnecessary because quality assessment should meet any accountability demands; the two should never be at odds with each other.

Next, Donna provides a brief history of assessment practices at JMU, assessment culture, and the establishment of a university-wide assessment day. In 1986, Dary Erwin established CARS. Along with a secretary and two GAs, Dary started to develop a culture of assessment at JMU. One year later, in 1987, JMU held the first ever university assessment day and administered the first university alumni survey. A few years later, in 1998, Dary founded the Assessment & Measurement PhD program.

This presentation will also link the role of CARS to national and state legislative and policy issues. For instance, consider Peter Drucker's assertion that "higher education is in deep crisis." Assessing student outcomes and using that information to change curricula, pedagogy, and programming can lead to improved student learning outcomes; this demonstrates that there is value in higher education. Similar to day 1, participants will be introduced to policy-making bodies like SCHEV and their role in higher education assessment. Donna introduces some additional organizations such as the Lumina Foundation, the Association of Governing Boards, the Social Science Research Council, and the Association of American Colleges and Universities (AAC&U). Donna also talks about European assessment practices, including the Bologna process. Donna presents current news headlines and updates that will affect higher education policy and assessment practices; this includes several articles and citations from "Inside Higher Education." Through her examples and discussion, Donna gives participants a sense of the current political landscape in higher education. She also makes it clear that assessment for improvement and assessment for accountability are often at odds with one another, but should be complementary processes.

Parallel process: This presentation should help participants articulate the purpose of assessment by highlighting the reasons we engage in assessment. In order for participants to grapple with specific information about the assessment process, they need to first consider assessment from a broader perspective. This perspective includes the reasons we engage in assessment and the role assessment plays in local, regional, and national decisions. Assessment is a tool that higher education should use to demonstrate its worth. Again, in order for students to understand assessment within the context of CARS, they need to take a step back and consider the bigger picture. For example, they need to be aware of assessment policy makers at the state, regional, and national levels, and how assessment policy shapes assessment practice. They should be challenged to think about "the tension" between assessment for improvement and assessment for accountability. After the presentation, participants should know how to articulate the specific role of assessment at JMU, and the broader role assessment plays in higher education. This should give participants a more holistic view of assessment, not only in the context of JMU, but at the national level too.

Day Two: Navigating CARS (11:00-11:25)
Facilitators: Sharon Sipe & Paula Love
Setting: Anthony-Seeger Room 9
Materials needed:
  GAI binder
  CARS bingo board

Location of Presentation Materials:
N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 2\Navigating Handouts on _072913.ppt
N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 2\Navigating the Center Presentation_072913.ppt

Objectives Covered: Objective 7: Develop a sense of belonging to CARS

To begin, Kristen will explain to participants which objective(s) the session addresses. This presentation will give participants information about CARS such as building information, safety protocols, emergency procedures, center resources, and an introduction to the administrative team. The information Sharon and Paula will present mainly deals with logistics such as:

  Best safety practices at CARS
  Printing at CARS
  Reserving rooms in CARS
  Using the CARS library and computer lab
  Mailing packages from CARS
  Getting supplies from the supply closet
  Receiving intercampus mail
  Using the N-drive
  Maintaining the CARS kitchen (GAAMP)

After discussing general information about CARS and logistic information, participants will be given a CARS bingo board. Then, they must follow a set of “clues” to explore CARS and find the appropriate stickers to fill in their bingo boards. These “clues” contain information about CARS and about faculty members that work in CARS. Parallel process: Through this presentation, students are physically exploring the center, while also getting to know faculty members. This is intended to help participants feel more acclimated to the center physically and psychologically because they will learn how to navigate the center and they will learn personal information about faculty members along the way. Additionally, this gives participants a chance to naturally form teams and work with fellow participants to figure out the clues and track down the stickers for their bingo boards.

Day Two: Qualtrics Training (11:45-12:45)
Facilitator: Dave E. Stoops (CIT)
Setting: Carrier Library
Materials needed: None

Location of Presentation Materials: None

Objectives Covered: Objective 2: Describe the roles and responsibilities of Graduate Assistants at CARS


To begin, Kristen will explain to participants which objective(s) the session addresses. Dave Stoops from the Center for Instructional Technology (CIT) will introduce participants to Qualtrics Survey software. Participants will have to use Qualtrics surveys as part of their GA responsibilities in CARS. Dave will spend about 15 minutes reviewing basic Qualtrics information. He will demonstrate some basic features of Qualtrics including how to:

  Use different types of items;
  Use survey flow;
  Use skip logic;
  Distribute a Qualtrics survey;
  Download survey data from Qualtrics.

Afterwards, participants will use Qualtrics to complete a group activity. Participants should divide into groups of 2-3. Each group will have 15 minutes to create their own short survey (~3-5 items). These short surveys should include 3-5 different types of questions, and at least one item must use skip logic. After creating their survey, each group will distribute it to the other groups via their JMU email account. Each group will have 10 minutes to send their survey to the other groups and respond to the other groups' surveys. Finally, each group will have 15 minutes to download the data from their survey into Excel and into SPSS. During the Qualtrics activity, Dave and Kristen will be walking around to assist groups if they have any difficulties with Qualtrics. After participants have completed the activity, Dave will spend the final five minutes addressing any lingering questions or issues that participants experienced during the activity.

Day Two: Lunch (1:00-2:00)
Facilitator: None
Setting: Anthony-Seeger Room 9
Materials needed: None

Objectives Covered:
Objective 6: Cultivate new relationships with incoming and current Graduate Assistants and new faculty members
Objective 7: Develop a sense of belonging at CARS

Pizza from Papa Johns will be provided by CARS. Other members of CARS should also be invited to lunch. Before lunch, participants should be instructed to meet at least one faculty member or graduate assistant during lunch that they have never met before.

Day Two: Data Security at CARS (2:00-2:30)
Facilitator: Dena Pastor
Setting: Anthony-Seeger Room 9
Materials needed: GAI Binder

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 2\data mgmt

Objectives Covered: Objective 2: Describe the roles and responsibilities of Graduate Assistants at CARS


Participants are welcomed back from lunch, and Kristen will explain which objective(s) the session addresses. Dena begins by describing to participants the importance of data security at CARS. This includes a list of "data dos and data don'ts." For instance, participants should always log off of computers after use and should never take identifiable data outside of CARS. Additionally, CARS has data on Scantron forms that must be kept locked at all times and disposed of in the proper location. Dena also explains to participants that under the Family Educational Rights and Privacy Act (FERPA), they cannot release information about students to anyone.

Dena gives a brief introduction to the data management course that is required of all CARS GAs. She explains that CARS GAs are required to use syntax and statistical software to appropriately process and present data. Also, CARS GAs must be able to conduct psychometric analyses on data and make analytical comparisons. GAs are also responsible for writing reports and presenting information from data analyses. Furthermore, she describes several tasks that data management GAs will be responsible for completing.

Day Two: Roles & Responsibilities of GAs & Student Culture at CARS (2:30-3:30)
Facilitator: Devon Hopkins
Setting: Anthony-Seeger Room 9
Materials needed:
  Montgomery (2013) NCME Newsletter prep reading
  GAI Binder

Location of Presentation Materials: N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Presentations\day 2\GA_Roles_Student_Culture_updated_7.25.13.pptx

Objectives Covered:
Objective 2: Describe the roles and responsibilities of Graduate Assistants at CARS
Objective 6: Cultivate new relationships with incoming and current Graduate Assistants and new faculty members
Objective 7: Develop a sense of belonging to CARS

To begin, Kristen will explain to participants which objective(s) the session addresses. Devon will begin by reviewing the stages of the learning and assessment cycle with participants. Participants were introduced to the learning and assessment cycle during day 1, so this information should be review for them. Reviewing the cycle should help participants build self-efficacy with this information. While reviewing the assessment cycle, Devon points out that CARS GAs are not responsible for the final stage of the learning and assessment cycle, "using results to improve student learning." Programs are responsible for making changes to pedagogy and curricula. After programs make changes, CARS GAs can help them re-assess to see whether the changes actually improved student learning. Here, Devon pauses to have participants think about why this is, and then volunteers some rationale behind it. Thinking through this and formulating their own rationale should help participants more deeply process this information.

Then Devon asks participants to recall the "3 Things" CARS does (information they should have learned during day 1) and use their knowledge of CARS' responsibilities as a framework to think about GA roles and assignments within CARS. Also, the remainder of the slides for this presentation are structured according to the "3 Things CARS does" framework. Next, Devon will provide an overview of the GAs in CARS through the model of the "3 Things" CARS does. For instance, first she will discuss PASS, which includes GA responsibilities related to Academic Program assessment. Next, she discusses GA assignments that assess General Education clusters. Finally, Devon describes Student Affairs assessment GAs. These GA responsibilities can be conceptualized using the framework introduced during day 1; each GA falls into at least one of the three categories.


Devon then transitions to discuss more specific logistics that apply to all GAs regardless of which category they fall into. A crucial portion of this presentation involves having participants operationally define GA work and understand what types of behaviors do and do not constitute GA work. Devon will discuss the major requirements of all CARS GAs:

  Meet deadlines
  Attend staff meetings
  Proctor A-day
  Record GA hours each week & share with GA supervisor

Let's define "GA work." What are your ideas?

GA work is:
  Producing a product; having something to show for it
  An action that progresses or accomplishes a GA task
  Ask yourself: Who is benefitting from this?

GA work is not:
  Physically being at your desk in CARS during "scheduled" GA hours
  Homework, class prep, staff meeting, projects, round table, practicum, research agenda, studying, etc.

The rest of the presentation focuses on student culture and mentorship at CARS. This section of the presentation draws on ideas from "The best professional resource: A mentor" (Montgomery, 2013). Devon will describe how to find a mentor and what to look for in a mentor relationship. All of these concepts come from the NCME article that participants should have read prior to attending the GAI. Devon should point out that although the ideas in the NCME article are fairly broad, they could easily be applied to finding a mentor at CARS. This section also includes quotes from current students and graduates of the Assessment & Measurement PhD program. The quotes exemplify the kinds of relationships CARS GAs have with each other and with CARS faculty. Many of the quotes provide examples of the mentorship that GAs can get from their fellow GAs and faculty members at CARS. Devon concludes the presentation by discussing traditions at CARS and then asks participants to name one responsibility of CARS GAs.

To facilitate processing and identify gaps in understanding during the presentation, "Check In" slides are incorporated throughout. Some slides include fill-in-the-blank and/or multiple-choice questions, while others include discussion-based questions. Discussion-based questions can be addressed in groups, individually, or both.

“Check In” Questions and Answers:

Question: GAs at CARS will directly participate in all of the stages in the learning and assessment cycle except which one? Why don't CARS GAs engage in this stage? Who is responsible for this stage?
Response Option: Discussion
Answer: CARS GAs can help programs collect data to see if the program changes they made actually improved student learning, but it's not the job of CARS GAs to make program changes.

Question: GAs fall into three categories of assessment: _________ __________, ____________ ___________, and _________ __________.
Response Option: Fill in the blank
Answer: Academic Program, General Education, and Student Affairs

Question: If your GA deals with assessing only academic programs, your GA is in ________.
Response Option: Fill in the blank
Answer: PASS

Question: A-day GAs work with _________ __________ and ____________ ___________ assessment.
Response Option: Fill in the blank
Answer: General Education and Student Affairs


Parallel process: During this presentation, participants are asked to provide their own rationale as to why CARS GAs are responsible for some aspects of the learning and assessment cycle but not all. Challenging participants to work through the reasoning themselves, rather than just feeding them the answers, should help them process this information more deeply. Also, participants are required to read an article about mentorship and apply concepts from that article to the setting of CARS. Over the next year, participants' GA tasks and course assignments will require them to apply material in a similar way.

Day Two: Chrome Book Assessment & Wrap Up (3:30-4:00)
Facilitator: Kristen Smith
Setting: Anthony-Seeger Room 9
Materials needed:
  Chrome Books
  Qualtrics survey link

Objectives Covered: None

To begin, Kristen will remind participants of the GAI objectives. This should jog their memories about what they have learned over the past two days. Then she will show students the Chrome Books and explain that they can be used to facilitate embedded assessment and other portable methods of data collection. Participants will be emailed a link that allows them to access the posttest Qualtrics survey. Participants will spend ~20-30 minutes responding to the posttest survey using the Chrome Books. After all participants have finished their assessment, GAI programming will officially end, and everyone will be invited to regroup downtown for happy hour.


Assessment Description & Results

Students were invited to take a pretest Qualtrics survey one week before GAI programming began. Students completed the posttest during the final 30 minutes of the GAI using Chrome Books. Seven students participated in the GAI, but only six completed both the pretest and posttest due to issues with logging into an email account. Thus, the sample size analyzed was six. Please see Appendix A for links to the pre and posttest measures.

Pretest assessment included:

  8 cognitive questions (check all that apply, ordering)
  5 short answer questions that asked participants to describe each stage in the learning-assessment cycle
  3 short answer questions that asked participants to articulate the purpose of assessment and the role of assessment in higher education
  18-item Sense of Belonging Scale (Freeman, Anderman, & Jensen, 2007)
  3 qualitative feedback questions:
    o What are you most excited for working at CARS?
    o What are you most nervous about working at CARS?
    o What do you hope to learn from the GAI?

Posttest assessment included:
  8 cognitive questions (check all that apply, ordering) (SAME AS PRETEST)
  5 short answer questions that asked participants to describe each stage in the learning-assessment cycle (SAME AS PRETEST)
  3 short answer questions that asked participants to articulate the purpose of assessment and the role of assessment in higher education (SAME AS PRETEST)
  18-item Sense of Belonging Scale (Freeman, Anderman, & Jensen, 2007) (SAME AS PRETEST)
  4 qualitative feedback questions (DIFFERENT FROM PRETEST):
    o What session(s) would you alter so that future students or faculty members could better meet the objectives of the GAI? And what would you alter? Please be specific.
    o What session(s) should be added so that future students or faculty members could better meet the objectives of the GAI? Please be specific.
    o Do you feel that GAI programming gave you ample opportunities to meet the objectives of the GAI? Why or why not?
    o What questions do you still have that were not answered by GAI programming?

Assessment results, including the syntax used for analyses, the complete dataset from the 2013 GAI, and the typed responses for the 5 short answer assessment cycle questions, can be found here:

N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Assessment\GAIsyntax.sps
N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Assessment\GAI_PRE_POST_merged.sav
N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Assessment\Cohensd_and_ShortAnswers_PrePost.xlsx

Summary of Assessment Results

The following PowerPoint presentation contains a summary of the assessment results from the GAI:

N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\GAI_staff_meeting_POST.pptx

The PowerPoint presentation presents the results broken up according to question type. The following summarizes the information presented in the PowerPoint.


8 Cognitive Check-All-That-Apply and Ordering Questions ⇒ For the 8 cognitive (check all that apply/ ordering) items, at posttest, percent correct ranged from 75% (6/8) to

88% (7/8) ⇒ Every participant responded incorrectly to the same question at posttest:

o It is plausible that this question was too difficult for students o Also, the presentation that was intended to cover this information could be altered. That is, a bullet

point could be added to one of the slides that explicitly describes why CARS does NOT engage in classroom assessments (i.e. assessments that are strictly classroom based and not linked directly to General Education or Academic program assessment).

o Based on my implementation fidelity notes, the presenter ended the CARS Roles and Responsibilities presentation 22 minutes early. Also, according to my implementation notes, the presenter did not cover classroom assessments. Furthermore, the presenter did not spend enough time explaining each of the different components of the Evaluation process and why CARS only engages in one component of the evaluation process.

o According to my implementation fidelity notes, participants were not as engaged with this presentation as they were with others. This could be due to the fact that the presentation came after lunch or due to characteristics of the presenter.

o For future institutes, it should be decided whether this question is a good question and whether we want participants to know the information in this question or not.

o If future GAI programming is intended to cover this information, the CARS Roles and Responsibilities presentation is a great place to do so, especially the slide that breaks down the different components of the evaluation process. More time should be spent explaining the various components of the evaluation process and why CARS engages only in learning and development outcomes assessment.

o The presenter should make sure participants understand how each component of the evaluation process differs from learning and development outcomes assessment, and how classroom assessment differs from program or General Education assessment.

⇒ Pretest mean = 3.5 out of 8 possible points
⇒ Posttest mean = 6.7 out of 8 possible points
⇒ We observed a 3.2 point increase at posttest out of 8 possible points
⇒ This constituted a large increase in scores from pre to posttest
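The percent-correct figures above treat each check-all-that-apply or ordering item as simply right or wrong. The sketch below illustrates one plausible scoring rule for the check-all-that-apply format, in which an item is credited only when the selected options exactly match the key; the answer key shown is invented for illustration, and the actual GAI scoring rule should be confirmed against the SPSS syntax file listed earlier.

# Minimal sketch of a dichotomous scoring rule for a check-all-that-apply item.
def score_check_all(selected, key):
    """Return 1 if the selected options exactly match the key, else 0."""
    return int(set(selected) == set(key))

key_item_1 = {"A", "C", "D"}                              # hypothetical answer key
print(score_check_all({"A", "C", "D"}, key_item_1))       # 1 -- exact match
print(score_check_all({"A", "C"}, key_item_1))            # 0 -- missed an option
print(score_check_all({"A", "B", "C", "D"}, key_item_1))  # 0 -- selected an extra option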

5 Short Answer Cycle Questions

⇒ At posttest, participants did a great job articulating the purpose of assessment: improving student learning
⇒ At pretest, several participants did not explain implementation fidelity; however, nearly every participant described implementation fidelity at posttest
⇒ At posttest, responses contained information that was discussed during the GAI, including:
o One strategy for using results offered by a faculty panel member
o Commercial/non-commercial instruments
o Cognitive/non-cognitive instruments
o Existing versus new instruments


⇒ Overall, participants seemed to show the greatest improvement from pre to posttest in describing the “Articulate Objectives & Map to Curriculum,” “Collect Information & Implementation Fidelity,” and “Use Results” steps of the assessment cycle.

⇒ Please see the Excel spreadsheet containing the pretest and posttest responses for further pre/post comparison.

Sense of Belonging Questions

⇒ For the 18 non-cognitive Sense of Belonging (SOB) items, posttest SOB scores ranged from 86% (77/90) to 97% (87/90)
⇒ Pretest mean = 78.7 out of 90 possible points
⇒ Posttest mean = 82.0 out of 90 possible points
⇒ We observed only about a 3.3 point increase at posttest out of 90 possible points, a modest gain
⇒ All but one participant increased on SOB from pre to post (a sketch for computing per-participant change appears at the end of this subsection)
o One participant decreased by 3 points
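A per-participant change score is enough to verify statements like “all but one participant increased.” The minimal sketch below assumes hypothetical column names (SOB_pre and SOB_post) in the merged file; the 85/90 cutoff in the final line is arbitrary and only illustrates how one might check for the possible ceiling effect discussed in the recommendations below.

import pandas as pd

# "SOB_pre" and "SOB_post" are hypothetical column names for the Sense of
# Belonging totals; substitute the variable names in the merged .sav file.
df = pd.read_spss("GAI_PRE_POST_merged.sav")
df["SOB_change"] = df["SOB_post"] - df["SOB_pre"]

print(f"Mean SOB change: {df['SOB_change'].mean():.2f} out of 90 possible points")

# Flag any participant whose score decreased from pretest to posttest.
print(df.loc[df["SOB_change"] < 0, ["SOB_pre", "SOB_post", "SOB_change"]])

# Rough ceiling check: share of participants at or above an arbitrary high
# cutoff (85 of 90 points) at pretest.
print(f"Pretest SOB >= 85/90: {(df['SOB_pre'] >= 85).mean():.0%}")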

Recommended Programming Changes Based on Assessment Results

⇒ Based on qualitative feedback from GAI participants:
o The Roles and Responsibilities of CARS GAs should be one of the earlier presentations because it is “a little stressful not knowing what exactly we’ll be doing or helping with.” Also, the Roles and Responsibilities of CARS and the Roles and Responsibilities of CARS GAs presentations could be given back-to-back, or at least closer together, because they are interdependent and presenting them together would promote information retention.

o A better introduction/icebreaker for the GAI. An icebreaker might help ease the stress of being a “new” person, help transition students into their new GA roles, and help them feel more comfortable in CARS.
o Continue to highlight and encourage CARS student culture.
o To help avoid communication errors with future incoming students, disseminate new students’ preferred method of contact to current students. Make sure that new students coming from other institutions are familiar with all of the email accounts they need to set up and respond to.

o An example of a typical client interaction. This might entail walking participants through some “mock” scenarios with a GA and a client. There could be a few different types of “client” (e.g., the confused client, the disgruntled client, the entitled client).

⇒ Concerning the question that every participant answered incorrectly (discussed above), decide whether this question is a good question and whether participants should be expected to understand CARS roles and responsibilities at that level after the GAI.

o If this item is retained:
The CARS Roles and Responsibilities presentation is a great place to cover the information in this question, especially the slide that breaks down the different components of the evaluation process. More time should be spent explaining the various components of the evaluation process and why CARS only engages in learning and development outcomes assessment.
The presenter of this material should make sure the participants understand how each of the components of evaluation differs from learning and development outcomes assessment, and the differences between classroom assessment and program or general education assessments.
The presentation that was intended to cover this information could be altered. That is, a bullet point could be added to one of the slides that explicitly explains why CARS does not engage in classroom assessments (i.e., assessments that are strictly classroom based and not linked directly to General Education or Academic program assessment).


⇒ Otherwise, participants showed gains from pre to posttest on all of the other cognitive questions, so those questions might not need to be changed and the related programming appears adequate.

⇒ The assessment cycle short answer questions provided rich qualitative information about participants’ understanding of the assessment cycle, so I think it is a good idea to keep them as short essay or short answer questions. Because the data are qualitative, this year they were analyzed by comparing each participant’s pretest response with his or her posttest response.
o For future GAIs, a rubric for quantitatively scoring these qualitative responses needs to be created, and at least two raters should rate the responses (a sketch of one way to check inter-rater agreement appears after this list).

⇒ Although all but one participant increased on SOB from pre to post, the gains were modest; programming aimed at increasing participants’ sense of belonging should be added, or the existing programming should be further supplemented.

⇒ Consider using a different instrument to measure sense of belonging for future GAIs
o Participants seem to come in reporting fairly high levels of sense of belonging
o It is possible that we are seeing a ceiling effect
o Future GAIs should consider using a different Sense of Belonging Scale, perhaps one that is more sensitive to higher levels of sense of belonging. That is, we could not demonstrate much of an increase in sense of belonging after GAI programming partly because the majority of participants came into the GAI already reporting high levels of sense of belonging.

⇒ Continue to have a faculty panel as part of GAI programming. The panel topic should be a current issue relevant to assessment at JMU and beyond. Students were able to integrate ideas from the panel into their responses to the assessment cycle short answer questions, and the interactive format helped students get to know the faculty members and better understand the assessment work that goes on in CARS.
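If the rubric recommended above is created and at least two raters score the short-answer responses, inter-rater agreement can be summarized with Cohen’s kappa. The sketch below is a minimal illustration using scikit-learn; the 1-4 rubric scale and the rating values are invented, not actual GAI data.

from sklearn.metrics import cohen_kappa_score

# Hypothetical rubric scores (e.g., a 1-4 scale) from two raters for eight
# short-answer responses; these values are invented for illustration only.
rater_1 = [3, 2, 4, 1, 3, 2, 4, 3]
rater_2 = [3, 2, 3, 1, 3, 2, 4, 2]

print("Unweighted kappa:", round(cohen_kappa_score(rater_1, rater_2), 2))

# Quadratic weights give partial credit for near-agreement, which suits
# ordinal rubric levels.
print("Weighted kappa:", round(cohen_kappa_score(rater_1, rater_2, weights="quadratic"), 2))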

Recommended Programming Changes Not Based on Assessment Results

⇒ Currently, we are missing a pretest time point. Future GAIs should consider administering the GAI assessment tool to participants on or before their first day of working in CARS, then again approximately one to seven days before GAI programming begins, and finally at the conclusion of GAI programming using the Google Chromebooks. Implementing this additional time point will help us pinpoint the learning that occurs during the GAI, because participants likely learn a lot between the time they begin working in CARS and the time of the GAI (see the sketch after this list for one way to compare the resulting gains).
o Pretest 1 (on or before the first day of working in CARS)
o Pretest 2 (approximately one to seven days before GAI programming begins)
o Posttest 1 (as soon as the GAI ends, using the Google Chromebooks)

⇒ Currently, we have no way of assessing Objective 6. Future GAIs should consider ways to assess that objective or rewrite/eliminate the objective if no feasible means of assessment can be created.

⇒ Currently, Qualtrics knowledge is not being assessed. We should consider whether this is knowledge we want to assess through the GAI; if so, we need to incorporate a Qualtrics component into the pretest and posttest.

⇒ Consider adding GAI programming that takes place about a month after the GAI, a “GAI booster shot,” so to speak. This might be a half-day event that GAI participants attend on a Friday afternoon when no classes are scheduled. It would be an opportunity to review the cognitive questions that participants struggled with and another opportunity to increase sense of belonging. Programming for this “booster shot” should be informed by the results from the GAI posttest measure; this could be similar to a review of “muddiest points” in a classroom setting. The “booster shot” would also be an ideal time to answer any questions that participants have after having worked in their GA assignments for at least a month.

⇒ Additional resources should be added to the GAI binders for future GAIs, including:
o A directory of current students that lists all of the past GA positions they have held and their current GA position. This can help new GAs quickly identify students who held their GA assignment in previous years, so that, if they have questions or concerns, they have a quick reference that can direct them to the right GA to ask for help. The directory might also include each GA’s picture and something about them, which could help foster mentor-mentee relationships between new GAs and current GAs. It would have to be maintained and updated on either a yearly or semester basis.

o Similarly, it might be helpful to include a directory of faculty members in the GAI binders, with a picture of each faculty member, their three most recent publications, and their current areas of research.

o As was done this year, faculty members should be asked each year to submit any additional or new resources for the GAI binders. These could be recent articles they came across or articles they typically provide to their GAs or students.
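If the additional pretest time point recommended above is added, learning that happens on the job before the GAI can be separated from learning that happens during the GAI by comparing the two gain segments. The sketch below illustrates the idea with invented scores and hypothetical column names (Total_T1, Total_T2, Total_T3) standing in for the Pretest 1, Pretest 2, and Posttest 1 totals.

import pandas as pd

# Invented scores for illustration only; real data would come from the merged
# assessment file once all three time points are collected.
df = pd.DataFrame({
    "Total_T1": [3, 4, 2, 5],   # Pretest 1 (first day working in CARS)
    "Total_T2": [4, 5, 3, 5],   # Pretest 2 (shortly before GAI programming)
    "Total_T3": [7, 7, 6, 8],   # Posttest 1 (end of GAI programming)
})

df["gain_on_the_job"] = df["Total_T2"] - df["Total_T1"]  # before GAI programming
df["gain_during_gai"] = df["Total_T3"] - df["Total_T2"]  # attributable to the GAI

print(df[["gain_on_the_job", "gain_during_gai"]].mean())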

Appendix A

GAI Pretest Survey

N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Assessment\GAI_pretest.pdf

GAI Posttest Survey
N:\AA\CARS\CARS-Common\PASS\Projects\GA Institute\GAI 2013\Assessment\GAI_posttest.pdf