Unit Assessments A. - Millersville University




Standard 1. Content and pedagogical knowledge

EPP Response is underlined Attachments are shaded

Task: Unit Assessments

A. Evidence in need of verification or corroboration 1. Please provide evidence that EPP created assessments meet CAEP sufficiency levels

Stipulation: The EPP does not provide significant analysis of data and evidence for all unit and program assessments related to InTASC categories (Component 1.1)

InTASC Category / Source of Evidence: Professional Behaviors Rubric components (please see the full rubric and details in the 2019 Assessment Handbook) / Sources of Evidence: MU Adapted Danielson

InTASC category: The Learner and Learning
• Professional Behaviors Rubric: Respects Diversity and Civil Rights of Others (Code of Professional Practice 235.4.4, 235.4.5, 235.8; Danielson 4f); Demonstrates Professional Communication
• MU Adapted Danielson: 1b Demonstrating knowledge of students, 1c Setting instructional outcomes, 2a Creating an environment of respect

InTASC category: Content
• MU Adapted Danielson: 1a Demonstrating knowledge of content and pedagogy, 1d Demonstrating knowledge of resources

InTASC category: Instructional Practice
• MU Adapted Danielson: 1e Designing coherent instruction, 1f Designing student assessment

InTASC category: Professional Responsibility
• Professional Behaviors Rubric: Demonstrates Professional Communication (Code of Professional Practice 235.4.7, 235.4.8; Danielson 2a, 2b, 3a, 3b, 4c, 4d); Demonstrates Honesty and Integrity (Code of Professional Practice 235.3 Purpose, 235.5 Conduct; Danielson 2a, 2b, 4d, 4f); Demonstrates Commitment to Becoming a Professional; Demonstrates Professional Relationships with Students
• MU Adapted Danielson: 4a Reflecting on teaching, 4b Maintaining accurate records

CAEP Sufficiency / Source of Evidence: Professional Behaviors Rubric / Sources of Evidence: MU Adapted Danielson

ADMINISTRATION AND PURPOSE (informs relevancy)

• Professional Behaviors: Transition points: first evaluation at entrance to APS; second evaluation at entrance to Clinical.
• MU Adapted Danielson: Transition point: exit from Clinical. The MU Adapted Danielson Rubric is adapted from the Danielson Framework for Teaching. At 7.5 weeks the expected score is a majority of basic (1); at 15 weeks the expected score is a majority of proficient (2).

CONTENT OF ASSESSMENT (informs relevancy)

• Professional Behaviors: Millersville Professional Behaviors Evaluation: CAEP 1.1, 2.3, 3.3; Danielson Domains 2, 3, 4; InTASC: Learner and Learning (1, 2, 3), Instructional Practice (7), Professional Responsibility (9, 10); PA Code of Professional Practice and Conduct for Educators.
• MU Adapted Danielson: Please see the EPP response in this task. The table in this task includes assessment meeting topics. Additionally, see the attachment 2019 Assessment Handbook, p. 12.

SCORING (informs reliability and actionability)

Professional Behaviors Explanation: Procedure for Self-Evaluation by Candidates; Procedure for Evaluation by Faculty. See attachments: PDP Plans, 2019 Assessment Handbook.

Rating levels:
Unsatisfactory: The teacher candidate does not meet performance expectations required for teaching.
Basic: The teacher candidate is performing at the basic level and is demonstrating partially proficient professional practices at this point in the clinical experience.
Proficient: The teacher candidate’s performance is substantially demonstrated at the professional level at this point in the clinical experience.

DATA RELIABILITY

These two EPP assessments are not surveys. The assessments are completed at the student teaching placement by the COOP and the supervisor. The evidence for component content, reliability, validity, and quality of the assessments is presented in the following table, where meetings, topics, and actions are provided, and in section 2, where reliability and validity are provided.

The MU Adapted Danielson is the same as the original proprietary rubric, the Charlotte Danielson Framework, which has established validity and reliability. Since we deleted only one criterion, distinguished, from the original, we conducted additional reliability and validity studies. Cooperating teachers provide triangulation of data from K-12 experts on the quality of our candidates. Furthermore, the Danielson evaluation is well understood by our cooperating teachers because it is the same instrument used for their own evaluations. In the spring of 2014, 96% of our cooperating teachers had received training from the IU or their district on the use of the Danielson instrument (the 4% not trained were new teachers or mid-year transfers into the system who had not completed training as of this report). We believe that this percentage is now nearing 100% with the continued full implementation of the teacher evaluation system in K-12. Therefore, these ratings represent external partner evaluations using a high-quality instrument on which they have been trained.

DATA VALIDITY

These two EPP assessments are not surveys. The assessments are completed at the student teaching placement by the COOP and the supervisor. The evidence for component content, reliability, validity, and quality of the assessments is presented in the following table, where meetings, topics, and actions are provided; in section 2, where reliability and validity are provided; and in the following EPP response.

SURVEY CONTENT

These two EPP assessments are not surveys. The assessments are completed at the student teaching placement by the COOP and the supervisor. The evidence for component content, reliability, validity, and quality of the assessments is presented in the following table, where meetings, topics, and actions are provided; in section 2, where reliability and validity are provided; and in the following EPP response.

SURVEY DATA QUALITY

These two EPP assessments are not surveys. The assessments are completed at the student teaching placement by the COOP and the supervisor. The evidence for component content, reliability, validity, and quality of the assessments is presented in the following table, where meetings, topics, and actions are provided; in section 2, where reliability and validity are provided; and in the following EPP response.

EPP Response: The two EPP-created assessments, the MU Adapted Danielson and Professional Behaviors, were developed with the oversight of the Teacher Education Council (TEC). The TEC, the committee within the Professional Education Association (PEA) that houses all unit governance structures, is the policy-making body. The Associate Dean and the Assessment Committee are charged with making assessment policy recommendations to the TEC, which includes the Dean of the School of Education. The first step for the two assessments was discussion in the Assessment Committee, where the reasoning for changes to any current assessments in use was heard. For example, review of the first draft of the Professional Behaviors Assessment was prompted by discussions in the biannual meetings of the Pennsylvania Association of Colleges for Teacher Education (PAC-TE) and the state Department of Education (PDE) regarding the assessment of student teacher performance. College and university deans worked within the individual units and agreed the Danielson Framework was a proven assessment and one used by PDE to assess current P-12 teachers. The Pennsylvania State System of Higher Education (PASSHE), 14 state colleges and universities that work together as a system, adopted the modified Danielson as the assessment for student teachers across the system. The adapted Danielson document came to Millersville and was sent to the Assessment Committee for review and further piloting for reliability and validity testing.

The Assessment Committee includes representation from other Millersville colleges that offer content-area coursework to candidates in grades 7-12 or post-baccalaureate certification, therefore representing all initial and advanced certification programs. PAC-TE members include the Associate Dean of the School of Education and the Director of Field Experiences, who are also members of the Assessment Committee.

The Assessment Committee began discussion of using the state-system-accepted MU Adapted Danielson and of revising the disposition policy to reflect the view that professional behaviors, rather than thoughts about professional behaviors, can be demonstrated by candidates and evaluated with an assessment rubric. The same process was followed for the EPP-created Professional Behaviors assessment, which went through a number of modifications and name changes before the present assessment was adopted.

After the two EPP-created assessments, the MU Adapted Danielson and Professional Behaviors, passed through reliability and validity activities (see section 2 of this task) and were piloted, reviewed, and modified by school partners, supervisors, and candidates, each was proposed for acceptance to the TEC by the Assessment Committee. The Assessment Committee is charged with all work related to CAEP or PDE standards and accreditation. Please see the addendum attachment, 2019 Assessment Handbook, p. 13, for the definition of the Assessment Committee. The two assessments were agreed upon by consensus.

The table below provides dates, topics, and outcomes of the 2018-2019 meetings regarding the EPP-created assessments, the MU Adapted Danielson and Professional Behaviors. The minutes’ topics reflect the committee discussions, reviews, and assigned working groups. The CAEP sufficiency level and specific element are aligned with the meeting date using small letters that match the CAEP sufficiency evaluation framework. For the actual meeting minutes, please see the attachment titled Assessment Committee Meetings/Data Days in this addendum. The MU Adapted Danielson meetings and chart come first, followed by Professional Behaviors. The lead site visit member can access Millersville University's online education platform, D2L, to confirm the Assessment Committee meetings, agendas, and minutes in Nov. 2020. Additional data are included in B. Excerpt from Self-Study Report to be clarified or confirmed, #2, addressing the process used for unit assessment evaluation and revision.

EPP Created MU Adapted Danielson

Specific topics from meetings are included in the first column under the meeting date. The EPP Created Professional Behaviors table follows and uses the same layout.

Meeting Date and MU Adapted Danielson Topics | Resulting Actions

March 26, 2014 Data Day Closing the Gap, analyzing student teacher trends, 20 participants representing 5 departments.

Departments reviewed student teaching data submitted in the assessment system and analyzed rater agreement. Decision that MU Danielson data collection provides evidence of student teacher competence in meeting standards.

April 28, 2015 How can we improve preparation of teacher candidates for the classroom environment? Examine data from the unit assessment MU Adapted Danielson; 16 participants representing 5 departments

The improvement area is communication with the school community. Decision to strengthen Foundations courses that teach communication with partners, e.g., communication and collaboration added to syllabus objectives.

2017 Feb. 17 Began MU Adapted Danielson Pilot. Removed one rating from the original Framework

Faculty supervisors shared MU Adapted Danielson with school partners and COOPs

2017 May Review and share data from the pilot with 15 P-12 partners, assessment committee members, and 22 faculty. Test-retest data presented at Data Day; 17 participants representing 4 departments

Adapted Danielson rubric forwarded to Teacher Education Council for approval and use throughout all programs.

2018-2019

March 26 Subcommittee reports technology and CAEP Standards 1.5 and 3.4

Subcommittee identified weaknesses and provided a matrix of where technology is addressed in the PEU. Share matrix with faculty to add technology assessment in course syllabi.

Jan. 16 Aligning Evidence to CAEP Standards Matrix created to guide alignment exercise. Sharing of evidence table produced awareness across programs. Modification example, PDS programs will use a case study method to study impact on P-12 students.

Feb. 21 Aligning Evidence to CAEP Standards

Inform practice. Alignment shared in department meetings. Alignment is completed within the PEU assessment system

Feb 26 Subcommittee reports on Alignment

Continued alignment with CAEP and InTASC standards in the PEU system (can be confirmed by lead during visit)

April 9 Establish reliability and validity of assessment instruments

Ran Lawshe method activities and analyzed results; the analysis resulted in removal of the distinguished category in the MU Adapted Danielson. Shared the test-retest method and analyzed results during Data Day. Removed assessments that are no longer valid from the PEU system.

April 23 Revisit integrating technology components related to standards 1.5 and 3.4

Information sharing with departments. Review and adjust syllabi to show technology use

May 15 Subcommittee reports: Technology, surveys, add ethics module to APS

Information sharing with departments. Review and adjust syllabi to show technology use. Increase use of remote teaching/hybrid courses.

2019-2020 Oct. 10 CAEP and InTASC review and alignment

Adjust the PEU assessment alignment table and update it to show alignment to standards. The Assessment Committee revisited all major assessments and worked with departments to develop and upload new program assessments and align assessment components with InTASC, the CF, and student learning objectives (SLOs). Assessments that have been replaced are labeled inactivated so that the data remain in the system.

Oct. 16 Collection and analysis of Cooperating Teachers (COOPs) and supervisor ratings on Danielson in the PEU, consistent, reliable test, re-test method

Established the validity and reliability of PEU data. Compared COOP and supervisor inter-rater scoring. Analyses showed a high correlation of scoring.

Nov. 11 Collection and analysis of COOP and Supervisor ratings on Danielson

Analyses showed a high correlation of scoring. Review of survey questions, based on COOP feedback, to make scoring options consistent and questions clear.


2. Please provide direct evidence of reliability and validity for all unit assessments

EPP Response: The MU Adapted Danielson is the same as the original proprietary rubric, the Charlotte Danielson Framework, which has established validity and reliability. Since we deleted only one criterion, distinguished, from the original, we conducted additional reliability and validity studies. Cooperating teachers provide triangulation of data from K-12 experts on the quality of our candidates. Furthermore, the Danielson evaluation is well understood by our cooperating teachers because it is the same instrument used for their own evaluations. In the spring of 2014, 96% of our cooperating teachers had received training from the IU or their district on the use of the Danielson instrument (the 4% not trained were new teachers or mid-year transfers into the system who had not completed training as of this report). We believe that this percentage is now nearing 100% with the continued full implementation of the teacher evaluation system in K-12. Therefore, these ratings represent external partner evaluations using a high-quality instrument on which they have been trained.

Meeting Date/Attendance and Professional Behaviors Assessment (PF) Topics

2018-2019 Jan. 23 Faculty and K-12 Partners (survey N=78) Dot Voting Exercise to arrive at basic list of professional behaviors for rubric

Results of the dot exercise led to the first draft of PF rubric

March 26 Presentation of research and professional behaviors timeline to P-12 partners (5) faculty (15) Data Day

The PF rubric was further refined. The pilot rubric had 5 components for evaluation; a 6th component, about behaviors toward students, was added.

Jan. 16 Advanced program brainstorming for A.1.1 elements

March 29 Establish inter-rater reliability on the professional behaviors rubric

Receive partner feedback, established the inter-rater reliability

Feb. 21 continue with Jan. 16 agenda

Feb 26 Refine Professional Behaviors Rubric

Input from faculty led to revision; the rubric will now be sent to school partners.

April 9 Disposition Evaluation Internal Validity Survey developed and reviewed for launch.

Lawshe type of survey was launched to school partners.

April 23 Disposition Evaluation Internal Validity Survey

Collection and review of data

May 15 Review Professional Behaviors Internal validity survey

No revisions to the rubric were needed after analysis of the validity survey.

2019-2020 Oct. 10 What can we learn from our analysis of student teaching data? Analyze trends. 20 participants, faculty from 5 departments

Data review resulted in

Oct. 16 Review and analyze data from COOP and Super Interrater using PEU frequency data from EPP Created assessment

Correlation between Super and COOP scoring is very high.

Nov. 11 Data Day: Focus on Classroom Environment. Deep dive into survey results, COOP and Supervisor assessment scores (22 participants, 10 P-12 COOPS)

Worked through data with partners to obtain informed feedback.

Dec. 12 InTASC category Professional Responsibility review standard 9 and align with Assessments in PEU.

Faculty complete checklists in assessment definitions

Feb. 26 Discussion of SLOs and alignment with the Professional Behaviors rubric and SPA program assessments; SPA program reports and how to use SLO reports

Identify capstone SLOs and align them with the Professional Behaviors rubric and SPA program assessments. Faculty writers will include SLO data in SPA revisions.


The Professional Education Unit (PEU) assessment system has been operational for 8 years. Data for the EPP-created assessments, the MU Adapted Danielson and the Professional Behaviors assessment, have undergone validity and reliability testing as noted in the following table and narrative. The data collected through the PEU system follow a test-retest validity method. Data for the MU Adapted Danielson and the Professional Behaviors assessment have been collected at mid-semester (7.5 weeks) and in the final week (15 weeks) of the semester since 2011. (Unfortunately, meeting minutes from the 2011 period were not housed electronically on the D2L platform and cannot be located. Data Day minutes, agendas, and other information that could be located from years before 2014 are housed in the attachment titled Assessment Committee Meetings/Data Days. More recent minutes and actions are accounted for in the table and in the Assessment Committee attachment in this addendum.) The test-retest reliability process is applied with no sign of intervening factors on these assessments. In a test-retest, the same unit assessment instrument is administered some time after the initial test and the results are compared. The results from the assessments were compared over four semesters during an assessment committee meeting in 2012: data from fall 2011 were compared with fall 2012, and data from spring 2012 were compared with spring 2013. The evaluation criteria are known to the candidates and the evaluators entering data, so there is no advantage from knowing the content of the assessment better the second time the evaluation is completed. In fact, we want to see an improvement in the scores to indicate growth in skills and professional behaviors.
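For illustration only (this is not the EPP's documented analysis script), the sketch below shows how a test-retest correlation between the 7.5-week and 15-week ratings could be computed; the variable names and sample values are hypothetical.

```python
# Minimal sketch, assuming paired candidate averages from the two administrations;
# a Pearson correlation is one common way to summarize test-retest stability.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists of scores."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical candidate averages on the MU Adapted Danielson (1-3 scale).
week_7_5 = [2.1, 1.9, 2.4, 2.0, 2.6, 2.2]
week_15 = [2.5, 2.3, 2.8, 2.4, 2.9, 2.6]

print(f"Test-retest correlation: {pearson_r(week_7_5, week_15):.2f}")
```

A high coefficient here would indicate that candidates' relative standing is stable across the two administrations even as absolute scores rise with expected growth.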

Please see the table in Task: Unit Assessments #1; meeting dates and topics provide evidence for CAEP sufficiency levels. Additional evidence for the MU Adapted Danielson includes a pilot focus group held on March 18, 2015. The goals for the meeting were to review mid-term data gathered from the PEU on the Danielson domains included in the MU Adapted Danielson rubric: what worked, what the problems were, and what modifications to make to the pilot instrument. The 9 attendees included 4 adjunct supervisors. A fair/unfair analysis of each criterion was held on Sept. 24, 2014.

In October 2015, 4 faculty members presented data on the pilot at the Pennsylvania Association of Colleges for Teacher Education (PAC-TE), which included analysis of the pilot data collected from supervisors and a test-retest (stability) analysis in which the correlation between the first set of scores and the second set was determined.

EPP-created assessment: Professional Behaviors. Please see the table for Task: Unit Assessments #1; meeting dates and topics provide evidence for CAEP sufficiency levels.

Additional evidence of reliability and validity testing for this assessment includes adaptation of the Niagara approach: 21 items in 3 categories; every faculty member evaluates every candidate in every course and populates a spreadsheet. This approach exceeds CAEP expectations for rigor of analysis, effort to establish validity, and overall rigor of progress. Candidates self-assess and can see their ratings; program faculty develop plans for candidates to actively use, and an intervention process is created for candidates with negative ratings. The Niagara statistical approach, factor analysis, allows for discriminant analysis.

Additional steps taken to assure validity: aligned the draft of the Professional Behaviors rubric with the InTASC category Professional Responsibility; collected student input through a survey with a Likert scale; and the January 26, 2018 Assessment Retreat (Data Day) minutes included a formal review process of the Professional Behaviors rubric, with 20 faculty members in attendance, at which a consensus was achieved to accept the Professional Behaviors Rubric. Please see pages 32-38 for a complete discussion of the professionalism policy and formal review process (Professionalism Policy) in the 2019 Assessment Handbook attachment in this addendum. On page 43 of the handbook, the Professional Behaviors evaluation with alignment to InTASC categories is provided. Also see the attachment titled Professional Behaviors Results, session date 3.29.19, which includes attendance, agenda, questions raised, and details of the meeting.


3. Confirm the multiple measures used for assessing initial (BSE) license candidates’ content knowledge and how deficits in content knowledge are remediated

EPP Response: Content knowledge is assessed by candidate competence in courses (I) as reflected by their grades, (II) Professional Behaviors assessed at the sophomore (Foundations) level and the junior level (Professional Block), (III) data collected from the MU Adapted Danielson at the mid and final points of the student teaching semester, (IV) Cooperating Teacher survey data, and (V) the state-required PDE 430 data collected at the mid and final points of the student teaching semester. The attachment titled Multiple Measures used for Assessing Initial License Candidates' Content Knowledge indicates the measures used to assess initial (BSE) licensure candidates' content knowledge and planning, in addition to the attachments titled 3 cycles of EPP created Assessment MU Danielson, which are included outside the folder as independent attachments. The measures include the Foundations courses, which all candidates must take and pass with a grade of B or better. The grades in the table indicate that a high percentage of our candidates achieve the benchmark. In addition, the state department requires 2 courses in special education, SPED 101 and SPED 110. Student teaching is 6 credit hours: EDEL 461. EDEL 462 is a student teaching seminar, which includes a SPED placement for dual majors and content for special education.

(I). Initial Program Course to assess content grades summary

Course Number and Name | Number of Completers | % of Completers Meeting Minimum Expectation (Acceptable Grade = > C) | Number of Completers | % of Completers Meeting Minimum Expectation (Acceptable Grade = > C)
SPED 101: Orientation to Special Education | 140 | 97% | 209 | 90%
SPED 311: Assessment for Designing and Implementing Instruction | 155 | 100% | 147 | 100%
EDFN 211: Foundations of Modern Education | 251 | 99% | 249 | 93%
EDFN 241: Psychological Foundations of Teaching | 268 | 96% | 278 | 99%
EDFN 330: Instructional Technology, Design and Assessment | 251 | 94% | 235 | 96%
EDEL 462: Student Teaching Seminar (S = Satisfactory, U = Unsatisfactory) | 105 | 91% | 129 | 91%

PB cert candidates often enter our program and successfully transfer in the required special education coursework and English Language Learners course. A review of applicants' transcripts by the program chair and the Associate Dean (certification officer) is completed and signed. The courses listed for content assessment for all PB cert seekers in this chart are the foundations, technology education, and student teaching courses required for our PB program students. All candidates in the PB programs must earn a B or better in all courses.


Course Number and Name | Number of Completers | % of Completers Meeting Minimum Expectation (Acceptable Grade = > C) | Number of Completers | % of Completers Meeting Minimum Expectation (Acceptable Grade = > C)
EDFN 511: Comparative Education | 140 | 97% | 209 | 90%
EDFN 545: Advanced Educational Psychology | 155 | 100% | 147 | 100%
EDFN 530: Instructional Technology, Design and Assessment | 128 | 94% | 120 | 96%
EDFN 590: Social Foundations of Education | 125 | 96% | 278 | 99%
EDFN 560: Post-Baccalaureate Clinical Practicum in Student Teaching | 159 | 100% | 230 | 100%
EDFN 561: Student Teaching | 105 | 91% | 129 | 91%

Remediation for struggling candidates: University tutoring services are available for writing or reading weaknesses; faculty provide extra assignments or time to redo an assignment; faculty tutor individuals or small groups; candidates work with graduate students who have taken the course or passed the content; and candidates can retake the course.

(II) Professional Behaviors: Candidates are assessed for content along with professional behaviors. The candidate is expected to deliver instruction in a professional manner and differentiate all instruction to meet the needs of individuals in the classroom. Professional flexibility and communication are required as modifications to instructional plans are carried out with professional behaviors, including collaborative and reflective practices. Additionally, the candidate is expected to interact professionally with the educational community with a firm grasp of the content he/she is teaching. All these professional behaviors are assessed in the Professional Behaviors Rubric. Please see the 2019 Assessment Handbook attachment, page 43. Data for the individual components for 3 iterations, disaggregated by program, are contained in the attachment 3 cycles of data for EPP created assessment Professional Behaviors aligned with InTASC.

Remediation: Because the Professional Behaviors assessment is completed at the beginning of candidates' programs and again during the junior year, struggling candidates are given extra support from COOPs and supervisors in the classroom, in lesson/unit planning, and in professional behaviors. The Professional Development Plan (PDP), see the attachment titled Sample PDP Plans, can be developed to provide the candidate formative feedback written as goals and objectives. Professional Development Plans specify current deficiencies, goals, potential consequences for failure to meet goals, and identification of the point in time when goal achievement will be assessed. Students have the right of appeal in the case of a finding of failure to meet the goals of the plan. The PDP is posted on the D2L learning platform as shared evidence for faculty review of data on the formative assessment of candidates. Faculty can access the PDP plans for informative measures and to guide their next steps should additional remediation of a behavior become apparent. The Associate Dean reviews and posts the PDP plans to the learning system D2L. If the candidate does not meet the posted criteria for successful completion of the MU Adapted Danielson at the 7.5-week point even with extra support, or behaviors cannot be improved with additional support by the goal date of the PDP, the candidate will be advised to switch majors and given guidance on how to transfer credits to another, non-education program.

(III) Content knowledge is assessed by Cooperating Teachers (COOPs) in the MU Adapted Danielson at the 7.5-week (mid-semester) point and at the end of the student teaching placement at 15 weeks. The following components measure candidates' knowledge and application of the content: 1a Demonstrating knowledge of content; 3a Communicating with students (the teacher candidate's explanation of content is well scaffolded, clear, and accurate, and connects with students' knowledge and experience; during the explanation of content, the teacher candidate invites student intellectual engagement); 3c Engaging students in learning (students are engaged with important and challenging content and are supported in that engagement by teacher candidate scaffolding); and 4e Growing and developing professionally (seeks out opportunities for professional development to enhance content knowledge and pedagogical skill). Please see the attachment 3 Cycles of Data MU Adapted Danielson Supervisor (Super) and COOP for these components.

The total average scores are provided on the Data Summary tab of the Excel workbook in the attachment 3 Cycles of Data MU Adapted Danielson Supervisor (Super) and COOP. All these data for the MU Adapted Danielson show our candidates scoring at the expected score of 2 or above at the 7.5-week evaluation and scoring above a total average of 2 (very close to 3) for the 15-week evaluation. The scoring guide for the MU Adapted Danielson is: 3 = Proficient, 2 = Basic, 1 = Unsatisfactory. The expected "good enough" score at 7.5 weeks is above 2 with no more than 2 scores of 1 (Unsatisfactory); the expected "good enough" score at 15 weeks is above 2 with no scores of 1 (Unsatisfactory).
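For illustration only, the sketch below applies the "good enough" decision rule described above to one candidate's component ratings; it is not the EPP's actual scoring tool, and the sample ratings are hypothetical.

```python
# Minimal sketch, assuming component ratings on the MU Adapted Danielson scale
# (3 = Proficient, 2 = Basic, 1 = Unsatisfactory): the average must exceed 2,
# with at most 2 ratings of 1 at 7.5 weeks and none at 15 weeks.
from statistics import mean

def meets_expectation(ratings, max_unsatisfactory):
    """Return True if the average exceeds 2 and the count of 1s is within the limit."""
    return mean(ratings) > 2 and ratings.count(1) <= max_unsatisfactory

# Hypothetical component ratings for one candidate.
week_7_5_ratings = [2, 3, 2, 1, 3, 2, 3, 2]
week_15_ratings = [3, 3, 2, 2, 3, 3, 3, 2]

print("7.5-week check:", meets_expectation(week_7_5_ratings, max_unsatisfactory=2))
print("15-week check:", meets_expectation(week_15_ratings, max_unsatisfactory=0))
```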

Remediation: Because the MU Danielson assessment is completed at the mid-point of the semester, struggling candidates are given extra support from COOPs and supervisors in the classroom, in lesson/unit planning, and in professional behaviors. A Professional Development Plan can be developed to provide the candidate formative feedback written as goals and objectives. Professional Development Plans should specify current deficiencies, goals, potential consequences for failure to meet goals, identification of the point in time when goal achievement will be assessed, and rights of appeal in the case of failure to meet the goals of the plan. The PDP is posted on the D2L learning platform as shared evidence for faculty review of data on the formative assessment of candidates. Faculty can access the PDP plans for informative measures and to guide their next steps should additional remediation of a behavior become apparent. The Associate Dean reviews and posts the PDP plans. If the candidate does not meet the posted criteria for successful completion of the MU Adapted Danielson at the 7.5-week point and behaviors cannot be improved by the goal date of the PDP, the candidate will be advised to switch majors and given guidance on how to transfer credits to another, non-education program.

NOTE: Due to the COVID-19 shutdown of the University, school districts, and child care, only one MU Danielson data set, the 7.5-week evaluation, is complete. Because of the shutdown, the MU Danielson data for the 15-week evaluation are incomplete. Some non-education students were able to remain in clinical placements because their placements, social work for example, were deemed essential.

(IV) Content knowledge is assessed through the COOP survey; see the attachment titled Multiple Measures Used for Assessing Initial License Candidates' Content Knowledge and click on COOP survey. (The complete Qualtrics survey can be reviewed at the Nov. 2020 visit.) Each semester, surveys are sent to COOPs. The table below provides data for content assessment through survey questions. Total responses are included with the percentage of answers.


Survey question or statement | Total Respondents to Question | Agree | Neither Agree nor Disagree | Somewhat Disagree | Strongly Disagree
Demonstrate knowledge of content-related pedagogy | 234 | 67% (N=113) | 20% (N=33) | 10% (N=16) | 4% (N=7)
Lesson planning provides concrete ways the candidate provides instruction that meets the needs of all learners. | 235 | 74% (N=66) | 21% (N=16) | 6% (N=4) | 5% (N=4)

The Cooperating Teacher (COOPs) Survey and the Program Provider Survey Aggregate Survey Responses provide data on content knowledge assessment of Millersville candidates. The COOP survey data for this task focuses on content application through lesson/unit planning.

Data for all three questions on the COOP survey show that fewer than 4% felt MU candidates did not possess planning skills and application of content in the classroom. The summary data in the above table represent multiple years of survey data.

(V) PDE 430: The state-required PDE 430 collects data in 4 categories, but only Categories I and III directly measure content knowledge; see the attachment titled Multiple Measures used for Assessing Initial License Candidates' Content Knowledge and click on PDE 430. Category I: Student teacher/candidate demonstrates thorough knowledge of content and pedagogical skills in planning and preparation. The student teacher makes plans and sets goals based on the content to be taught/learned, knowledge of assigned students, and the instructional context. Alignment: State Standards 354.33 (1)(i)(A), (B), (C), (G), (H). Category III, Instructional Delivery: Student teacher/candidate, through knowledge of content, pedagogy, and skill in delivering instruction, engages students in learning by using a variety of instructional strategies. Alignment: State Standard 354.33 (1)(i)(D), (F), (G).

An example from the BSE Early Childhood program is in the table below. The attachment in this addendum titled Multiple Measures used for Assessing Initial License Candidates' Content Knowledge (click on PDE 430) displays data disaggregated by program. Data show our candidates make improvements in overall scores from the 7.5-week evaluation to the 15-week evaluation. This improvement demonstrates the support of the faculty, mentor teachers, and supervisors in providing adequate instructional teaching and practical application of theory from the university classroom. Additionally, the improved scores demonstrate mentor teachers', supervisors', and faculty's close attention to data analysis.

BSE Early Childhood 7.5-week assessment, Scoring Criteria for PDE 430: 3, 2, 1, 0

Question | Minimum | Maximum | Mean | Std Deviation | Variance | Count | % Strongly Agreed (# answered) | % Neither Agree nor Disagree (# answered) | % Somewhat Agree (# answered) | % Strongly Disagree (# answered) | Total %
Lesson planning provides concrete ways the candidate provides instruction that meets the needs of all learners. | 1.00 | 9.00 | 3.45 | 2.64 | 6.96 | 235 | 40.79 (31) | 21.05 (16) | 32.89 (25) | 5.26 (4) | 100
Plans assessments aligned with performance outcomes of learner. | 1.00 | 5.00 | 1.9 | 0.99 | 0.97 | 235 | 64.94 (100) | 22.73 (35) | 9.74 (15) | 2.60 (4) | 100
Write detailed lesson plans. | 1 | 5 | 1.65 | 0.92 | 0.84 | 236 | 54.24 (128) | 36.02 (85) | 2.54 (6) | 2.12 (5) | 100


Scoring key: Exemplary (minimum of 12 points) = 3; Superior (minimum of 8 points) = 2; Satisfactory (minimum of 4 points) = 1; Unsatisfactory (0 points) = 0.

Semester | Frequency Total | Exemplary | Superior | Satisfactory | Unsatisfactory | Average Score | Exemplary Score Increase
Spring 2020 | 45 | 22% | 73% | 4% | 0% | 2.18 | 22%
Fall 2019 | 29 | 24% | 72% | 3% | 0% | 2.21 |
Spring 2019 | 42 | 29% | 67% | 5% | 0% | 2.24 |

Note: average score >2 at 15 weeks.

The above table, from the BSE Early Childhood PDE 430 scoring for content knowledge, shows an average score of 2 or above, indicating that candidates grasp the evaluation criteria and demonstrate competence. 13 of 16 of our programs have a score of 2 or above (Superior to Exemplary) in the academic year 2020; 8 of 16 programs achieved a 2 or above in all three academic years. In no year did a program score Unsatisfactory. All programs showed improvement in evaluations, moving up from Superior toward Exemplary. The non-education students' scores were in line with the education students' and did not show measured growth above the evaluation scores achieved by education students.

The PDE survey completed as candidates apply for licensure provides a self-assessment of content knowledge and planning abilities using that knowledge. For question 3, "I feel adequately prepared to design and implement instruction and assessments that are aligned with state standards," 58 candidates strongly agreed, 14 agreed, and 7 had no opinion; the total N = 79 responders (the complete data table is in the addendum attachment titled PDE Survey 2016-2019). These data show candidates agree with COOPs and supervisors that they are well prepared to deliver content as professional educators.

4. Confirm the multiple measures used for assessing initial (BSE) license candidates' professional dispositions and behaviors and how deficits in professional behavior are remediated or whether candidates are counseled out of the EPP.

EPP Response: Multiple measures include (disaggregated by program, BSE (initial) and PB cert): the MU Adapted Danielson (2x), the Professional Behaviors rubric (2x) (included as independent attachments), and the attachment titled Multiple Measures used for Assessing Initial License Candidates' Content Knowledge, which includes course grades, survey data from COOPs, and the state-required PDE 430. All candidates in education licensure programs are assessed by supervising faculty using the disposition and professional behavior rubrics as a requirement for formal admission to advanced professional studies: twice before student teaching (APS for teacher candidates, degree candidacy for advanced programs) and one other time during student teaching (the culminating clinical experience). Evaluation of dispositions is used primarily for candidate self-reflection and growth, but also as a way to monitor and support candidates' developing skills as outlined in the transition points for all programs; see the 2019 Assessment Handbook, p. 16. The Professional Behavior Rubric is used as part of a formal review process (defined by the Professionalism Policy, p. 32) when substantial concerns arise. The Professional Behavior rubric can be used in making decisions about candidate progression through the program.

Scoring key: Exemplary (minimum of 12 points) = 3; Superior (minimum of 8 points) = 2; Satisfactory (minimum of 4 points) = 1; Unsatisfactory (0 points) = 0.

Semester | Frequency Total | Exemplary | Superior | Satisfactory | Unsatisfactory | Average Score
Spring 2020 | 45 | 0% | 62% | 38% | 0% | 1.62
Fall 2019 | 30 | 7% | 67% | 27% | 0% | 1.8
Spring 2019 | 41 | 5% | 61% | 34% | 0% | 1.71

The 2019 Assessment Handbook attachment on page 16 lists the transition points for the INT program for Professional Behavior assessment:

Continuing Assessment of Professional Behaviors

The Professional Education Unit has adopted an ongoing system for assessing candidates' professional behaviors at admission (APS for teacher candidates, degree candidacy for advanced programs) and at least one other time prior to the culminating field-based experience. Evaluation of dispositions is used primarily for candidate self-reflection and growth. The Professional Behavior Rubric will also be used as part of a formal review process (defined by the Professionalism Policy) when substantial concerns arise. The Professional Behavior rubric can be used in making decisions about candidate progression through the program (page 40 of the 2019 Assessment Handbook).

On page 17, the continuing assessment for the ADV programs is included within the transition point table:

Continuing Assessment of Professional Behaviors

The Professional Education Unit has adopted an ongoing system for assessing candidate dispositions. The system includes unit criteria for dispositions aligned to the conceptual framework, a process for identification of candidates for which there are concerns, a process for involving all appropriate faculty and school partners in the evaluation, an appeals process, and a process for remediation. All candidates must maintain a “Proficient” rating at all times. Candidates who earn a Partially Proficient rating receive remediation and may be removed from the program. The Professional Behaviors rubric and an explanation of the process can be found in the 2019 Assessment Handbook beginning on page 33.

Remediation steps for candidates in the INT and ADV programs who need extra support are listed below:

Formal Review Related to Concerns

1. Formal review will take place in a meeting including relevant faculty selected by the program leader. In cases involving field experience, the person responsible for field placements for that program (typically the Field Experience Coordinator) must be invited to participate.

2. Formal review for field-based professional performance or for professional behaviors or dispositions may only be completed using evaluation instruments approved by the appropriate curriculum policy group for that program. These guidelines will be revised to include new instruments as they are approved. All approved instruments must be consistent with and reflect recognized state or national professional standards appropriate for the program. Clinical partners and faculty should be involved in the development and validation of these instruments.

3. Candidates must be notified in advance of the purpose of the meeting and the fact that it could result in an unsatisfactory review. Candidates should be made aware that they will have a right to appeal the results of the review and that they may bring advocates and relevant evidence to the meeting. There should be no decisions or plans drafted before the meeting.

4. If the meeting results in an unsatisfactory review, a Professional Development Plan (PDP) will be created. Professional Development Plans should specify current deficiencies, goals, potential consequences for failure to meet goals, identification of the point in time when goal achievement will be assessed, and rights of appeal in case of failure to meet the goals of the plan. The PDP is posted on the D2L learning platform as shared evidence of the formative assessment of candidates. Faculty can access the PDP plans for informative measures and to guide their next steps should additional remediation of a behavior become apparent. The Associate Dean reviews and posts the PDP plans. Signatures of all present at the PDP review meeting are on the PDP that is posted in D2L. For purposes of inter-rater validity and reliability, a review of responses is conducted by the PDP review team. This inter-rater process was newly adopted in fall 2019, so the data are not complete at the time of this review.

5. If the review finds that the candidate's behavior is such that participation in any field placement would pose a risk to the safety of individuals in that field placement, and the candidate would like to continue in his/her program, the findings of the formal review, along with any written statement by the candidate, will be shared with any current or prospective field placements. As with clearance infractions, field partners hold their own standards for accepting candidates, and Millersville has no control over their decisions. If a field partner denies a candidate a placement, Millersville will search for another placement opportunity twice more, for a total of three (3) attempts. If three partners refuse or deny working with the candidate in question, Millersville University then resigns all responsibility for making a field placement for the candidate during his/her enrolled semester. Candidates will not be able to complete their degree program and/or certification program if they cannot complete required field experiences and will be advised and counseled for a change of major.

6. Faculty assess whether the goals of the Professional Development Plan have been met, as well as specific consequences for progression through the program, and will inform candidates of their right to appeal. The PDP process is outlined in the addendum attachment titled 2019 Assessment Handbook, pgs. 33-39. A blank PDP form is included in the attachments. Examples of completed PDPs will be shared with the lead site visitor during the Nov. 2020 visit and are also included with this addendum in the attachment titled Formal Review Report.

B. Excerpt from self-study report to be clarified or confirmed 1. “Data evidence is collected through multiple unit assessments such as the Methods: Unit Planning Assessment, Professional Behaviors Assessments at entry and exit from clinical experiences, and Evaluation of Field Experiences.” (p. 113). Clarify whether the evaluation of field experiences mentioned here is the same as the PDE 430, MU Danielson, or includes both assessments.

EPP Response: The data evidence is collected through the EPP-created Professional Behaviors unit assessment and the MU Adapted Danielson, which is also an EPP-created unit assessment. The MU Adapted Danielson is completed twice during student teaching, at the mid-term (7.5-week) point and again at the conclusion (15th week). The state-required PDE 430 is completed twice during the student teaching semester at the same intervals.

2. “Assessment of the validity and utility of primary assessment data takes place in all programs, both in the assessment development process and in the interpretation of data. Content validity is assured through the development of rubrics that are well-aligned to SPA standards and Conceptual Framework outcomes” (Conceptual Framework, 2019, pp. 12-28). Clarify which reliability and validity studies have been conducted on each EPP-created assessment and/or rubric.

EPP Response: The attachments Professional Behaviors Results and Assessment Committee Meetings provide details of the meeting times, dates, and attendance. In the same attachment, the Lawshe method was used to rate internal validity. Please see the Task: Unit Assessments #1 table aligning the CAEP sufficiency framework with the two EPP-created assessments. The table in that task provides meeting times, activities, and participants under CAEP sufficiency level 4, Data Reliability, and sufficiency level 5, Data Validity.
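For illustration only, and not drawn from the EPP's own analysis files, the sketch below computes Lawshe's content validity ratio (CVR) for rubric items from panelist "essential" ratings, CVR = (n_essential - N/2) / (N/2); the item names and vote counts are hypothetical.

```python
# Minimal sketch, assuming each panelist rated each rubric item as
# "essential" or not; items with low or negative CVR would be candidates
# for removal (the committee's removal of the distinguished criterion is
# the kind of decision such an analysis can support).
def content_validity_ratio(n_essential: int, n_panelists: int) -> float:
    """CVR ranges from -1 (no panelist rates the item essential) to +1 (all do)."""
    half = n_panelists / 2
    return (n_essential - half) / half

# Hypothetical panel of 20 raters and their "essential" votes per item.
panel_size = 20
votes = {
    "Demonstrates Professional Communication": 18,
    "Demonstrates Honesty and Integrity": 19,
    "Respects Diversity and Civil Rights of Others": 17,
    "Distinguished performance criterion": 8,
}

for item, n_essential in votes.items():
    print(f"{item}: CVR = {content_validity_ratio(n_essential, panel_size):+.2f}")
```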

The MU Adapted Danielson is the same as the original proprietary rubric, which has established validity and reliability, with a single deletion of one criterion, distinguished; we therefore conducted additional reliability and validity studies. Please see the table for Task: Unit Assessments #1; meeting dates and topics provide evidence for CAEP sufficiency levels. Additional evidence for the MU Adapted Danielson includes a pilot focus group held on March 18, 2015. The goals for the meeting were to review mid-term data gathered from the PEU on the Danielson domains included in the MU Adapted Danielson rubric: what worked, what the problems were, and what modifications to make to the pilot instrument. The 9 attendees included 4 adjunct supervisors. A fair/unfair analysis of each criterion was held on Sept. 24, 2014. Four faculty members presented data on the pilot at the Pennsylvania Association of Colleges for Teacher Education (PAC-TE), which included analysis of the pilot data collected from supervisors and a test-retest (stability) analysis in which the correlation between the first set of scores and the second set was determined.

EPP-created assessment: Professional Behaviors. Please see the table for Task: Unit Assessments #1; meeting dates and topics provide evidence for CAEP sufficiency levels. Additional evidence of reliability and validity testing for this assessment includes adaptation of the Niagara approach: 21 items in 3 categories; every faculty member evaluated every candidate in every course and populated a spreadsheet. This approach exceeds CAEP expectations for rigor of analysis, effort to establish validity, and overall rigor of progress. Candidates self-assess and can see their ratings; program faculty develop plans for candidates to actively use, and an intervention process is created for candidates with negative ratings. The Niagara statistical approach, factor analysis, allows for discriminant analysis.

Additional steps taken to assure validity: aligned the draft of the Professional Behaviors rubric with the InTASC category Professional Responsibility; collected student input through a survey with a Likert scale; and the January 26, 2018 Assessment Retreat (Data Day) minutes included a formal review process of the Professional Behaviors rubric, with 20 faculty members in attendance, at which a consensus was achieved to accept the Professional Behaviors Rubric. Please see pages 32-38 for a complete discussion of the professionalism policy and formal review process (Professionalism Policy) in the 2019 Assessment Handbook attachment in this addendum. On page 43, the Professional Behaviors evaluation with alignment to InTASC categories is provided.

B. Interviews to be scheduled for visit

Task: Data collection and analysis

A. Evidence in need of verification or corroboration 1. Please provide the three most recent cycles of data for each unit assessment, disaggregated by program and InTASC categories, and analyzed for trends, comparisons, or other differences

EPP Response: 3 cycles of data disaggregated by program and aligned to the InTASC categories are included in attachments:
• 3 cycles of data EPP created assessment Professional Behaviors aligned with InTASC
• 3 cycles of data for EPP assessment (COOP) MU Adapted Danielson aligned with InTASC
• 3 cycles of data for EPP created MU Adapted Danielson (Super) aligned with InTASC Categories

Analysis: The data gathered for assessment of professional behaviors indicate candidates strengthen their ability to identify professional behaviors and demonstrate professionalism in all circumstances. The expected average score for education students for the first evaluation, in foundations classes when candidates are sophomores, is >2 (developing professional); please see the data summary sheet in the attachment 3 Cycles of Data EPP created assessment. The total average score for education and non-education candidates is 2.5. The attachment 3 cycles of data for EPP created MU Adapted Danielson COOP shows the total average score from the data summary page for education students is 2.7 at 7.5 weeks and 2.8 at 15 weeks. For non-education students, the total average for both evaluations is 3. These scores are at the expected "good enough" score indicated for the assessment. The supervisor data for the MU Adapted Danielson show total average scores on the data summary page for education students of 2.6 at 7.5 weeks and 2.8 at 15 weeks; these totals reach the expected score of >2. The non-education students' total score for the 7.5-week evaluation is 2 and for 15 weeks is 3. The 7.5-week score for the non-education students did not reach the expected score, but missed it by only a tenth of a point. The score for 15 weeks exceeded the developing professional expected score and reached the Proficient level. The COVID-19 shutdown of schools impacted the spring semester 2020 data set, as education students were not able to complete in-classroom student teaching and therefore no data for the second half of the placement were entered. The non-education students from social work and other clinical settings not shut down were able to finish clinical placements and have data entered for them.

2. Where applicable, please provide data comparing non-candidates' performance in the same courses/majors to that of EPP initial license candidates, as well as specialty license area performance compared to state/national data.

EPP Response: The data attachments in Task: Data collection and analysis #1 in this addendum include Non-Educ student data. The EPP-created 3 cycles of data attachments include frequency score data and total score data on the summary data page for non-education students. The Non-Educ students can be enrolled in special education courses SPED 100 or SPED 110 (depending on the semester when courses are offered), the foundations course EDFN 241, or EDSE 471, and clinical practicum social work classes. The PEU data collection system is set up to collect non-education student data as a program but does not identify in which course they are enrolled.

Each assessment data workbook (attachments) includes a Data Summary disaggregated by program, initial (BSE) and Post Bacc cert, as one of the first pages. From the Data Summary in the two unit-created EPP assessments, it can be noted that Non-Educ students have an average score very close to the education students'. The expected average score for the Professional Behaviors unit assessment completed in the foundations courses, early in the sophomore year, is above 2; this "good enough" score equates to an evaluation of developing professional. The education majors scored an average of 2.3. The non-education students scored an average of 2.5, as did the education students. In the second iteration of the assessment, given in the junior year or professional block, the non-education students averaged 2.8, whereas the education students averaged 2.9. Please see the data attachments titled 3 cycles of data for MU Danielson and Professional Behaviors in this addendum. The data summary page contains data averages by program and for Non-Educ students.

The expected score for the MU Adapted Danielson is an average of 2 or above. The "good enough" scores for this assessment are as follows: at the 7.5-week assessment, students need to score an average above 2 (Basic) and should not have more than 2 scores of 1 (Unsatisfactory); for the second assessment, students are expected to score 2 (Basic) or above. A score of 3 (Proficient) is the highest score on the MU Adapted Danielson. Supervisors and cooperating teachers (COOPs) score the assessment in the same way.

The Non-Educ students' average scores on the unit assessment MU Adapted Danielson (COOP data) were close to those of the education students. Interestingly, the non-educ students averaged a score of 3 (Proficient) for both the first half and the second half of the semester. The non-education students' scores were a few tenths of a point ahead of the education students'. Education students averaged a 2.7 for the first half of the semester and a 2.9 for the second half, placing their scores in the proficient range. The difference in scoring is partly attributable to incomplete data caused by the COVID shutdown of public schools: some non-education students were able to complete their clinical experiences because their placement sites were deemed essential and remained open during the COVID shutdown, which accounts for the difference.

The MU Adapted Danielson supervisors scored the groups of students with results similar to the COOPs'. For the 7.5-week midterm assessment, education students had an average score of 2.6; at 15 weeks, the average score for education students was 2.8. The non-education students averaged a 2 at the 7.5-week assessment and a 3 at 15 weeks. Again, the average scores are very close. The shutdown of schools due to the COVID-19 virus impacted scores for both groups, but the education students were not able to go to clinical placements in public schools. The 2020 data set is not complete because data entry for candidates who were not in a clinical setting was waived. Many social work majors who take the non-educ courses were able to attend their placements because social work was designated an essential business during the shutdown; therefore, the non-education students were able to be scored for the last part of the spring semester 2020.

Pass rate data, below, compare Millersville candidates' pass rates to the state pass rates. Millersville's pass rate meets or exceeds the state pass rate in nearly every program and year, supporting the fact that Millersville candidates are well-prepared.

**All numbers are percentages. PAPA test results are no longer reported for MU because so few of our candidates choose to take the PAPA test; our candidates take the CORE basic skills tests 98% of the time. Not all programs are reported due to low enrollment: Title II data collection does not report program test data if there are fewer than 5 enrolled students.

Test | 2018-2019 (MU / State) | 2017-18 (MU / State) | 2016-2017 (MU / State)
Overall | 83 / 74 | 86 / 79 |
CORE Math | 90 / 89 | 93 / 89 | 96 / 92
CORE Rdg | 96 / 95 | 97 / 96 | 100 / 95
CORE Wtg | 87 / 85 | 86 / 83 | 88 / 83
PAPA Rdg | ** / ** | 100 / 95 | 99 / 95
PAPA Math | ** / ** | 97 / 94 | 97 / 95
PAPA Wtg | ** / ** | 98 / 97 | 99 / 97
English | 95 / 94 | 100 / 95 | 100 / 94
Math | 86 / 79 | 90 / 76 | 30 / 81
Music | 100 / 93 | 100 / 96 | 92 / 97
PreK-4 Mod 1 | 100 / 90 | 97 / 91 | 99 / 91
PreK-4 Mod 2 | 98 / 87 | 93 / 86 | 94 / 86
PreK-4 Mod 3 | 99 / 77 | 83 / 78 | 85 / 78
Social Studies | 95 / 90 | 100 / 88 | 96 / 93
SPED Mod 1 | 91 / 79 | 86 / 77 | 96 / 77
SPED Mod 2 | 91 / 78 | 86 / 77 | 95 / 78
Tech Ed | 100 / 100 | 100 / 100 | 100 / 100

3. Confirm for each initial (BSE) licensure program which data are collected to demonstrate candidates' use of research and evidence for impact on P-12 student learning. EPP Response: Please see the data attachment titled Research and Evidence for Impact on P-12, disaggregated by program. The data table provides data collected through the PEU assessment system for the course EDSE 471. Early Childhood (EMEE) data is listed under the special education seminar course; EMEE dual major candidates are required to take EDSE 471 during student teaching. Further, the early childhood program uses a research project to assess candidates' ability to use research. That data is included in the attached data table titled Research and Evidence for Impact on P-12. The Foundations course EDFN 241 is required for all candidates in initial programs. The same course with a different course number is required for PB cert programs; PB cert candidates are required to earn a B or better to pass the Foundations course EDFN 545. The Foundations course explains the central tenets underlying different theories of learning and the implications for P-12 classroom instruction. The grades serve as evidence that Millersville's candidates demonstrate the use of research and its impact on instruction for the P-12 student.

Grade | Description | Grade Point Value
A | Excellent | 4
A- | | 3.7
B+ | | 3.3
B | Good | 3.0
B- | | 2.7
C+ | | 2.3
C | Acceptable | 2.0
C- | | 1.7
D+ | | 1.3
D | Poor | 1.0
D- | | 0.7
F | Fail | 0.0

The annual report submitted by the Dean of the College of Education and Human Services to the President of the University includes these research practices:

Student Research - Two years ago there was a dramatic increase in the percentage of students from the College of Education who participated in Made in Millersville (n=21). The student research and performing and visual arts conference, Made in Millersville: A Celebration of Student Scholarship and Creativity, highlights the work of MU students. The event embraces traditional field and laboratory work as well as projects in the visual and performing arts such as creative writing, music, drama, debate, public speaking, and other activities. Last year, according to data from Dr. Munoz, there were 31 EDHS students who participated in the event, a 48% increase over the previous year. The Made in Millersville event attracted 66 students from EDHS for 2019-20, a 112% increase.

High Impact Practices - All undergraduate programs in EDHS have at least 2 high impact practices built into their curricula, including senior capstone experiences across all undergraduate programs. 100% of undergraduate students avail themselves of at least 2 HIP (bolded entries indicate required):

o SOWK – FYE, service learning, internships, undergraduate research, study abroad, capstone
o PSYC – FYE, service learning, internships, undergraduate research, study abroad, capstone
o EMEE – service learning, internships, undergraduate research, study abroad, capstone
o EDFN – FYE, service learning, internships, undergraduate research, study abroad, capstone
o WSSD (MDST) – FYE, service learning, internships, capstone


4. Please provide direct evidence and data analysis of candidates' ability to effectively plan, implement, and evaluate student progress; candidates' use of data to reflect on practice and professional practice; and candidates' use of data to assess P-12 student progress and to modify instruction based on student data (data literacy) for all EPP initial (BSE) license programs. The EPP response is below the data table.

EPP Response: The Teacher Work Sample (TWS) data, aligned with InTASC categories and collected by the Early, Middle and Exceptional Education (EMEE) program, is provided in an attachment for this addendum disaggregated by EMEE programs; see the attachment titled Disaggregated by Program Assess P-12 Student Progress and Modify Instruction TWS. The table below aligns the TWS sub scores with the elements of candidates' ability to effectively plan, implement, and evaluate student progress; candidates' use of data to reflect on practice and professional practice; and candidates' use of data to assess P-12 student progress and modify instruction based on student data. Analysis: The TWS is scored on a 3, 2, 1, 0 scale.

Sub Score | Effective Planning | Implement Plans | Assess Student Progress | Reflection on use of Assessment Data | Assess Impact on P-12 Student | Modify Instruction
5.) Implications for Instructional Planning and Assessment | X X
7.) Learning Goals: Clarity | X X
8.) Learning Goals: Appropriateness for Students | X
10.) Assessment: Alignment with Learning Goals and Instruction | X X
5.) Implications for Instructional Planning and Assessment | X X X
10.) Assessment: Alignment with Learning Goals and Instruction; 12.) Assessment: Multiple Modes and Approaches | X X X X
2.) Knowledge of Characteristics of Students; 3.) Knowledge of Students' Varied Approaches to Learning | X X X
17.) Lesson and Unit Structure | X X X
22.) Modifications Based on Analysis of Student Learning | X X X
23.) Congruence Between Modifications and Learning Goals and Objectives | X X X
27.) ASL: Evidence of Impact on Student Learning | X
28.) Reflection and Self-Eval: Interpretation of Student Learning | X X
29.) Reflection and Self-Eval: Insights on Effective Instruction and Assessment | X


(TWS frequency categories: Indicator Met, Indicator Partially Met, Indicator Not Met, No Score.)

The expected score for the TWS assessment is a total of >2. As the candidates complete the TWS, they submit their work to the faculty for review and modification. This process assures the candidates will receive the support they need to have very few scores of 2 on the final TWS. The data in the data table attachment show an average score of 2.7 for the early childhood majors; this group achieved the expected score for the three data cycles. The low-enrolled Middle Level program has only one year of data, as it had no enrolled students in the other two data collection years. The Middle Level program did meet the expected score of >2 with a 2.7 average score. The dual-major PreK-4/SPED K-8 candidates achieved a total average score of 2.9, meeting the expected score.

Alignment of TWS with candidates' ability to effectively plan, implement, and evaluate student progress; candidates' use of data to reflect on practice and professional practice; and candidates' use of data to assess P-12 student progress and to modify instruction based on student data

Sub Score | Effective Planning | Implement Plans | Assess Student Progress | Assess Impact on P-12 Student | Modify Instruction
17.) Lesson and Unit Structure | X X
16.) DFI: Accurate Representation of Content | X X
5.) Implications for Instructional Planning and Assessment | X X
6.) Learning Goals: Significance, Challenge and Variety | X X
3.) CF: Knowledge of Students' Varied Approaches to Learning | X
12.) Assessment: Multiple Modes and Approaches | X X X
19.) DFI: Use of Contextual Information and Data to Select Appropriate and Relevant Activities, Assignments and Resources | X X
28.) Reflection and Self-Eval: Interpretation of Student Learning | X
21.) IDM: Sound Professional Practice | X X X X X
28.) Reflection and Self-Eval: Interpretation of Student Learning | X
14.) Assessment: Adaptations Based on the Individual Needs of Students | X X
29.) Reflection and Self-Eval: Insights on Effective Instruction and Assessment | X X

The data for the remaining initial (BSE) license programs on candidates' ability to effectively plan, implement, and evaluate student progress; candidates' use of data to reflect on practice and professional practice; and candidates' use of data to assess P-12 student progress and to modify instruction based on student data (data literacy) are contained in the data table titled Data for Planning, Reflection, Impact on P-12. The table below, the highlighted MU Adapted Danielson sub scores in that data table, and the attachment 3 Cycles of EPP Created Assessment Data provide evidence that Millersville's initial (BSE) license program candidates show competence in effective planning, reflective practices, and data literacy. The 15-week supervisor ratings show an average score of 2.9 for initial (BSE) program candidates. The scoring criteria for the MU Adapted Danielson are 3 = Proficient, 2 = Basic, 1 = Unsatisfactory, and the expected score for initial (BSE) program candidates is >2. The scores are just one tenth of a point below Proficient across three cycles of data; that tenth of a point is due to the lack of data collection for spring 2020 caused by the COVID shutdown.

Alignment of MU Adapted Danielson with candidates' ability to effectively plan, implement, and evaluate student progress; candidates' use of data to reflect on practice and professional practice; and candidates' use of data to assess P-12 student progress and to modify instruction based on student data.

Sub Score | Effective Planning | Implement Plans | Assess Student Progress | Assess Impact on P-12 Student | Modify Instruction
3.) 1c Setting instructional outcomes, InTASC category The Learner and Learning, standard 1 | X X
5.) 1e Designing coherent instruction, InTASC category Instructional Practice, standard 7 | X
6.) 1f Designing student assessment, InTASC category Instructional Practice, standard 6 | X
12.) 3a Communicating with students, InTASC category Content and Instructional Practice, standards 5, 8 | X X
13.) 3b Using questioning and discussion techniques, InTASC category Instructional Practice, standard 6 | X
14.) 3c Engaging students in learning, InTASC category Instructional Practice and Content, standards 5, 8 | X X X
15.) 3d Using Assessment in Instruction, InTASC category Instructional Practice, standard 6 | X X X
16.) 3e Demonstrating flexibility and responsiveness, InTASC category Instructional Practice, standard 8 | X X
17.) 4a Reflecting on Teaching, InTASC category Professional Responsibility, standard 9 | X
21.) 4e Growing and Developing professionally, InTASC category Professional Responsibility, standard 9 | X X
22.) 4f Showing Professionalism, InTASC category Professional Responsibility, standard 10 | X

5. Verify the progress made toward meeting SPA conditions related to rubric development and alignment of assessments to provide evidence required to meet SPA standards. Content and pedagogy test scores and SPA national recognition provide partial evidence for Component 1.3 (CAEP Initial Handbook, p. 12-14; 60-69). EPP Response: The Associate Dean has met with and shared reviewer feedback with each program that is Recognized with Conditions. Each program had begun revisions when the COVID-19 shutdown occurred. Millersville went to fully remote instruction, which required 80% of our faculty to get training in remote instruction and learning. Millersville extended spring break by an additional 5 days to allow faculty to meet with the IT group and learn how to deliver instruction. Because of this additional learning and the new delivery of instruction, the SPA revisions were put on hold. The Math program has continued to edit its program report, as can be seen in the AIMS site, with the latest iteration dated 4/22/2020. All initial license programs, BSE and PB cert, have been approved by the Pennsylvania State Department of Education (PDE) through the Major Review process. PDE approves teacher licensure programs based on their meeting rigorous state standards as required by state law. All of Millersville's teacher education programs, INT and ADV, are approved by PDE as evidenced by the program approval process outlined in the Major Review. PDE does not consider enrollment in programs as a criterion for accreditation. Our low-enrolled programs do not meet the enrollment threshold for SPA review (more than 5 enrolled); therefore, some of our programs do not meet the SPA standard for recognition.

B. Excerpt from Self-Study Report to be clarified or confirmed 1. "The PEU system data collected from several unit assessments and across programs is analyzed, shared, and used for unit program development through department meetings, assessment committee meetings, and unit data days (Assessment Committee Meetings/Data Days p. 19 of the PDF) standard 5." (SSR, p.13) - Clarify how the data assessment system works, particularly how the data are analyzed and used for program/unit improvement.

AFI: The EPP does not provide significant analysis of data and evidence for all unit and program assessments related to teaching and professional practice (Component 1.2)

EPP Response: Please access this website: https://www.millersville.edu/education/peu/index.php. The site has demonstration videos of how the PEU assessment system works and is available without password access. We apologize for using the word "several" on page 13 of the SSR and agree the term is not clear. There are two EPP created unit assessments that gather data across all programs. Some individual programs have additional assessments that are required to meet SPA program report requirements. For example, the 2019 Math SPA review required a less generic student teaching assessment using Math terms and NCTM terminology; the Math department responded with revised assessments in the PEU system that address only the Math department candidates for SPA data collection. The new assessments will be in the September 2020 SPA program report response to conditions. From page 30 of the 2019 Assessment Handbook attached to this addendum: Our data collection uses a locally developed add-on to our Student Information System (Banner). Data for the two Unit Assessments connected with courses are input into this system by faculty teaching the courses. The Banner system allows department chairs, program coordinators, the Dean of the School of Education, and the Associate Dean to access reports for analysis of the two Unit Assessments for individual students, programs, and the Unit.

The professional education unit adheres to the following assessment timeline for data analysis and program/unit improvement:

o Prior to the beginning of each semester: The Associate Dean receives information from the program coordinators regarding identification of any new or modified assessments for the Banner System. The Associate Dean works with the Instructional Technology specialist assigned to the PEU to make necessary adjustments in the system.

o During each fall and spring semester: Data are collected via Banner according to dates identified by the program coordinators, Associate Dean, and the Instructional Technology specialist. Reminders are emailed to the responsible individuals for data entry twice during the semester.

o During each fall and spring semester: Cooperating Teachers in partner school districts are notified of the login process for submitting final evaluations in Banner. Support is offered by the Director of Student Teaching in the Office of Field Services, CAEP standard 2.

o After the completion of each semester, all program coordinators can access assessment results and reports through Banner. The Associate Dean offers support and will download data as requested.

o Once a semester (when appropriate): A Data Day gathering/meeting includes the sharing of some assessment results with all PEU faculty and external partners. Each Data Day has a theme and an assessment focus or question. These days are planned and sponsored by the Assessment Committee.

o Each month, the Associate Dean meets with the Dean of the School of Education to discuss technical issues, programmatic concerns, and other outcome goals.

o During the summer months: The Associate Dean works on updates, assessment organization, and provides reports as needed to each program coordinator, and the Dean of the School of Education.

Development and revision of the PEU assessment system is the responsibility of the Assessment Committee led by the Associate Dean. In addition to the information provided in the Task #4 of the addendum, the 2019 Assessment Handbook p. 27-31 (an attachment to this addendum) provides details of the assessment system workings.

The PEU Assessment Committee and the Associate Dean are charged with evaluation of assessments and data that are used at the Unit level. The development of our current student teaching instrument was a response to an analysis by the Assessment Committee of the content validity of our old student teaching instrument. "Data Day" meetings are the primary method used by the Assessment Committee for comprehensively evaluating the validity and utility of data. Data Day meetings, held annually, are open to all PEU faculty and stakeholders. Unit data is shared with faculty at these meetings, which provide faculty a chance to analyze the meaning and validity of multiple measures. Faculty typically ask questions about the statistical significance of data, the meaning of one data set in relation to complementary measures, and the meaning of qualitative data when compared to quantitative measures with the same focus. Stakeholders are asked for feedback and assistance in data analysis aimed at programmatic improvements. These meetings have led to the development of new survey instruments for cooperating teachers and to the revision of existing surveys of graduates. Additionally, the Professional Behaviors policy and rubric were developed in response to COOP and supervisor scores on the MU Adapted Danielson, which showed a weakness in our candidates' abilities to demonstrate professional behaviors. See the Assessment Committee Meetings and Data Day attachment, Jan. 26, 2018.

2. "In fall 2012, EDSE 471: Differentiated Instruction was added to the Secondary programs replacing the CIRQL as a capstone assessment" (SSR, p. 17). - Clarify which programs are using Differentiated Instruction and which are still using CIRQL and how both assessments are aligned to provide evidence for meeting Component 1.4. EPP Response: CIRQL assessment of Student Learning – used by MAJOR(S): BSE Mathematics, BSE Mathematics with Inclusive Education 7-12, CERTIF Teacher Certification in Mathematics secondary education majors during student teaching.


Sub scores are aligned with the MU Adapted Danielson. The sub scores students are evaluated on are: Reflective Practice, insights and limitations (Danielson 1d: Demonstrating knowledge of resources; 4a: Reflecting on Teaching); Teacher candidate makes sound decisions about when instructional tools enhance teaching and learning by critically identifying limitations of such tools (3c: Engaging students in learning); and Teacher candidate provided tools used to enhance teaching and student learning. The total average score for the candidates reaches the expected score of >2 (Satisfactory).

Semester | Frequency Total | 4 = Exemplary | 3 = Superior | 2 = Satisfactory | 1 = Unsatisfactory | No Score | Average Score

Reflective Practice, insights and limitations

Spring 2020 6 50% 33% 17% 0% 0% 3.33

Spring 2019 7 29% 71% 0% 0% 0% 3.29

Spring 2018 9 67% 11% 22% 0% 0% 3.44

Teacher candidate makes sound decisions about when instructional tools enhance teaching and learning by critically identifying limitations of such tools

Spring 2020 6 33% 33% 33% 0% 0% 3

Spring 2019 7 29% 57% 14% 0% 0% 3.14

Spring 2018 9 0% 0% 0% 0% 100% 0

Teacher candidate provided tools used to enhance teaching and student learning.

Spring 2020 6 17% 83% 0% 0% 0% 3.17

Spring 2019 7 14% 57% 29% 0% 0% 2.86

Spring 2018 9 0% 0% 0% 0% 100% 0

Total average score: 2.5

EDSE 471, Differentiated Instruction, is taken by students in these programs: BSE Art Education, BSE German, BSE Music Education, BSE Social Studies, BSE Social Studies with Inclusive Education 7-12, BSE Technology and Engineering Education, MED Mathematics, BSE English, BSE Mathematics, BSE Biology, BSE Physics, BSE Chemistry, BSE Earth Sciences, MED Art Education. The table below shows the InTASC category and state standards addressed in the course. The frequency data show candidates are proficient in the use of data to reflect on their practice, monitor student progress and modify instruction based on data.

Subscore Name (MU Adapted Danielson, InTASC category, State Standard); scoring: 3 = Distinguished, 2 = Proficient, 1 = Not Met, 0 = No Score
Semester | Frequency Total | Distinguished | Proficient | Not Met | No Score | Average Score

1.) Knows students, InTASC category The Learner and Learning, II B.6, 7, 9, IIIA

Spring 2020 66 14% 86% 0% 0% 2.14

Fall 2019 2 0% 100% 0% 0% 2

Spring 2019 69 29% 71% 0% 0% 2.29

2.) Monitors student progress, InTASC category Instructional Practice, II B.12, III.B,C

Spring 2020 66 30% 70% 0% 0% 2.3

Fall 2019 2 0% 100% 0% 0% 2

Spring 2019 69 14% 86% 0% 0% 2.14

3.) Analyzes and assesses student learning InTASC category The Learner and Learning, II

Spring 2020 66 26% 74% 0% 0% 2.26

Fall 2019 2 0% 100% 0% 0% 2

Spring 2019 69 17% 83% 0% 0% 2.17

4.) Responsive to students' needs, including those from academically and linguistically diverse backgrounds, InTASC category Instructional Practice, IIb.10, 16, 17, III.j

Spring 2020 66 20% 80% 0% 0% 2.2

Fall 2019 2 0% 100% 0% 0% 2

Spring 2019 69 26% 72% 1% 0% 2.25
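As a reading aid, and not part of the original data tables, the reported average scores can be reproduced as frequency-weighted means on the 3, 2, 1, 0 scale described above; for example, for sub score 1 (Knows students) in Spring 2020:

\[ \bar{s} = (0.14)(3) + (0.86)(2) + (0)(1) + (0)(0) = 2.14 \]

Other rows agree with this calculation up to rounding of the displayed percentages.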

The attachment in this addendum titled State Standards Aligned with Programs provides evidence that Millersville programs ensure candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college- and career-ready standards via meeting the Common Core State Standards. All of Millersville's teacher education programs, INT and ADV, are approved by the Pennsylvania State Department of Education (PDE) as evidenced by the program approval process outlined in the Major Review. From the PDE Major Review handbook, p. 7:

The Major Program Review requires outcomes data and impacts on student growth and development as articulated by program competencies. Outcomes are broadly conceived as those performances of pre-service and early in-service program candidates. For example, the program provider designs the program of study that is aligned with competencies set forth by PDE’s Program Framework Guidelines. Candidates engage in courses, field experiences and culminating clinical experiences. From these varied experiences, they are required to demonstrate competency as gauged by faculty designed assessments. These critical competency-based assessments attest to the candidates’ performance in each program. They allow for assessment of the individual, and the aggregate of these results speak to the quality of program. The data collected is quantifiable and, when examined in the self-study, should lead the program provider to identification of areas of strength and areas for improvement.

This description of the program review process serves to highlight sources of outcomes-based evidence that are generated, reviewed, and analyzed by the program provider and reviewed by PDE, and finally should lead to program improvement decisions. These activities all serve as elements in a feedback mechanism to examine individual candidate growth, as well as the overall health and vitality of the program under review.


3. "An example is the development of the CIRQL assessment of Student Learning for secondary candidates during student teaching {Impact on P-12, CIRQL}. Each semester data for this assessment is collected on the effectiveness of the assessment. During data sharing meetings of the faculty and staff, revisions of these assessments were completed based on the PEU system data collection" (SSR, p. 18). - Clarify the process for developing and revising assessments. Clarify the process for reviewing data for effectiveness. Clarify how assessment revision is documented

EPP Response: Development and revision of the PEU assessment system, which includes BSE, PB cert, and ADV programs, is the responsibility of the Assessment Committee led by the Associate Dean. From the 2019 Assessment Handbook p. 27-31 (an attachment to this addendum): The PEU Assessment Committee and the Associate Dean are charged with evaluation of assessments and data that are used at the Unit (PEU) level. The Assessment Committee members review Unit Assessments (see the table in Task 1, which gives topics discussed at the Assessment Committee meetings). The table below also lists assessment committee topics, dates, and members present as evidence of revision of the PEU unit assessments, and addresses the CAEP sufficiency criteria from Task 1 for Administration and Purpose, Content of Assessment, and Scoring.

Meeting Date | Topic | Members
January 16, 2019 | Aligning Evidence to CAEP Standards: Review Unit Assessments | Marcia Bolton, Associate Dean, College of Education and Human Services; Sarah Brooks, Educational Foundations; Sharon Brusic, Applied Engineering, Safety, and Technology; Leslie Gates, Art & Design; Janet Josephson, Early, Middle, and Exceptional Education; Timothy Mahoney, Educational Foundations; Marcia Nell, Early, Middle, and Exceptional Education; Cynthia Taylor, Mathematics; Tiffany Wright, Educational Foundations
February 26, 2019 | PEU Assessment Info. and Help Session Discussion | Susanne Nimmrichter, Janet Josephson, Marcia Nell, Marcia Bolton, Jason Petula, Kim McCollum-Clark, Ollie Dreon, Sarah Brooks, Beth Powers, Judy Wenrich, Tiffany Wright, Charlton Wolfgang, Sharon Brusic, Cynthia Taylor, Leslie Gates
April 23, 2019 | Establish working groups to review and evaluate PEU assessments | Mahoney, Janet Josephson, Leslie Gates, Sharon Brusic, Kim McCollum-Clark, Jason Petula
May 13, 2019 | Preview all unit assessments listed in the PEU after working groups edited and updated | Sarah Brooks, Beth Powers, Marcia Bolton, Aileen Hower, Tim Mahoney, Ollie Dreon, Kim McCollum-Clark, Leslie Gates, Tiffany Wright, Charlton Wolfgang
Feb. 12, 2020 | Data Day Discussion/Planning PEU assessment data that will be reviewed | Marcia Bolton, Sarah Brooks, Ollie Dreon, Leslie Gates, Aileen Hower, Janet Josephson, Tim Mahoney, Jason Petula, Beth Powers-Costello, Cynthia Taylor, Charlton Wolfgang, Tiffany Wright

Millersville University's professional education unit adheres to the following assessment timeline:

o Prior to the beginning of each semester: The Associate Dean receives information from the program coordinators regarding identification of any new or modified assessments for the Banner System. The Associate Dean works with the Instructional Technology specialist assigned to the PEU to make necessary adjustments in the system.

o During each fall and spring semester: Data are collected via Banner according to dates identified by the program coordinators, Associate Dean, and the Instructional Technology specialist. Reminders are emailed to the responsible individuals for data entry twice during the semester.

o During each fall and spring semester: Cooperating Teachers in partner school districts are notified of the login process for submitting final evaluations in Banner. Support is offered by the Director of Student Teaching in the Office of Field Services, CAEP standard 2.

o After the completion of each semester, all program coordinators can access assessment results and reports through Banner. The Associate Dean offers support and will download data as requested.

o Once a semester (when appropriate): A Data Day gathering/meeting includes the sharing of some assessment results with all PEU faculty and external partners. Each Data Day has a theme and an assessment focus or question. These days are planned and sponsored by the Assessment Committee.

o Each month, the Associate Dean meets with the Dean of the School of Education to discuss technical issues, programmatic concerns, and other outcome goals.

o During the summer months: The Associate Dean works on updates, assessment organization, and provides reports as needed to each program coordinator, and the Dean of the School of Education.

Task: Evidence in need of verification or corroboration

1. Confirm that initial license and initial license post-baccalaureate candidates complete the same EPP and program assessments

EPP Response: The initial (BSE) license and the initial license post-baccalaureate (PB cert) candidates complete the same EPP unit created assessments: the Professional Behaviors assessment and the MU Adapted Danielson. The course numbers reflect the difference between the Post Bacc (PB cert) and undergraduate (initial) levels for the purpose of tuition and credits. For example, Professional Behaviors for the initial (BSE) candidates is numbered EDFN 211 while the Post Bacc course is EDFN 545, but the unit assessment for Professional Behaviors is the same. Data tables disaggregated by initial (BSE) and PB cert programs can be seen in the attachments titled 3 cycles of data EPP created assessment Professional Behaviors aligned with InTASC, 3 cycles of data EPP created assessment MU Adapted Danielson (COOP) aligned with InTASC, and 3 cycles of data EPP created assessment MU Adapted Danielson (Super) aligned with InTASC.

2. Confirm whether or not post-baccalaureate candidate data is disaggregated from initial license undergraduate data

EPP Response: The Professional Education Unit (PEU) assessment system allows for disaggregated data reporting for the initial (BSE) license undergraduate and the post baccalaureate (PB cert) candidate. When entering the system, the user chooses Program Definition. The next screen allows for a choice of individual programs or the PEU system listing of all assessments assigned to each course. Each PEU program assessment's scores are reported summarized by frequency and average (mean). The user selects Assessment Definitions from the list of options to view how each assessment for a chosen program is set up; the assessment definition, data input type, rating type, and rubric are then displayed. In the Assessment Definitions there is an alignment table listing check boxes for the assessor to align Millersville's Conceptual Framework elements, InTASC categories, CAEP standards, Student Learning Outcomes, and exemplars designated by the program. On request, our IT department can make the PEU system shareable to reviewers.


The attachment titled 3 cycles of data for EPP assessment (COOP) MU Adapted Danielson aligned with InTASC shows BSE (Int) program data apart from the PB cert data in an Excel workbook. Each program is listed on its own tab, e.g., BSE Art, PB cert Art, BSE Bio, PB cert Bio.

Task: College and Career Readiness Standards

A. Evidence in need of verification or corroboration 1. Please provide direct evidence for initial (BSE) license candidates across programs to demonstrate skills (differentiated instruction; problem-solving; critical thinking; collaboration and communication) that afford access to rigorous college and career ready standards for all P-12 students

Stipulation: The EPP has not provided sufficient direct evidence to ensure candidates demonstrate skills and commitment that afford all P-12 students access to rigorous college and career ready standards (Component 1.4)

EPP Response: Please see the unit assessment data tables disaggregated by initial (BSE) and PB cert programs in the attachments titled 3 cycles of data EPP created assessment Professional Behaviors aligned with InTASC, 3 cycles of data EPP created assessment MU Adapted Danielson (COOP) aligned with InTASC, and 3 cycles of data EPP created assessment MU Adapted Danielson (Super) aligned with InTASC. Please see the table below, which shows which MU Adapted Danielson components address the skills (differentiated instruction, problem-solving, collaboration, and communication) that afford access to rigorous college and career ready standards for all P-12 students, with standard numbers, for the initial (BSE) license programs.

The cooperating teacher (COOP) survey reports results for questions about student teacher competency in the skills (problem-solving, differentiated instruction, collaboration, and communication) that afford access to rigorous college and career ready standards for all P-12 students.

The survey results are reported in the Cooperating Teacher Survey table below. The survey is launched every semester. The results provide evidence our candidates can apply these skills in the field during student teaching. The table reports the % of cooperating teachers that agree or strongly agree MU candidates demonstrate these skills in planning and instruction, in monitoring progress, and in differentiating and extending learning.

Cooperating Teacher Survey Results – problem-solving, differentiated instruction, collaboration and communication skills that afford access to rigorous college and career ready standards for all P-12 students (survey sent to all student teachers in INITIAL or PB Cert programs), Spring 2019

Question (skills addressed) | Strongly Agree | Somewhat agree | Neither agree nor disagree | Somewhat disagree | Strongly disagree
Write detailed lesson plans. (differentiated instruction, problem-solving) | 54% N=128 | 36% N=85 | 3% N=6 | 5% N=12 | 2% N=5
Lesson planning provides concrete ways the candidate provides differentiated instruction that meets the needs of all learners. (differentiated instruction) | 40% N=31 | 30% N=25 | 4% N=24 | 21% N=16 | 5% N=4
Write detailed lesson plans. (problem-solving; differentiated instruction; collaboration and communication) | 54% N=128 | 36% N=85 | 3% N=6 | 5% N=9 | 2% N=5
Formative data driven assessment data is used to monitor instruction. (problem-solving; differentiated instruction) | 24% N=26 | 9% N=24 | 41% N=31 | 21% N=15 | 5% N=4
Self-reflection guides modification of instructional practices. (collaboration and communication) | 47% N=111 | 12% N=29 | 2% N=5 | 0% | 1% N=3

College and Career Ready Standards for P-12
Component in MU Adapted Danielson (used twice during student teaching) | differentiated instruction | problem-solving | collaboration and communication
1b: Demonstrating knowledge of students, 1c: Setting instructional outcomes | X
2b: Establishing a culture for learning | X
3b: Using questioning and discussion techniques | X X
4c: Supervised communication with families | X
4e: Growing and Developing professionally, 4f: Showing Professionalism | X
Professional Behaviors (Foundations and Professional Block level)
Participates in the Professional Community | X
Respects Diversity and Civil Rights of Others | X

B. Excerpt from Self Study Report to be Clarified or Confirmed 1. "Each candidate demonstrates proficiency in developing lessons that incorporate the college and career-focused Pennsylvania State Department Learning Standards for PK-12 students" (SSR, p. 18). - Clarify how Pennsylvania Learning Standards meet college and career readiness standards or if Pennsylvania Learning Standards are essentially the same as Common Core.

EPP Response: From the PDE liaison for Millersville: How does the rigor of the Common Core Standards (CC) compare to the PA Academic Standards? Upon release of the final draft of the College and Career Ready (CCR) Standards, PA completed a preliminary study to measure the degree of alignment. The results of that study showed a very strong alignment with the CCR Standards. The PA Standards construct is a solid design, and while the language of the standards may change, the content remains constant. Using the PA Standards and SAS's Curriculum Framework shows significant alignment between the PA and Common Core standards.

http://www.corestandards.org/standards-in-your-state/
Forty-one states, the District of Columbia, four territories, and the Department of Defense Education Activity (DoDEA) have adopted the Common Core State Standards.
PENNSYLVANIA: STANDARDS ADOPTED: JULY 2, 2010; ADOPTED BY: PENNSYLVANIA STATE BOARD OF EDUCATION; FULL IMPLEMENTATION: 2013-14 SCHOOL YEAR*

Task: Technology

A. Evidence in need of verification or corroboration 1. Please provide direct evidence to demonstrate how all initial licensure programs are integrating technology for student learning and tracking performance. AFI: The EPP does not provide sufficient evidence of candidate facility with modeling and applying technology (Component 1.5). EPP Response: See attachment EDHS Technology Use. The cooperating teacher (COOP) survey gives results to questions of student teacher competency in utilizing technology. The survey results are reported in the Cooperating Teacher Survey table below. The survey is launched every semester. The results provide evidence our candidates can utilize technology in the field during student teaching. The table reports the % of cooperating teachers that agree or strongly agree MU candidates can utilize technology in planning and instruction, for monitoring progress, and to differentiate and extend learning.

Cooperating Teacher Survey Results – Technology (survey sent to all student teachers in INITIAL or PB Cert programs) Spring 2019

Question | Strongly Agree | Somewhat agree | Neither agree nor disagree | Somewhat disagree | Strongly disagree | Totals
Integrate the use of technology in planning and instruction. | 42% #98 | 43% #101 | 6.3% #15 | 8.05% #19 | 1.2% #3 | 100% #236
Utilize technology for tracking data for use in monitoring all learners' progress | 46% #75 | 35% #57 | 4% #26 | 16% #26 | 4% #6 | 100% #190
Utilize technology to differentiate and extend learning | 52% #90 | 20% #60 | 9% #50 | 16% #27 | 3% #6 | 100% #233

Data for student teachers are disaggregated by program in the EPP created assessment, the MU Adapted Danielson. The cooperating teachers (COOPs) and the supervisors assess candidates at the mid-term of the student teaching semester and at the 15-week, or final, week of the semester. The sub scores or components for each assessment are included in this table.

Technology Cross Walk: Unit Assessment (MU Adapted Danielson Framework) and InTASC Standards

InTASC Standard 3: Learning Environment. The teacher works with others to create environments that support individual and collaborative learning, and that encourage positive social interaction, active engagement in learning, and self-motivation.
Performance: 3(g) The teacher promotes responsible learner use of interactive technologies to extend the possibilities for learning locally and globally. 3(h) The teacher intentionally builds learner capacity to collaborate in face-to-face and virtual environments through applying effective interpersonal communication skills.
Knowledge: 3(m) The teacher knows how to use technologies and how to guide learners to apply them in appropriate, safe, and effective ways.
MU Adapted Danielson: 2(a) Creating an environment of respect and rapport; 2(b) Establishing a culture of learning; 3(a) Communicating with students; 2(e) Organizing physical space

InTASC Standard #4: Content Knowledge. The teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and creates learning experiences that make these aspects of the discipline accessible and meaningful for learners to assure mastery of the content.
Performance: 4(g) The teacher uses supplementary resources and technologies effectively to ensure accessibility and relevance for all learners.
MU Adapted Danielson: 1(a) Demonstrating knowledge of content & pedagogy; 1(b) Demonstrating knowledge of students

InTASC Standard #5: Application of Content. The teacher understands how to connect concepts and use differing perspectives to engage learners in critical thinking, creativity, and collaborative problem solving related to authentic local and global issues.
Performance: 5(c) The teacher facilitates learners' use of current tools and resources to maximize content learning in varied contexts.
Knowledge: 5(k) The teacher understands the demands of accessing and managing information as well as how to evaluate issues of ethics and quality related to information and its use. 5(l) The teacher understands how to use digital and interactive technologies for efficiently and effectively achieving specific learning goals.
MU Adapted Danielson: 1(a) Demonstrating knowledge of content & pedagogy; 1(b) Demonstrating knowledge of students; 1(c) Setting instructional outcomes

InTASC Standard #6: Assessment. The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making.
Performance: 6(i) The teacher continually seeks appropriate ways to employ technology to support assessment practice both to engage learners more fully and to assess and address learner needs.
MU Adapted Danielson: 1(f) Designing effective assessments; 3(d) Using assessment in instruction


Millersville candidates are reaching Proficient in their use of technology when teaching at well above the 85% level in each component of each assessment. While these components/sub scores are reported individually, initial license courses at the undergraduate and Post Bacc levels incorporate technical skills across the programs and throughout the time candidates spend in teacher education preparation. The attachment EDHS Technology Use displays the numerous places technology is utilized. During the COVID virus shutdown, Millersville faculty and students are relying on remote delivery of instruction for all teacher education courses. The programs are relying on Atlas, a National Board-Certified tool, for case studies and videos of teaching for reflection and observation. Faculty are modeling technical expertise for our candidates in how to teach utilizing technology, collect assessment data, share student progress, and communicate. In response to faculty instruction, our candidates are utilizing technology to work with cooperating teachers, communicate, share lesson plans/unit plans, and demonstrate comprehension of the course content.

Component: MU Adapted Danielson | Proficient scores (semester, total N / percent) | Teacher Work Sample | Proficient scores (semester, total N / percent)
3c. Learning tasks and activities, materials, resources, instructional groups and technology are aligned with the instructional outcomes or require only rote responses. | Fall 2019 89/96%, Spr 89/96%, Fall 2018 67/96% | 18.) DFI: Use of a Variety of Instruction, Activities, Assignments and Resources | Spr 2020 60/82%, Fall 2019 89/99%, Spr 2019 73/93%
2e. The teacher makes effective use of physical resources, including computer technology. The teacher ensures that the physical arrangement is appropriate to the learning activities. | Fall 2019 89/96%, Spr 89/96%, Fall 2018 67/96% | 20.) DFI: Use of Technology | Spr 2020 60/82%, Fall 2019 89/99%, Spr 2019 73/93%


Standard 2

EPP Response underlined Attachments highlighted

Task: Cooperating Teaching Training

A. Evidence in need of verification or corroboration

2.2 Evidence of Training

AFI: The EPP has provided little evidence to support that mutually agreeable expectations for candidate entry, preparation, and exit (2.1)

EPP Response: Each semester the Field Services office creates seminars and workshops based on cooperating teacher needs. The curriculum for the workshops is based on the published "Responsibilities of the Cooperating Teacher" within A Guide for Field Services 2020. We train on supervision methods and on co-teaching, as well as on specific expectations of student teachers.

Please see C. in this task. In C. are training agendas, attendance, topics covered, and transition points, with reference to three years of PDS Faculty and Partner Meetings and the annual meeting with partners and COOPs/mentor teachers.

The Department of Field Services has had a change in Director in the last 18 months. The former job description was for a part-time temporary faculty position with only 3 weeks of job duties during the summer months. The College of Education re-wrote the Director of Field Services job description, calling for a full-time staff member with a 12-month contract. The new director started in October 2019, during the fall semester. The full-time Director is able to maintain firm partnerships with school partners, consistently communicate with partners, and participate in the College of Education's shared governance by serving on the Teacher Education Council, chairing the Clinical Partners Committee, and working closely with the Associate Dean of the College to establish secure communications and liaison relationships with our P-12 partners. Part of the Director's duties includes offering training for cooperating teachers (COOPs).

The transition in the leadership position for the Department of Field Services (DFS), with the former director leaving in May 2019 and the new director arriving in Oct. 2019, has interrupted continuous formal training for COOPs in the 2019-2020 academic year. Most of our candidates reside in the PDS programs; the grades 7-12 and Early, Middle, and Exceptional Education (EMEE) PDS programs have offered continued training and interaction to their mentor teachers (COOPs) and administrators (see section C.1 of this Task). The response in B. of this task provides a proposed plan for improved training under the new Director of Field Services.

B. Excerpt from SSR to be clarified or confirmed

2.2 stated that 92% of 214 respondents answered 'yes' that they were provided enough information about their role as a cooperating teacher; however, qualitative data shows that more training is needed and warranted by a number of the cooperating teachers.

EPP Response: The COOPs that responded they wanted more training are new COOPs. Since the PDS programs have continued their training of COOPs uninterrupted by the change in DFS leadership, we understand the COOPs who need or want training are those new to the role. While experienced COOPs could always use an update from the Department of Field Services (DFS), the more experienced COOPs indicated in written comments that they did not want or need additional training. One experienced COOP even suggested they would no longer serve as a COOP if training is required.

In response to the survey input, the Department of Field Services (DFS) will work to provide an online training in addition to the face-to-face meetings offered for COOPs as a part of the supervisors' meeting held each semester. In addition, DFS revised A Guide to Field Services 2020 utilizing partner feedback. This Guide is readily available on our website (no password needed) at https://www.millersville.edu/fieldservices/field-guide.php for guidance and direction to all supervisors, COOPs, faculty, PDS liaisons, candidates, and administrators.

Training will be provided for all COOPs with the understanding that some of the more experienced COOPs may not participate. Because we cannot be assured all COOPs will participate in training, the current DFS director and the Associate Dean offer the plan below for COOP training. The table shows the proposed topic, type of training, and the way the training will be offered. We propose to offer a certificate of completion to the COOP after completion of the online training. Additionally, we will ask for an evaluation suggesting ways the training could be improved.

Proposed Topic | Type of Training | Online or Face to Face | Evidence (Agenda, Training Eval, Certificate)
Facilitation Skills | COOPs attend semester required Supervisor Meeting | Face to Face | Agenda, attendance
PDE News | Legislative news as a document; document outlining PDE policies' impact on supervision and clinical, i.e. clearances | Online and Face to Face | Agenda, email invitation, attendee roster
MU Expectations | New COOP mentee tips PowerPoint | Online | Certificate, IT installed count for site visits
Class Management | COOPs attend semester required Supervisor Meeting: faculty member as speaker | Face to Face | Agenda, attendance sheet
Virtual Conference: Supporting Learners in an Online Environment: We Are All in This Together (planned for June 30, 2020) | Speakers, panel discussion, breakout sessions | Online | Attendance, registration forms, emails, and agenda documentation

C. Question for EPP concerning additional evidence, data, and interviews

1. How is the EPP ensuring that COOPS are trained and held accountable for MU expectations? How do you monitor to make sure that all cooperating teachers attended training? What evidence does evaluation of cooperating teacher provide?

EPP Response: A survey evaluation of COOPs is completed by student teachers, but according to the collective bargaining agreement with our partner school districts, only the Dean of the College of Education can conduct any type of faculty evaluation. Copies of the completed student evaluations of supervisors can be verified by the lead site visitor during the Nov. 2020 visit.

The following narrative and chart describe how COOPs are trained and held accountable for MU expectations. All district partners and liaisons have been notified of this website and document: a copy of "A Guide to Early Field Experiences" is available for viewing at http://www.millersville.edu/earlyfieldexp/cooperating-teachers.php on the Millersville University website, Early Field Experience/Cooperating Teachers page, under the "Guide for Early Field Experience" block. We also have a Microsoft Teams site for supervisors online; the supervisors can relay information to the COOPs.

The Department of Field Services has had a change in Director in the last 18 months. The former job description was for a part-time temporary faculty position with only 3 weeks of job duties during the summer months. The College of Education re-wrote the Director of Field Services job description, calling for a full-time staff member with a 12-month contract. The new director started in October 2019, during the fall semester. The full-time Director is able to maintain firm partnerships with school partners, consistently communicate with partners, and participate in the College of Education's shared governance by serving on the Teacher Education Council, chairing the Clinical Partners Committee, and working closely with the Associate Dean of the College to establish secure communications and liaison relationships with our P-12 partners. Part of the Director's duties includes offering training for cooperating teachers (COOPs).

The transition in the leadership position for the Department of Field Services (DFS), with the former director leaving in May 2019 and the new director arriving in Oct. 2019, has interrupted continuous formal training for COOPs in the 2019-2020 academic year. Most of our candidates reside in the PDS programs; the grades 7-12 and Early, Middle, and Exceptional Education (EMEE) PDS programs have offered continued training and interaction to their mentor teachers (COOPs) and administrators (see section C.1 of this Task).

Since 2014, Millersville University's Secondary Education Professional Development School (PDS) has been a partnership between many of the University's secondary certification programs and local school districts. Participating certification programs include all sciences (biology, chemistry, earth science and physics), social studies, English, world languages, art, and technology and engineering. Beginning in fall 2019, the Early, Middle, and Exceptional Education Professional Development Program was added, along with our district professional development schools. Goals and more detailed information about the PDS programs are located at these websites: Secondary PDS, https://www.millersville.edu/edfoundations/pds.php and EMEE PDS, https://www.millersville.edu/eled/early-middle-exceptional-education-professional-development-program-pds.php.

The PDS programs enroll most of our candidates for student teaching (77% of 197 for spring 2020). The candidates begin in the fall as interns assigned to mentor teachers. In the spring semester the candidates are student teachers and have the same classroom teacher, now referred to as a COOP. Each PDS school is assigned a liaison faculty member from Millersville. The liaison's responsibility is communication and supervision of our candidates and the PDS school partners. Training for the mentors/COOPs begins in the fall semester of each academic year. The table below shows a sampling of the 2018-19 meeting dates, the topics covered (always with school partner input and attendance), attendance, the relevant transition point in the program, and the resulting actions. Below the table is a list of the liaisons and the school partner locations. The attachment titled PDS Partner and Faculty meeting provides this information for 3 years.

Date | Attendees | Topic | Action Results
2/15/19 | Thomas Bell, Sarah Brooks, Sharon Brusic, Emily Davis, Sarah Dutton, Leslie Gates, Anne Kinderwater-Carroll, Ellen Long, Kim McCollum-Clark, Nakeiha Primus, Anne Stuart, Miriam Witmer | Developed Methods syllabus statement regarding passing the unit plan assessment: "You must pass the unit plan assignment for this course to proceed to student teaching. This policy is in place because you must demonstrate that you have the skills to engage in long-range instructional planning to proceed to student teaching. If you earn two or more unsatisfactory ratings or a grade lower than 70% on your unit plan, you have not passed the assignment." | Published statement in the unit plan assessment. Transition point: Entry to Clinical
2/1/19 | Thomas Bell, Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer | PDS Assessments data; review of EDSE 471 syllabi | Reviewed data, no revisions made. Refined data submission by all supervisors and mentors. Transition point: Entry to Clinical
10/26/18 | Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer | Have you reviewed all of the evals completed for your assigned interns? Any unsatisfactory ratings? Did the mentor agree to host next semester? Please discuss any concerns cited on the mid-semester eval with mentor and intern. Please make plans to observe each of your interns if you have not already. | Updated professionalism concerns in the Professional Behaviors Data Sheet on Google Drive. Professional Development Plans were formulated, and meetings held. PDPs posted in D2L. Transition point: Entry to Clinical
10/31 | Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer | EDFN is proposing graduate cross-listed courses for EDSE 340 and SPED 346 for Fall 2018. How will we collaboratively assess interns' professional behaviors using the new rubric this semester? | Courses were cross listed. Assessment data can be collated by the PDS director and distributed. Transition point: Exit from Student Teaching

Liaisons and school partner locations:
• Miriam Witmer: York County School of Technology
• Michelle Trasborg: Conestoga Valley High (Spanish, technology), Huesken Middle (technology)
• Anne Stuart: Blue Ball Elem, Paradise Elem, Denver Elem, Valley View Elem, Mountville Elem
• Nakeiha Primus: ASPIRA Academy, Marticville Middle
• Susanne Nimmrichter: Penn Manor High
• Kimberly McCollum-Clark: McCaskey High
• Nanette Marcum-Dietrich: Conestoga Valley High (science + English)
• Ellen Long: Central York High School, Garden Spot Middle, Manor Middle
• Anne Kinderwater-Carroll: Manheim Township Middle, Martic Elementary
• Leslie Gates: Brownstown Elementary, Landisville Educational Center, Farmdale Elementary
• Ojoma Edeh Herr: Central York Middle
• Sarah Dutton: Manheim Township High, Donegal Intermediate
• Sandy Deemer: King Elementary, Wharton Elementary
• Dan Daneker: Huesken Middle (English, music, Spanish)
• Sharon Brusic: Lampeter-Strasburg High
• Sarah Brooks: Hempfield High, Centerville Middle, Huesken Middle (science + social studies), CV High School

In addition to the meetings listed in the table above, the PDS director holds annual Secondary and K-12 Partners' Meetings. The most recent Partners' Meeting was held on Thursday, March 12, 2020, from noon to 3 p.m. in the Stayer Hall MPR.

Task: Process for partner input

A. Evidence in need of verification or corroboration 1. 2.2. Evidence of Process for Partner Input – how do you document the partnership: all MOUs, any meetings with agendas, attendance, and action steps


B. Excerpt from SSR to be clarified or confirmed

2.2 states that the "Demographic Portfolio" proves that "the PEU has expectations for partners when assigning candidates to mentor teachers (COOPs)," and that "This process allows the school partner to provide input into placement requirements"; however, the evidence and process are unclear.

EPP Response: All Affiliation Agreements (MOUs) are available for the lead site visitor in Nov. 2020. The file containing all Affiliation Agreements was too large to include in the attachments for this addendum.

Email correspondence is summarized in the table later in this addendum narrative for Standard 2.

The “Demographic Portfolio” displays the demographics of all our school partners. To provide diverse field placements that increase instructional expertise for teaching all P-12 students, our Department of Field Services (DFS) uses the demographics to reach out to school districts. For example, our Foundations courses require urban clinical placements because we developed our programs' coursework to cultivate well-prepared professional educators who use their understanding of individual differences and diverse cultures and communities to ensure inclusive learning environments that enable each learner to meet high standards. Because Foundations courses rely on urban placements by curricular design, DFS can consult the demographics provided by the census bureau to refine their placement requests. If we were not intentional in our placements, some school districts might be neglected and not have a communication outlet for teacher preparation programs. Transition point: Entry to Advanced Professional Studies

An additional way Millersville reaches out to school partners and invites their input is by providing student placement information to the partner for approval and changes. The email below is a good example:

Lauriana Engle
From: Michelle Hackman <[email protected]>
Sent: Friday, January 18, 2019 10:21 AM
To: Lauriana Blessing
Cc: Karen Schlasta; Christopher Miller; Gail Decker
Subject: RE: MU Student Intern Placement Information – Early Childhood Professional Block II
Attachments: arrest or conviction form.pdf; Confidentiality Letter.doc
Follow Up Flag: Follow up; Flag Status: Completed

Good Morning Lauri, Samantha Monfredo is cleared to be in our buildings. Please ask her to complete the attached paperwork and bring the forms with her on her first day to turn into the office. Best, Michelle C. Hackman, District Registrar & Instructional Services Secretary, Donegal School District, 1051 Koser Rd, Mount Joy, PA 17552, P: (717) 492-1304, F: (717) 492-1350

From: Lauriana Blessing [mailto:[email protected]]


Sent: Thursday, January 10, 2019 1:48 PM
To: Lori Mentzer <[email protected]>; Sarah Bair <[email protected]>; Faithe Oberholtzer <[email protected]>; Michelle Billig <[email protected]>; Karen Schlasta <[email protected]>; Kathlynn Musser <[email protected]>
Cc: Christopher Miller <[email protected]>; Michelle Hackman <[email protected]>
Subject: MU Student Intern Placement Information – Early Childhood Professional Block II

Dear Educational Partner, Hello and thank you for participating in Millersville University’s Early Field Experience program! The information in the attached document confirms the Millersville University students coming to your classroom/school as part of their Early Childhood Professional Block II coursework. Please review the table in the attachment for details. Please let me know if you notice any errors in teacher assignments or email addresses, or if you are unable to accommodate a student on the date(s) listed. Placements are assigned to provide specific experiences for students and meet detailed requirements. However, if at any point you feel a student should be reassigned or moved to another co-op teacher, please contact the Office of Field Services to discuss. We value your input!

A copy of “A Guide to Early Field Experiences” is available for viewing at http://www.millersville.edu/earlyfieldexp/cooperating-teachers.php on the Millersville University website (Early Field Experience / Cooperating Teachers page) under the “Guide for Early Field Experience” block. Students will be receiving their assignment information in the next few days and have been asked to contact you directly to secure any details they might need to begin the placement. If you have any questions, recommendations, or concerns at any time, please do not hesitate to reach out to me. Thank you again for giving our students this opportunity. Kind regards, Lauri. Lauriana Engle (Blessing), Office of Field Services, MILLERSVILLE UNIVERSITY, P.O. Box 1002, Millersville, PA 17551-0302, Phone: 717-871-5561 | [email protected] | www.millersville.edu

Connect with us: Facebook | Twitter | YouTube

C. Question for EPP concerning additional evidence, data, and interviews 1. What process is used to show that school partners are providing input?

AFI: There is limited evidence that the EPP works with partners to outline and review clinical experiences (2.3)

EPP Response: Collaboration comes in many forms: individual collaboration between university supervisors and mentor (cooperating) teachers, collaboration between the field experience office and individual cooperating teachers, and collaboration between the field experience office and school districts.


• COOPs are surveyed each semester. To share school providers' input, COOPs are emailed the results of the COOP survey.

• We provide a template of the standard MOU agreement between Millersville University and school districts hosting student teachers.

• A Guide for Field Services, which details relationships with partners, is published on the MU website and does not require a password. All field supervisors and COOPs are made aware of the site address: https://www.millersville.edu/fieldservices/field-guide.php

Collaborative efforts by the Department of Field Services (DFS) provide district administrators, program chairs, and Professional Development School (PDS) faculty liaisons with detailed information about each PEU clinical request. Providing detailed placement information gives our partners input into how we make placements and what the course requirements for those placements are. The process works like this: a course developed by a faculty member designates a clinical experience (a field placement is required). The faculty member communicates the course objectives for the field placement to the DFS staff member. The staff member has an established relationship with the liaison in the school district's administrative offices who handles field experiences for all the schools in the district. The appointed school administrative liaison communicates Millersville's field placement request, along with the course number and objectives for the placement, to the schools. The Principal of each school is responsible for selecting the mentor teacher to work with Millersville's candidates. Millersville does not choose mentor teachers for any field placement; we ask for recommendations for COOPs from the Principals via the district-level liaison. Our candidates and faculty do not reach out to schools or potential COOPs directly, as this violates the agreement between Millersville's DFS and the school districts. When the teacher and the Principal agree a candidate can complete a field experience in the classroom, the Principal contacts the district liaison, who then communicates with Millersville's DFS staff member via email or phone to discuss the placement's match with the course. Once the match is determined by the district administrator, the Principal, and DFS, a confirmation of the placement is sent to the district administrator, Principal, and Millersville faculty. The table below is an example of the data kept for this process, drawn from courses requiring field experiences in the School District of Lancaster.

Course #/Program: ECRH 519
Course Assignment: The field experience for ECHD 519 works in conjunction with the Teacher Action Research Project (TARP) that is a requirement for the course. The TARP is an in-depth project that includes several components: collecting and documenting contextual factors within the educational setting; development and implementation of an intervention for a child or small group of children; documenting, analyzing, and interpreting the data collected; and finally reflecting on the data in relationship to the MU student's own teacher practices. The field hours would be devoted to the first two sections of the TARP: contextual factors and implementation.
Expectation for Cooperating Teacher: Support candidates' quest for documentation.
Setting: Early Learning Center, ages 3-6. Dates/times TBD between co-op and MU student.

Course #/Program: SPED 100
Course Assignment: Observations as well as getting involved in class activities as deemed by the cooperating teachers.
Expectation for Cooperating Teacher: Provide general information about the program. Sign attendance form.
Setting: (1) Center (early childhood, PreK or Preschool); (1) Special Ed: K-12; IU School to Work program, MS, HS, LS, ES, Resource Room, Behavior Placements, IU early int.

Course #/Program: EDFN 211
Course Assignment: (1) Demonstrate professional dispositions. (2) Interview a small group of students for the field interview and reflective report assignment. (3) Provide assistance to individual students. (4) Teach a reading lesson. (5) Visual rhetoric assignment.
Expectation for Cooperating Teacher: (1) Model positive practices. (2) Allow for interactions with students. (3) Assist and support students in instructional tasks. (4) Provide feedback on preservice teacher development and specific instructional tasks.
Setting: K-4, Urban (Columbia, Lebanon City, School District of Lancaster, York City SD). (If student = dual ERCH/SPED major, can be placed in K-6 elementary school setting.) Will need to work with a student with special needs. (FYI - see SFB: Art, Music & Tech Ed can be in K-12; they are K-12 cert.)

The placement email above provides an example of the communication between Millersville and our district partners. The table below provides dates and topics for further email communications, which provide evidence that partners work alongside the EPP's Department of Field Services (DFS).

Date: May 9, 2018 | From: Central York School District | To: Early Field, DFS | Topic: Course expectations from MU. Please match with potential COOPs and classrooms.

Date: Feb. 19, 2019 | From: Student Teaching | To: DFS | Topic: "We continue to evaluate our request processes & timelines based upon feedback from our district partners. In doing so, we are moving towards a system of requesting Early Field Experiences earlier in the semester. Our hope is to provide placement details earlier in the process. Our office defines Early Field Experiences as opportunities given to pre-student teachers to visit schools, observe students and teachers and acquire/refine the complex skills involved in teaching."

Date: Sept 4, 2019 | From: IU 13 School District | To: DFS | Topic: Student placement information. DFS asking for confirmation, feedback, and identification of COOPs.

Date: May 4, 2020 | From: School District of Lancaster (SDoL) | To: DFS | Topic: Inquiry of how many clinical placements are needed. Outline of possible placements.

Date: May 23, 2019 | From: SDoL | To: Director of DFS | Topic: Restrictions on when we talk with principals and COOPs; all student placement assignments at least 3 weeks in advance.

Date: May 19, 2020 | From: Lisa Hardwig, SDoL | To: DFS early field and student teaching staff responsible for placements | Topic: "We kindly request that we have the entire list of all placements completed prior to any communication to our principals and staff so that we can send out the placement information to our principals first. We would also like to have all forms sent at one time."

Task: COOPs Credential Accountability

A. Evidence in need of verification or corroboration 1. 2.2. Evidence that the EPP ensures COOPs' credentials

B. Excerpt from SSR to be clarified or confirmed 1. 2.2 states that, "The P-12 education COOP must possess appropriate professional educator certification, three years satisfactory certified teaching experience, one-year certified experience in the specific placement with a performance rating of proficient or above, and directly engages in teaching subject matter or conducting learning activities in the area of student teaching."


C. Question for EPP concerning additional evidence, data, and interviews 1. How are partners involved in selecting COOPs, and how is the EPP ensuring that their credentials are met?

EPP Response: (A.) (B.) (C.) The selection of COOPs is regulated by the Pennsylvania State Department of Education (PDE). The regulation states: Candidates are assigned by Principals of the local education agency (LEA) to a cooperating teacher with appropriate professional educator certification (3 years satisfactory certified teaching experience on the appropriate certificate and 1 year certified experience in the specific placement) who is trained by the preparation program faculty (22 Pa. Code §354.25(f)). If there are complaints about a cooperating teacher from a supervisor or candidate, those complaints are investigated by the Director of Field Services and the chair of the department. If the complaints are supported by solid evidence that a cooperating teacher is not providing sufficient guidance and support for our candidates, that cooperating teacher is not selected to mentor a candidate in the next semester. We plan to offer online mentor professional development training beginning in fall 2020. If a cooperating teacher has had a verified complaint about their support or mentoring, that cooperating teacher will be required to complete the online professional development training before being asked to serve as a cooperating teacher again. An example of this process occurred when a supervisor accused a mentor teacher of being biased against older candidates. The Director of Field Services (the Associate Dean of EDHS served in this capacity during the vacancy in the Director's position) contacted the chair of the department. The chair reached out to a known associate of his who happened to be the Principal of the school where the mentor teacher was employed. Through the discussion of the accusation, it was found that it was not the mentor teacher at fault but the supervisor. The supervisor was not asked to supervise in the following semester until the chair of the department had an opportunity to discuss the accusation with the supervisor and the Principal of the school. The situation was resolved amicably for all concerned. The candidate did not suffer any ill effects, as the accusation was made in error.

(A.) Millersville's partnerships with school districts are well established, as our school partners and our college have been placing candidates in schools for many years. Millersville relies on the Principal of each LEA to vet the cooperating teachers as the state requires. Millersville relies on the district liaison to reach out to Principals for their selection and verification, with the district office administration and PDE, of the cooperating teachers for early field work and for student teaching. In the fall of 2019, the Dean of the College of Education, the Associate Dean, and the Director of Field Services visited each of our urban school partner Superintendents. The table shows dates and districts.

Date: Sept 26, 2019 | Attendees: Dean Drake; Dr. Mahoney, Dir. of Foundations; Dr. Bolton, Assoc. Dean; Jessica Stephens, Director of Field Services | School Partner: City of York
Date: Oct. 17, 2019 | Attendees: Dean Drake; Dr. Bolton, Assoc. Dean; Jessica Stephens, Director of Field Services; Superintendent of Schools; Human Resources Dir. | School Partner: Lebanon School District
Date: Oct. 23, 2019 | Attendees: Dean Drake; Dr. Mahoney, Dir. of Foundations; Dr. Bolton, Assoc. Dean; Jessica Stephens, Director of Field Services; Superintendent of Schools | School Partner: Columbia Borough
Date: March 20, 2019 | Attendees: Dean Drake; Dr. Bolton, Assoc. Dean; lunch with Dr. Rau | School Partner: School District of Lancaster
Date: Jan. 23, 2019 | Attendees: Dr. Bolton, Assoc. Dean; Jessica Stephens, Director of Field Services | School Partner: School District of Lancaster
Date: Feb. 27, 2019 | Attendees: Dean Drake; Dr. Bolton, Assoc. Dean; Jessica Stephens, Director of Field Services; Superintendent; Curriculum and Instruction Director; Coordinator of Placements | School Partner: School District of Lancaster
Date: March 13, 2019 | Attendees: Dean Drake; Dr. Mahoney, Dir. of Foundations | School Partner: City of York School District (second mtg.)

The purposes of the meetings were to solidify our partnerships in terms of receiving feedback about our candidates' performances in the schools, to introduce the new Associate Dean, and to certify the selection of qualified cooperating teachers for both early field and student teaching clinical experiences. This kind of personal visit ensures communication is ongoing. A direct result of one of the meetings was the hiring of a candidate for a Physics position. The school needed a Physics teacher, and Millersville needed a placement for a student teacher in a school district close to his place of residence. The meeting participants were able to work out a long-term substituting position and then a job offer for the candidate. Another direct result of meetings with LEA leadership is the requirement of ethics training for all our candidates: ACT 126 – Educator Ethics Training (effective 1/15/2020). ACT 126 training is available on the Department of Education's SAS Portal and must be completed prior to submitting the APS application. Students who do not have a SAS account must first register for an account by visiting PDE SAS.

https://www.millersville.edu/cert/aps.php

What is the evidence showing the partners are providing feedback to the EPP?

The liaison from the administrative offices of the LEA districts is responsible for continually verifying all teacher credentials, along with the Principals, as required by the state department. Millersville's PDS liaisons are in the schools consistently for supervision of the candidates in the partner schools. During the placement of PDS candidates, called interview and placement, Principals recommend the cooperating teacher/mentor. Millersville's liaisons follow the recommendations of the Principals of the LEA, as required by state policy and teacher union requirements.

Task: Formal Methods of Feedback

A. Evidence the EPP engages in informal methods of feedback with Partners

EPP Response: In the fall of 2019, the Dean of the College of Education, the Associate Dean, and the Director of Field Services visited each of our urban school partner Superintendents. The dates, attendees, and districts are listed in the table in the Task: COOPs Credential Accountability response above.

The purposes of the meetings were to solidify our partnerships in terms of receiving feedback about our candidates' performances in the schools, to introduce the new Associate Dean, and to certify the selection of qualified cooperating teachers for both early field and student teaching clinical experiences. This kind of formal yet personal visit ensures communication is ongoing. The time allows for feedback from partners and clarification of questions and issues about which partners are concerned. An example from these meetings is the issue of student teachers, or candidates in the advanced professional courses, serving as substitutes in a classroom. Our school partners needed clarification of the policy Millersville's candidates follow. The state policy for student teachers is that they may substitute for their cooperating teacher for only 10 days. Student teachers may not substitute in other classrooms and leave their assigned student teaching placement. If a student teacher is offered a substitute position that will not take away from the student teaching placement classroom, the student teacher will have to register with the substitute service the school district uses. This is a district human resources issue and is not handled by Millersville. Candidates in field placements can substitute in the schools if they are registered with the substitute service approved by the school district. Candidates may not accept a substitute position that interferes with their course work or course field experience requirements. Substituting does not replace the field experience time assigned by the course faculty.

Another example of a formal method for collecting feedback from partners is the survey data collected before Millersville's special education (SPED) department revised its program requirements in response to state law enacted in fall 2019. The state law changed EPP program offerings for SPED to certificate programs for P-12 Special Education, instead of approving certification for SPED in grades P-8 as a dual major with PreK-4 regular education and a SPED certification for grades 7-12. Our program faculty initiated a survey of our partners to gather feedback on offering one program for P-12 or offering a dual certification program for PreK-4 and SPED. Although the total number of respondents was not great, the data acquired in the survey questionnaire are useful and validate our partners' needs and opinions for our programs. The resulting document, A Qualitative Analysis of Practitioner Feedback, was shared with faculty and the respondents to the survey. The 17 practitioners who responded to the survey agree on three unifying areas of skill and expertise: the science of student behaviors; the specialized professional skills required to build learning environments for students with significant support needs (low-incidence disabilities); and the link between assessment, methods, and content. Three themes emerged that are helpful for the MU planning committee in the planning stage: 1. mentor teachers and leaders as faculty, 2. self-contained classrooms are on the rise, and 3. secondary certification is critical. The themes may serve as a measurement of ultimate quality.

B. 2.2 Excerpt from SSR to be clarified or confirmed (1) 2.2 states that, "Informal methods include receiving feedback from partners through faculty visits to schools, university supervisor discussions with COOPs, and PDS mentor teacher meetings with EDHS faculty. The Department of Field Services (DFS) orientation meetings and supervisor meetings provide feedback as well." Describe FORMAL meetings – PDS liaisons, the person in charge of PDS, formal meetings, and the feedback.

EPP Response: Please see the table below, which summarizes department meetings and DFS early field placements, including meetings with supervisors. Supervisors, whether temporary or full-time course faculty (see the chart delineating faculty status included in this section's response), supervise and assist candidates and also communicate with and gather feedback from partners.

Teacher candidates are enrolled in field experiences when they take Foundations Bloc courses, including EDFN 211 and EDFN 241. In EDFN 211 and 241, students complete a field experience in an urban setting.


An evaluation form is completed by the mentor teacher. The mentor teacher assesses the teacher candidate's performance in the areas of professional dispositions, instructional competencies, and career attitudes. The evaluation form is reviewed by both EDFN professors, who also reflect upon the teacher candidate's performance in their respective courses. Both EDFN professors must recommend the teacher candidate in order for the candidate to move forward in the program.

At the conclusion of the course, the instructor completes an evaluation that takes into consideration the teacher candidate's performance in these experiences. The instructor rates the teacher candidates in the following categories: 1) Communicates Professionally, 2) Demonstrates Professional Growth, 3) Demonstrates Professional Relationships, 4) Exhibits Attributes Suitable to the Profession, 5) Displays Responsible and Ethical Behavior, 6) Academic Readiness, 7) Field Evaluation: Professional Competencies, 8) Field Evaluation: Instructional Competencies, and 9) Field Evaluation: Professional Behaviors/Dispositions. In addition, the instructor indicates whether the teacher candidate is recommended for advanced professional studies (APS transition point: entry to clinical).

The instructor can also indicate whether there are any reservations. The supervisors bring the cooperating teacher and school administrator feedback back to faculty meetings for discussion and action. The field supervisor/PDS liaison visits involve communications with both the COOPs and the students. The attachment Dept Minutes, Field Eval, Student Sample contains minutes from department meetings with highlighted items that are directly related to receiving and acting on partner feedback.

The former Director of Field Services held 'chat and chew' follow-up sessions (no agenda, just free-flowing sharing and communication) at the end of the semester with faculty supervisors (student teaching) to discuss the semester. Field Services holds end-of-semester meetings with the program departments as well (transition point: exit from program). The attachment Supervisor Meetings contains the agendas and attendance for two years of supervisor meetings.

Supervisors in student teaching serve as liaisons between the partner school COOP and the program department chairs. Communications about issues with candidates and COOP concerns (including strengths and weaknesses of the COOP mentorship of candidates, candidates' issues, and concerns for training) come through the supervisor in the form of emails, supervisor meetings, written MU Danielson and state-required PDE 430 forms, Professional Development Plans (PDPs), and data entry in the PEU assessment system. For early field placements at the beginning of the candidates' program (transition point: entry into the program), supervisors are course faculty. For the evaluation of candidates see the attachment Dept Minutes, Field Eval, Student Sample.

PDS student teachers are supervised by temporary faculty and full-time faculty; please see the table with full-time or temporary faculty designations. Feedback is shared at PDS meetings and at the annual PDS meeting, which includes mentor teachers, faculty from school partners, MU faculty, the Department of Field Services Director, and the Associate Dean of the College. The professors require field evaluations for the courses; Foundations Block professors require COOPs to submit an evaluation of candidates' field work as part of the candidates' coursework. Please see the attachment titled Supervisor Meetings for agendas, attendance, and topics.

Feedback from full-time faculty is shared at department meetings, PDS annual meetings, and liaison interview-and-placement meetings at schools held by the Directors of the secondary PDS and EMEE PDS. COOPs/mentor teachers meet with the PDS directors and align candidates' interests with classrooms within the PDS partner schools. The folder titled PDS Meeting Minutes and the Department of Field Services supervisor meeting records provide meeting times, attendance, topics, and actions taken.

The chart below (following the EPP response to C.1) shows full-time and temporary faculty who serve as supervisors.

C. Question for EPP concerning additional evidence, data, and interviews

1. What is the evidence showing the partners are providing feedback to the EPP? EPP Response: Feedback from full-time faculty is shared at department meetings (please see the attachments titled Dept Minutes, Field Eval, Student Sample), at the annual PDS Partner and Faculty meeting (the attachment PDS Partner and Faculty includes the agenda and attendance for the annual meeting), and at liaison interview-and-placement meetings at schools held by the Directors of the secondary PDS and EMEE PDS. COOPs/mentor teachers meet with the PDS directors and align candidates' interests with classrooms within the PDS partner schools.

In response to meetings with partners during school visits, supervisor meetings, Dean meetings arranged with individual partners, events held at MU such as the Color of Teaching and the Conference on Education and Poverty (included in the EPP response to Standard 3 tasks), and PDS meetings, partners made it known that ethics training for our candidates would increase their knowledge of professional behaviors. Ethical practice is a research area in which one of our faculty was involved with the Pennsylvania State Department of Education (PDE). ACT 126 – Educator Ethics Training is available on the Department of Education's SAS Portal and must be completed prior to submitting candidates' Advanced Professional Studies applications. Partners were also pleased candidates were exposed to the PDE requirements for ethics training earlier in their programs. Another action taken through communication with partners occurred on March 29, 2019, when faculty from across the professional education unit gathered in Stayer 108 to establish inter-rater reliability on a professional behaviors rubric. The result of the meeting was confirmation of the inter-rater reliability among all groups using the assessment.

Supervisor Last Name | Program | Full Time / Temporary
Brown | EMEE | X
Green | EMEE | X
Holt | EMEE | X
Legutko | EMEE | X
Palmquist | EMEE | X
Ullmann | EMEE | X
Vaites | EMEE | X
Wytovich | EMEE | X
Long | EMEE (PDS) | X
Kinderwater-Carroll | EMEE (PDS) | X
Leslie Gates | Music (PDS) | X
Kerr | EDFN (PDS) | X
Dutton | EDFN (PDS) | X
Deemer | EDFN (PDS) | X
Daneker | EDFN | X
Brusic | EDFN Educ (PDS) | X


Those in attendance at the March 29, 2019 inter-rater reliability meeting, by department/affiliation:

Early, Middle, and Exceptional Education:

Jen Shettel, Marcia Nell, Aileen Hower, Kim Heilshorn, Ellen Long, Janet Josephson, Rich Mehrenberg, Bill Himele, Jason Davis, Deborah Tamakloe, Beth Powers, Suzanna Boyle.

Educational Foundations:

Miriam Witmer, Tim Mahoney, Ollie Dreon, Laurie Hanich, Sarah Dutton, Anne Carroll, Ojoma Edeh Herr, Sarah Brooks, Thomas Neuville, Tiffany Wright, Sandy Deemer.

Associate Dean: Marcia Bolton

Director of Field Services: Jessica Stevens

Mathematics: Michael Wismer

History: Victoria Khiterer

Biology: Dominique Didier

Foreign Language: Susanne Nimmrichter

Art & Design: Leslie Gates

Pequea Elementary, Penn Manor School District: Shirley Murray

Task: Reviewing Clinical Experiences

A. Evidence in need of verification or corroboration 1. 2.3 - evidence that EPP and partners collaborate to review clinical experiences

EPP Response: Feedback from full-time faculty is shared at department meetings, PDS annual meetings, and liaison interview-and-placement meetings at schools held by the Directors of the secondary PDS and EMEE PDS. COOPs/mentor teachers meet with the PDS directors and align candidates' interests with classrooms within the PDS partner schools. The folder titled PDS Meeting Minutes and the Department of Field Services supervisor meetings in the attachment Department Meetings and PDS Partner Meetings provide meeting times, attendance, topics, and actions taken.

B. Excerpt from SSR to be clarified or confirmed C. 2.3 states "MU works with partners to outline and review clinical experiences to ensure they provide experiences with depth, breadth, diversity, coherence, and duration to provide candidates with rich, meaningful experiences."

EPP Response: The 2019 Assessment Handbook, pgs. 34-36, provides evidence that our partners ensure experiences are rich, meaningful, and supportive. The Professional Development Plan is a document that shows how candidates, partners, and MU supervisors work together to develop a strong pre-service teacher. If a field-based partner or university supervisor identifies professionalism concerns in the field, and if either the field experience coordinator or the program thinks a formal review process is needed, then a formal process is initiated. If the candidate has been removed by the field partner and both the field experience coordinator and program leader agree that a formal process is not needed, then a new placement will be found as soon as possible. Please see the attachment Sample PDP Plans. Additional documents that show the working partnership between partners and MU are the field evaluation forms programs collect before admission to advanced professional studies courses. The form provides COOP input, in the form of a field assessment of skills and behaviors, to the faculty for review before the candidate can progress in their education program. If the candidate is struggling, the supervisors (faculty) and the COOP plan ways to support the candidate's growth. Please see the attachment Department Meetings Plus Evaluation Sample; the folder within the attachment titled Dept Minutes includes the Field Eval form and a Student Sample. Because of the built-in support system, very few candidates do not succeed in their application to Advanced Professional Studies (APS) course work (transition point: Entry into the Program; 2019 Assessment Handbook attachment, p. 15). More student samples are available at the on-site visit in Nov. 2020.

D. Question for EPP concerning additional evidence, data, and interviews 1. What is the process the EPP follows in order to review clinical experiences for candidates? Where is the evidence indicating the modification of the PEU assessment titled Teacher Work Sample (TWS)? Where is the input indicating that partners wanted the TWS changed, what changes were made, and what follow-up occurred?

EPP Response: At fall 2017 meetings designed to collect feedback about our PDS Program with teachers and administrators from our partner districts, concerns about the TWS were raised. In response, during the 2018-19 and 2019-20 school years, action research was piloted in the York Suburban School District with PDS interns as a replacement for the TWS. Based on the successes of this pilot, in January 2020 the EMEE Dept. approved a motion to replace the TWS with action research moving forward.

The change occurred due to partners' responses at this meeting:

Millersville PDS Program Feedback, 10.25.17
Input from: Kim Stoltz, Lorra Cummins, Denise Fuhrman, Gina Neiderer, Heather Hoover, Sara Bosco, Doreene Ridgeway, Lisa Amspacher, Kellie Aughenbaugh, Cheryl Johnson, Adreinne Myers, Amy Hare, Lexy Morrow

WHAT'S GOING WELL
• Like the interview process - can match philosophies.
• Like what they get to see - that it's right away, starting in the fall.
• Students who participate in Back to School Night - gives them respect from parents, etc.
• Relationship of Millersville with YS - ease of start-up. Get things set up prior to the start of the year. One professor to cover the building.
• Interviews are critical - must keep. (I wasn't aware that some interviewed but did not take anyone.)
• The students who come in before school and those first weeks are set up very well.
• Full-year experience.
• Middle level kids well trained in their content area.

WHAT SHOULD CHANGE
• Middle level kids not well prepared for anything out of their content area.
• Students need more opportunities to teach. Kids have lots of knowledge but not actual teaching.
• TWS seems to contradict the co-teaching model.
• TWS lends itself to science and social studies. District doesn't have much time to teach in those areas.
• TWS are very time consuming.
• TWS is not a realistic thing relative to how units and info are taught anymore; going back to old thematic units from our student teaching days.
• TWS is not practical.

The new TWS format that was piloted in the York Suburban School District is attached to this addendum, titled New TWS. Included in this attachment (scroll down to the bottom of the first page, titled York Suburban School District/Millersville University PDS Action Research Project) are details of the Action Research/Inquiry Project, including a timeline. This project is an excellent example of our candidates' use of research in their programs. The next step in the TWS process is for the Associate Dean to amend the assessment rubric in the data collection system. The PEU assessment system will reflect changes in the assessment and the sub-scores, and will be aligned with CAEP and InTASC standards for future data collection.

Task: Professional Development Plans

A. Evidence in need of verification or corroboration 1. 2.3. Prior Professional Development Plans 2. EPP Response: The plans can be verified on site during the Nov. 2020 lead site visit. The attachment Sample PDP Plans includes actual plans created and signed by all parties.

B. Excerpt from SSR to be clarified or confirmed 1. 2.3 states "The EPP states that through collaboration with partners, they identify candidates experiencing difficulties early in the field experience and report their concerns to the program chair, advisor, and Director of Field Services," yet no evidence of this is provided. Is there evidence of this occurring and/or professional development plans (PDPs) developed to assist the teacher candidate in establishing achievable goals for improved professional behaviors? EPP Response: The attachment Sample PDP Plans includes actual plans created and signed by all parties. The total PDP plans created to date can be verified on site during the Nov. 2020 lead site visit if such verification is required. For details regarding the development of the PDP plans, see the 2019 Assessment Handbook attachment, pp. 34-36. Taken from those pages: All candidates are developing as professionals and are expected to have specific needs for professional performance growth. Faculty should help candidates succeed and address most needs with informal mentoring and support. Formal review will take place in a meeting including relevant faculty selected by the program leader. In cases involving field experience, the person responsible for field placements for that program (typically the Field Experience Coordinator) must be invited to participate.

C. Question for EPP concerning additional evidence, data, and interviews 1. What is the process by which the EPP collaborates with partners to proactively help struggling students? EPP Response: The process (Attachment 2019 Assessment Handbook, pgs. 34-36) to proactively help struggling students begins with the COOP and the supervisor discussing the professional behaviors issue the candidate is experiencing. The faculty supervisor brings the concern to the department meeting.


The supervisor shares written communication, in the form of the field experience evaluation form or written observation notes from the COOP and the supervisor, at the department meeting. The purpose of this sharing is to collaborate with other faculty who may have had the candidate in their courses and to learn what their experiences were with the candidate. The Chair of the department arranges a meeting with the COOP and the supervisor to discuss the formative meeting where the goals and objectives are reviewed. The COOP, faculty (usually the Chair of the department), and supervisor arrive at the meeting with the candidate prepared to discuss the professional behaviors rubric and the observed candidate issues, but not with a decision about what will be written in the plan. The candidate and their advocate are invited to the PDP meeting, where all goals, objectives, and concerns are discussed in a proactive, formative way. The purpose of this meeting and the PDP is to allow the candidate to discuss the concerns brought by the COOP, supervisor, and department chair. The candidate and the advocate can ask for clarification of any aspect of the PDP. The entire group decides on the best course of action, and the PDP is completed and signed with dates for a re-assessment of goal and objective attainment. The PDP includes support steps the faculty, supervisor, and COOP will take to assist the candidate. Other faculty or tutoring services can be offered to the candidate as needed. A re-evaluation date is set, along with times and dates by which written evidence of the candidate's work toward achieving the goals and objectives is due. Feb. 3 department meeting example: Professional Behaviors. Dr. Josephson, in consultation with the COOP, raised a professional behaviors issue concerning a student in her SPED block course: his course assignments are poorly prepared, and he does not communicate or respond to emails. She has reached out to him three times via email and asked him verbally several times to meet with her. Dr. Josephson has initiated discussions with other faculty who have had him in the past, and they have not noticed anything extreme. Action: Dr. Mehrenberg will contact the candidate and set up a meeting with Dr. Josephson, the COOP, and the Associate Dean in attendance.

Preliminary recommendations for new AFIs, including a rationale

EPP Response: Please see Task: Cooperating Teacher Training, sections B.1 and C.

Minutes showing that the partners have reviewed the transition point requirements and agree with them are included in the attachment titled PDS Partner and Faculty Meetings, which provides this information for three years.

Date: 2/15/19
Attendees: Thomas Bell, Sarah Brooks, Sharon Brusic, Emily Davis, Sarah Dutton, Leslie Gates, Anne Kinderwater-Carroll, Ellen Long, Kim McCollum-Clark, Nakeiha Primus, Anne Stuart, Miriam Witmer
Topic: Developed Methods syllabus statement regarding passing the unit plan assessment: "You must pass the unit plan assignment for this course to proceed to student teaching. This policy is in place because you must demonstrate that you have the skills to engage in long-range instructional planning to proceed to student teaching. If you earn two or more unsatisfactory ratings or a grade lower than 70% on your unit plan, you have not passed the assignment."
Action/Results: Published statement in the unit plan assessment.

Date: 2/1/19
Attendees: Thomas Bell, Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer
Topic: PDS assessment data; review of EDSE 471 syllabi.
Action/Results: Reviewed data, no revisions made. Refined data submission by all supervisors and mentors.

Date: 10/26/18
Attendees: Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer
Topic: Have you reviewed all of the evaluations completed for your assigned interns? Any unsatisfactory ratings? Did the mentor agree to host next semester? Please discuss any concerns cited on the mid-semester evaluation with mentor and intern. Please make plans to observe each of your interns if you have not already.
Action/Results: Updated professionalism concerns in the Professional Behaviors Data Sheet on Google Drive. Professional Development Plans were formulated, and meetings held. PDPs posted in D2L.

Date: 10/31
Attendees: Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer
Topic: EDFN is proposing graduate cross-listed courses for EDSE 340 and SPED 346 for Fall 2018. How will we collaboratively assess interns' professional behaviors using the new rubric this semester?
Action/Results: Courses were cross-listed. Assessment data can be collated by the PDS director and distributed.

AFI: The EPP has little evidence to support that the EPP co-selects, prepares, evaluates, supports, and retains high-quality clinical educators. (2.2) For training for COOPs, please see the table above and the EPP response below.

EPP Response: Please see Task: COOPs Credential Accountability, Sections A, B, and C.

The selection of COOPs is regulated by the Pennsylvania State Department of Education (PDE). The regulation states: Candidates are assigned by Principals of the local education agency (LEA) to a cooperating teacher with appropriate professional educator certification (3 years satisfactory certified teaching experience on the appropriate certificate and 1 year certified experience in the specific placement) who is trained by the preparation program faculty (22 Pa. Code §354.25(f)).

(A.) The fall 2019 visits to our urban school partner Superintendents, the table of dates, attendees, and districts, and the process for verifying cooperating teacher credentials with the LEA district liaisons and Principals are presented in full in the Task: COOPs Credential Accountability response above.

EPP Response: Please see C. in this task. In C. are training agendas, attendance, topics covered, and a reference to the attachment PDS Partner and Faculty Meetings in this addendum, covering three years of meetings and an annual meeting with partners and COOPs/mentor teachers.

The Department of Field Services has had a change in Director in the last 18 months. The former job description was for a part-time temporary faculty position with only three weeks of job duties during the summer months. The College of Education rewrote the Director of Field Services job description, calling for a full-time staff member with a 12-month contract. The new Director started in October 2019, during the fall semester. The full-time Director is able to establish firm partnerships with school partners, communicate consistently with partners, and participate in the College of Education's shared governance by serving on the Teacher Education Council and chairing the Clinical Partners Committee, and works closely with the Associate Dean of the College to establish secure communications and liaison relationships with our partners, which includes offering training to cooperating teachers (COOPs). The transition in leadership for the Department of Field Services (DFS) interrupted continuous formal training for COOPs in the 2019-2020 academic year. The PDS schools have offered continued training and interaction to their school partners and administrators.

Standard 3

EPP Response is underlined; Attachments are highlighted

Task: Addressing high needs areas

A.1. Evidence in need of verification or corroboration The EPP states, "The PEU demonstrates efforts to know and address community, state, national, regional, or local needs for hard-to-staff schools and shortage fields such as STEM, English-language learning, and students with disabilities through collaborative field placements. Our P-12 partners are invited to PDS professional development meetings to offer insights on how to meet the needs of their districts."

AFI: Not all programs have provided a recruitment plan and available plans do not sufficiently address community, state, national, regional, or local needs for hard-to-staff schools and shortage fields such as STEM, ELL, and students with disabilities. (3.1)

EPP Response: We apologize for submitting data that was not disaggregated by program. Please see the attachment titled Program and Unit Recruiting Initiatives for all program data. The attachment titled Unit Recruitment Plan, p. 1, addresses our partnership with schools of high need in Lancaster, York, and Lebanon counties, and occasionally Berks and Chester counties. The College of Education and Human Services (EDHS) foundations courses (EDFN) and faculty partner with urban school districts in a collaborative way so that our MU teacher candidates experience an educational setting with intersectional identities in a densely populated place and provide a value-added experience to school communities and partnerships. The Race/Ethnicity dashboard data included in this addendum (Task: Addressing high needs areas, A.1) provide BSE degree data for our College, showing the diverse population of graduates who go on to work in high-needs schools. The Excel workbook tables provide data on where our candidates were employed, as reported in our Alumni Satisfaction and Job Placement data (available to the lead site visitor in Nov. 2020). The data table shows 82% of EMEE graduates stayed in PA, mostly in these school districts: Eastern Lancaster County School District, Eastern Lebanon County School District, and York. These are school districts with high needs due to poverty, lack of resources, and STEM needs that are greater than those of neighboring school districts such as the wealthy Penn Manor School District, as evidenced by the state District Report Card data (empty cells indicate no data): https://www.education.pa.gov/K-12/ESSA/ESSAReportCard/DataSources/Pages/default.aspx

District | 2-Year Regular Attendees (reported by subgroup: Indian/Alaskan Native, Asian, Black, Economically Disadvantaged, English Learner, Hawaiian/Pacific Islander, Hispanic, Student with Disabilities; empty cells indicate no data)
Lancaster: 94%, 91%, 96%, 89%, 85%
Lebanon: 95%, 95%, 94%, 95%, 94%
York: 86%, 85%, 89%, 89%, 80%
Penn Manor: 88%, 93%, 93%, 93%

In continuing efforts to meet the needs of the dynamic and diverse Pennsylvania community, the EPP and its partners collaborate with local schools and districts to address community, state, regional, and national needs, especially the needs of hard-to-staff schools and critical shortage fields (e.g., STEM, English-language learners, and exceptional student education).

Since 2012, all initial and advanced licensure programs have included the State of Pennsylvania mandated Accommodations and Adaptations for Diverse Learners in Inclusive Settings and the preparation of English Language Learners. All programs needed to be revised to include 9 credits or 270 hours of Special Education and 3 credits or 90 hours of ELL coursework while staying within state-mandated credit limits. As evidence that we meet state needs and provide data on the race/ethnic makeup of our college, the Office of Institutional Research dashboards provide a breakdown of the demographic makeup of our college, departments, managers, faculty, and our University. Initially, department chairs have access to the dashboards and share them with faculty. The College Council of the College of Education and Human Services and School of Social Work is made up of department chairs, the Dean of the College, and the Associate Dean. Data from the dashboards are used to review faculty complements, student enrollment, retention rates, and class data, and can be copied and shared throughout the University and with stakeholders for program improvement purposes. An example of sharing data with stakeholders would be to gather more ideas for recruitment focused on increasing the percentages of diverse groups of students. These data were shared in the College Council meeting on May 28, 2020, to develop recruitment plans for each department chair in attendance.

All teacher preparation and specialized programs have been revised to include the mandated Accommodations and Adaptations for Diverse Learners in Inclusive Settings and the preparation of English Language Learners. All programs include 9 credits or 270 hours of Special Education and 3 credits or 90 hours of ELL coursework while staying within state-mandated credit limits. Every candidate in initial programs, including Post Bacc certification programs, is required to take Foundations Bloc, consisting of the courses EDFN 211 and EDFN 241. These courses are taught in adjacent time periods to facilitate the major curricular emphasis on diverse student populations and to accommodate a four-week, eight-day field experience in an urban, multicultural, low socio-economic status (SES) elementary, middle, or secondary school with a high percentage of ELLs. The rubric for, and data on, candidates' performance in the diversity assignment in these courses for spring 2019 are provided. EDFN 211 was revised extensively, and two additional diversity-enriched courses were added to all initial programs to meet the requirements of state policy 49-2. Upper-level courses in each initial program also integrate elements of diversity, as evidenced in course syllabi, course assessments, and field and practicum experiences. The table below provides the spring 2020 placements. The placement assignment column shows the diversity of placements that were agreed upon with partners and Millersville's Department of Field Services (DFS).

Courses and sections | Placement assignment

Intro to Early Education (1); Sections: M/W .03; T/R .01, .02, .04 (Fall: M and T; Spring: W and R) | Special Ed K-12: IU, MS, HS, LS, ES, Resource Room, Behavior placements, IU Early Intervention

Orientation to SPED K-4, Urban; Course: ERCH 496; Sections: M/W .91; T/R .93 | Early Childhood Foundations: Pre-K Counts classroom; Head Start Program

Early Childhood Prof Block | Early Childhood Center

Early Childhood Prof Block II | K-4 Regular Education

Tchg Gifted Learners | Gifted Teacher

Gifted 675.50 course | Gifted Teacher

SPED Professional Bloc I (SPB I) (1) | IU12, IU13 classroom preferred; (1) PreK Counts, Head Start, Early Intervention

SPED Professional Bloc II (SPB II) Strand 1 | K-8, moderate to severe focus: autistic support, emotional support, life skills, or multiple disabilities; prefer IU12 and IU13; prefer not paired

MDLV Professional Block; Courses: MDLV 323, 486 | 4-6, Regular Ed; elementary school preferred

SECED Foundations Block (SFB) M/W; 2/10, 2/12, 2/19, 2/24, 2/26, 3/2, 3/4, 3/9 or 32 X | 4-8 for MDLV only; Urban

Secondary Ed Foundations; Courses: EDFN 545.50A & .50B, 590.01; 16-32 hrs total (16 if in 545 only or 590 only, 32 if in both 545 and 590) | 9-12, Urban

Post Bacc; Courses: EDFN 545.50A & .50B, 590.01; 16-32 hrs total (16 if in 545 only or 590 only, 32 if in both 545 and 590) | 9-12, Urban

These field placements were assigned through DFS, in coordination with partners at all levels and with the faculty for the courses. Demographic data for the school districts and childcare center partners are provided. Millersville has been intentional about field placements in meeting with our school partners. The School District of Lancaster is a long-standing partner with Millersville. In meetings between the superintendent and the College of Education, the Color of Teaching Program was implemented. Please see this website to view details: https://www.millersville.edu/edfoundations/coloroft/index.php. The table below reviews the PDS meetings and topics, which show there is collaboration and input from partners when choosing school placements for student teaching. Additionally, the array of liaisons is representative of all differing levels of school districts. For example, School District of Lancaster schools are vastly different within the same district, so PDS and field placements utilize all schools in the district. Hempfield School is .07% Black/Hispanic and is a very high resource school situated in a high resource and personal property tax rate zone. Conversely, a school located in Columbia Borough (just outside the Lancaster City limits) has a 6% Black/Hispanic percentage and low resource allocations due to low property taxes and poverty in the area.

PDS liaison assignments and partner school placements include: Valley High (Spanish, technology) and Huesken Middle (technology); Anne Stuart: Blue Ball Elem, Paradise Elem, Denver Elem, Valley View Elem, Mountville Elem; Nakeiha Primus: ASPIRA Academy, Marticville Middle; Susanne Nimmrichter: Penn Manor High; Kimberly McCollum-Clark: McCaskey High; Nanette Marcum-Dietrich: Conestoga Valley High (science + English); Ellen Long: Central York High School, Garden Spot Middle, Manor Middle; Anne Kinderwater-Carroll: Manheim Township Middle, Martic Elementary; Leslie Gates: Brownstown Elementary, Landisville Educational Center, Farmdale Elementary; Ojoma Edeh Herr: Central York Middle; Sarah Dutton: Manheim Township High, Donegal Intermediate; Sandy Deemer: King Elementary, Wharton Elementary; Dan Daneker: Huesken Middle (English, music, Spanish); Sharon Brusic: Lampeter-Strasburg High; Sarah Brooks: Hempfield High, Centerville Middle, Huesken Middle (science + social studies), CV High School.

PDS meeting record (Date; Attendees; Topic/Action; Results):

2/15/19. Attendees: Thomas Bell, Sarah Brooks, Sharon Brusic, Emily Davis, Sarah Dutton, Leslie Gates, Anne Kinderwater-Carroll, Ellen Long, Kim McCollum-Clark, Nakeiha Primus, Anne Stuart, Miriam Witmer. Topic/Action: Developed a Methods syllabus statement regarding passing the unit plan assessment: "You must pass the unit plan assignment for this course to proceed to student teaching. This policy is in place because you must demonstrate that you have the skills to engage in long-range instructional planning to proceed to student teaching. If you earn two or more unsatisfactory ratings or a grade lower than 70% on your unit plan, you have not passed the assignment." Results: Published the statement in the unit plan assessment.

2/1/19. Attendees: Thomas Bell, Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer. Topic/Action: PDS assessment data; review of EDSE 471 syllabi. Results: Reviewed data, no revisions made; refined data submission by all supervisors and mentors.

10/26/18. Attendees: Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer. Topic/Action: Have you reviewed all the evaluations completed for your assigned interns? Any unsatisfactory ratings? Did the mentor agree to host next semester? Please discuss any concerns cited on the mid-semester evaluation with the mentor and intern. Please make plans to observe each of your interns if you have not already. Results: Updated professionalism concerns in the Professional Behaviors Data Sheet on Google Drive; Professional Development Plans were formulated, and meetings held; PDPs posted in D2L.

10/31. Attendees: Sarah Brooks, Sharon Brusic, Emily Davis (Zoom), Sarah Dutton (Zoom), Leslie Gates, Anne Kinderwater-Carroll (NA), Ellen Long (NA), Kim McCollum-Clark (NA), Nakeiha Primus, Anne Stuart, Miriam Witmer. Topic/Action: EDFN is proposing graduate cross-listed courses for EDSE 340 and SPED 346 for Fall 2018; how will we collaboratively assess interns' professional behaviors using the new rubric this semester? Results: Courses were cross-listed; assessment data can be collated by the PDS director and distributed.

B. 1. Unable to find evidence of action that addresses knowledge of community, state, national, regional, or local needs for hard-to-staff schools and shortage fields such as STEM, English-language learning, and students with disabilities. EPP Response: The University monitors hard-to-staff and shortage fields as well as shortages by region, particularly Lancaster Public Schools, York, Lebanon, and rural districts. We receive Pennsylvania State Department of Education data annually as well as the federal notification in response to state data. We also receive consultant reports as engaged by the state. ESSA requires that each state track data and hold schools and LEAs accountable for the performance of the following student groups: economically disadvantaged students, children with disabilities, English learners, and students from major racial and ethnic groups (Asian, Black, Hispanic, Multi-Racial, NA or AK Native, Native HI or Other Pacific Islander, and White). The Associate Dean communicates regularly with the Director of Field Services regarding data reported through ESSA. Field placements are coordinated around this data with the chairs of departments. The ESSA data is published at this website for public consumption. https://www.education.pa.gov/K-12/ESSA/ESSAReportCard/DataSources/Pages/default.aspx

District | 2 years | Regular attendees

Student groups: American Indian/Alaskan Native; Asian; Black; Economically Disadvantaged; English Learner; Hawaiian/Pacific Islander; Hispanic; Students with Disabilities.

District values (as reported): Lancaster 94%, 91%, 96%, 89%, 85%; Lebanon 95%, 95%, 94%, 95%, 94%; York 86%, 85%, 89%, 89%, 80%; Penn Manor 88%, 93%, 93%, 93%.

An example of action by Millersville's College of Education that addresses knowledge of community, state, national, regional, or local needs for hard-to-staff schools and shortage fields such as STEM, English-language learning, and students with disabilities is the Conference on Education and Poverty, www.conferenceoneducationandpoverty.org. The purpose of the Conference on Education and Poverty is to share information, practices, policies, and research pertaining to working with students in poverty. The Lockey Lecture series recognizes and celebrates all forms of diversity. This year's lecture featured Dr. Kokka, who delivered her talk, "Use the Healing Force of Baby Yoda: Supporting Well-Being and Critical Consciousness through Healing-Informed Social Justice Mathematics." The attendance data show that many community members were present at this lecture, with the opportunity to interact with Millersville's President, Provost, Dean of Education, and Associate Dean of Education. Registration: 47 community members, 37 faculty and staff, 165 students; 249 total registered.

These activities provide opportunities to interact with the community on a very personal level. The needs and makeup of the community population are at the forefront when we are involved with community members at a local event or through a conference event. The themes of the events allow Millersville to interact with the demographics of whom we serve. Hosting a national-level conference or attending conferences for research allows the College of Education to grasp the important demographic groups. Advisory councils also provide the opportunity to interact with community members and keep in touch with the needs of hard-to-staff positions. The Tech Education advisory council meets annually; the last meeting was held on April 17, 2019. The Dean and Associate Dean of the College of Education and Human Services and the Dean of the College of Science and Technology attended, along with Technology Education faculty and students. Topics included: What kinds of issues and trends exist in K-12 schools that we need to be attentive to in teacher preparation (e.g., assessment, writing focus, makerspaces, coding/programming, digital citizenship)? How can we help future Technology & Engineering teachers to better know their roles in K-12 curriculum (e.g., as integrative curriculum team members)? Standards for Technological Literacy Update Project: current status, plans, and timeline. The Leadership advisory council is made up of superintendents and principals from our school partners as well as principal candidates from across the state. The meeting held Oct. 13 included the following members in attendance: Marty Hudacs, Tiffany Wright, Philip Gale, Jerry Egan, Cheryl Desmond, Mindy O'Brien, Ann Gaudino, Krysteena Koller (Graduate Assistant). Cheryl informed members about the Pro Public Education & Democracy group originating from Lancaster County and invited everyone to join the Facebook page and become a participant.

Standard 4

EPP Response: underlined Attachments are highlighted

Title: 4.1

A. Evidence in need of verification or corroboration

B. Excerpt from SSR to be clarified or confirmed

C. Questions for EPP concerning additional evidence, data, and/or interviews

(1) How are the state's value-added data analyzed and interpreted in order to improve programs?

What did the value-added data show in your analysis, and, with only one year of data, how will you follow trends in the weak areas?

EPP Response: State value-added data has only been collected for one year, but Millersville utilized the data in conjunction with other assessment data for our programs to identify areas in which our graduates, now professional teachers, are not meeting the growth value index for the state. We are planning to complete case studies and Teacher Work Samples with new teachers in order to gather more "on the ground" data by which we can evaluate our graduates.

The Pennsylvania Value-Added Assessment System (PVAAS) provides growth and achievement data. Utilizing all the data available (growth and achievement), educators can make data-informed instructional decisions to ensure the academic growth and achievement of all students. For EPPs, the PVAAS data indicate the proficiency of our graduates to make an impact on student growth and achievement. We use the data to increase our knowledge of weaknesses in our graduates' preparation and as a measure to review our programs. A positive value for growth (unadjusted or adjusted for poverty) suggests that, on average, recent completers in the assessed area have a higher teacher-specific growth index than the Standard for Pennsylvania Academic Growth, which is zero. A negative value for growth (unadjusted or adjusted for poverty) suggests that, on average, recent completers in the assessed area have a lower teacher-specific growth index than the Standard for Pennsylvania Academic Growth. Areas of concern and room for growth: Our Middle Level Program (MDLV), grades 4-8, showed a negative value for growth both unadjusted and adjusted for poverty. This value is very concerning but not a surprise for our Middle Level Program. The middle years faculty have been meeting regularly within their department and with school partners to strengthen our program offerings. A major issue is that faculty are few and spread thinly. The content of the middle years is not taught in our College, and faculty see this as a major issue. Pass rate scores in the middle years program reflect that our candidates are not receiving enough content instruction and are lacking in content application. PVAAS values confirm what Title II pass rates show. This low-enrolled program is another concern, so recruitment efforts have ramped up for the 2020-21 school year. Middle School faculty attended all recruitment events held in May and will continue to attend the open houses with materials suited for recruitment. The Dean is involved in talks with MDLV faculty about how to best utilize their expertise in low-enrolled courses in this time of financial troubles due to the pandemic.
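The interpretation of these growth values can be illustrated with a minimal sketch in Python; the index values and area labels below are hypothetical placeholders, not actual PVAAS results, and are included only to show how an average completer growth index is compared against the standard of zero.

# Minimal sketch (hypothetical values): compare the average teacher-specific
# growth index of recent completers, by assessed area, to the PVAAS standard.
from statistics import mean

PA_GROWTH_STANDARD = 0.0  # Standard for Pennsylvania Academic Growth

# Hypothetical completer-level growth index values by assessed area.
growth_indexes = {
    "MDLV grades 4-8 (hypothetical)": [-1.2, -0.4, -0.8, 0.1],
    "Writing (hypothetical)": [-0.6, -0.3, 0.2, -0.5],
}

for area, values in growth_indexes.items():
    avg = mean(values)
    status = "above" if avg > PA_GROWTH_STANDARD else "at or below"
    print(f"{area}: average growth index {avg:+.2f}, {status} the standard of zero")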

MDLV faculty use the PVAAS data to support the request for additional faculty, to ramp up recruitment efforts, to support the case that more content needs to be taught by our faculty (experts in the field for MDLV content), and to support the re-organization of the program requirements and courses. The middle years faculty, armed with PVAAS data, test scores, enrollment, and comparison data, surveyed MDLV partners and shared candidate data. A half-day meeting with faculty, MDLV partner school COOPs and administrators, and MU administration and faculty had been arranged when COVID-19 shut the schools down. The faculty and partners will resume the discussion and gathering of input to share at a meeting when the crisis is over.

Another example of the review and use of PVAAS data impacting program improvement is the negative value for writing growth. Again, this is concerning but not surprising. These data confirm that our candidates are not strong performers on the state-required Basic Skills (CORE) writing test. CORE writing pass rates:

                 2017-2018        2016-2017        2015-2016
                 MU     State     MU     State     MU     State
Overall          83%    74%       86%    79%       81%    78%
CORE - Math      93%    89%       96%    92%       88%    89%
CORE - Reading   97%    96%       100%   95%       96%    95%
CORE - Writing   86%    83%       88%    83%       68%    85%

Our programs have begun highlighting writing skills within course instruction. A University tutoring system focused on writing has just been implemented. In addition to a focus on developing writing skills within course work, our faculty will make use of referrals to this system. The CORE basic skills pass rates were shared in the Assessment Committee meetings that occurred in 2019-20 (see attachment Assessment Committee Meetings and Data Days and Dept Minutes, Field Eval, Student Sample) so all departments could be aware this area was a weakness for candidates. The writing (W) designation for course work was reviewed to be sure enough writing-rich courses are included in our candidates' programs. This is an area that will continue to be a focus in the 2020-21 academic year.

Title: 4.2

A. Evidence in need of verification or corroboration

B. Excerpt from SSR to be clarified or confirmed

C. Questions for EPP concerning additional evidence, data, and/or interviews

(1) What observation and/or student survey assessments are used to measure the application of professional knowledge, skills, and dispositions corresponding with teaching effectiveness and/or P-12 student learning of completers?

Stipulation: The EPP provided no evidence of research-based observation data, and student surveys are not available for completers (4.2). Data provided describe results for clinical experiences.

EPP Response: Informal observations occur for undergraduate initial licensure candidates. We are in the process of developing a systematic way of observing our graduates in their first through third year of employment. Please see the phase-in plan for ADV programs in A.4.3. Potential plan for initial programs: September-November 2020, develop a rubric for observation within the Assessment Committee; select and begin pilot case studies with candidates (PDS partners for initial programs); choose faculty to participate in the case study and observation evaluation. November 2020: share all progress (current or planned) with site team reviewers. January 2021: review pilot case studies and observation evaluation and modify the process with site partners, graduates, and program faculty and coordinators.

PDS liaisons often observe and interact with graduates now teaching in PDS partner schools, and we are working on a way to capture or survey these graduates' comments and skills through case studies or completion of a teacher work sample. We are also investigating whether teacher evaluation data from the school districts could be accessed by Millersville. Millersville's liaison at the Pennsylvania Department of Education (PDE) reported that the state does not collect teacher evaluation data. The liaison is willing to speak with the lead site visitor during the site visit in Nov. 2020 to answer any questions. School districts collect teacher evaluation data but will not share the data due to CBA union contract stipulations.

Graduate programs and Post Bacc certification programs develop their own departmental surveys. Usually the survey is created through the Qualtrics survey tool.

Program | # of Respondents | Total Surveyed/% Returned | Weakness | Strength

Principal | 15 | 46/46% | Weakness: utilize available community resources to meet the needs of students. | Strength: "treat all students equitably and take their differences into account in professional practice," "respect and value diversity including the unique backgrounds, abilities and interests of all students," and "promote the success of all students by acting fairly, with integrity and in an ethical manner."

Sch PSY | 20 | 3/14% | Weakness: curriculum and operation of schools. | Strength: field supervision (practicum) and field supervision (internship).

Nurse | 37 | 19/36% | Weakness: technology in the specialty. | Strength: work with diverse students with special health needs; work with English Language Learners.

Counselor | (not reported) | (not reported) | Weakness: collaboration and advocacy. | Strength: students are developing sensitivity as practicing professionals.

Rdg Spec | 15 | 9/14% | Weakness: utilize available community resources to meet the needs of students; work collaboratively with families and community members. | Strength: promote the success of all students by acting fairly, with integrity, and in an ethical manner; demonstrate competence in my chosen content area; respect and value diversity including the unique backgrounds, abilities, and interests of all students.

(2) How does the EPP ensure that the observation and/or student survey assessments include a representative sample or a purposive sample over time?

EPP Response: Ensuring the surveys include a representative or purposive sample depends on reviewing the survey regularly before launching. Purposive sampling, by definition, is useful when a researcher is looking to investigate a phenomenon or trend as it compares to what is considered typical or average for members of a population. Survey recipients are taken from course rosters.

Advisory councils, faculty, and the Office of Institutional Research are often sought out to review the surveys to be sure the questions are not leading and address a need for information. Qualtrics is the primary tool for launching surveys. Qualtrics reviews the quality of each question and offers edits to ensure questions are phrased correctly to get the data desired. For example, in Nursing the survey showed technology use was a weakness. This weakness led to a review of the various programs being used by school districts to see if the Department of Nursing could incorporate a technology tool used by the schools (e.g., iPads) for candidates to gain some experience prior to the practicum. This action strengthened the surveys' purposive sampling, as surveyed candidates were placed in the school districts under review, helping ensure the sample is representative of the information needed.

(3) What conclusions result from valid interpretations of the data?

Alumni Job Satisfaction and Job Placement ("AJPS") disaggregates data by the programs that respond to the survey. The initial teaching license programs for candidates seeking certification to teach in grades 7-12 are disaggregated, but the survey results include the respondent's evaluation of the department that houses the major. Early, Middle, and Exceptional Education (EMEE) has the largest number of candidates in the initial programs for teacher certification in the College of Education. The survey for members of the College of Education has supplemental questions added that focus on teacher preparation. Conclusions: The post-graduate survey can lend some insight into graduates of our programs in different school settings. A theme across all weak areas in the advanced programs survey is working with the community and advocating for the profession. Identification of these two areas of weakness has caused changes in the supervision and evaluation of the practicum. The Memorandum of Understanding (MOU) documents that outline the interaction of our practicum students have been reviewed more closely. Specific activities expected of our students have been added to the MOU. Placements are being closely scrutinized to meet the needs of our students and the P-12 students where they are placed. Self-placement of candidates seems to add to interacting with the community because the candidate is familiar with the site. The Principal program has worked hard to closely align its program assessments with the ELCC standards. The Principal SPA program report was not recognized for the last submission. Since this time, the leadership faculty have worked closely with the Associate Dean and advisory council partners to make changes in the program. For example, the practicum now includes a principal/supervisor doing an observation simultaneously.

The School Nurse survey indicated a weakness in utilizing technology. There are many electronic record packages, and each school district has a different one. This makes it extremely difficult to teach one type when it might not be the one used by a given school district. In the past two years this has not been as much of a problem, since many candidates have some experience prior to the course, NURS 560, if they have substituted in a district or worked as a school health assistant. The basic understanding of electronic records is reviewed during the legal aspects of school nursing (NURS 560), but the specifics of how each program works are not possible to recreate in the classroom. Candidates are encouraged in the practicum to gain as much experience as possible in the setting with the CSN Site Supervisor. The strength of the nursing program shown in the survey is working with diverse learners and English Language Learners. This is a direct result of the State Department of Education requiring all candidates for licensure to take course work in working with English Language Learners (ELL courses) and Diverse Learners (SPED courses).

Title: 4.3

A. Evidence in need of verification or corroboration

B. Excerpt from SSR to be clarified or confirmed

C. Questions for EPP concerning additional evidence, data, and/or interviews

(1) After a thorough analysis and interpretation of results are conducted, how will these results be applied to program improvement?

(2) What were the response rates for these surveys?

(3) How do programs compare when data are disaggregated?

AFI: The EPP provided limited evidence of employer satisfaction with completer preparation. (4.3) Though number completed is provided, response rate is not identified, nor are data disaggregated by program.

EPP Response: Two post graduate surveys are launched. The details of each survey and analysis of each survey follows.

University Level: Alumni Job Placement Survey or “AJPS”: Given the voluntary, self-selected nature of the survey, we are aware of the systematic threats of “volunteer bias” and “social desirability bias” in the composition of our sample. Therefore, we do the following:

• Use multiple methods: online survey, phone survey, passive commercial social-media scan.
• Individual online invitations sent from department chairs rather than a "faceless" bureaucrat.
• Multiple, staggered invitation times and days of the week.
• Phone callers use randomized patterns and a "neutral" script.
• Multiple sources for invitation email addresses, including the respondent's own contact details supplied to the Alumni Association and the Senior Exit Survey at graduation.
• Multiple sources ("triangulation") for key questions (such as employer, industry employed, and continuing education institution).
• Wording of sensitive questions, such as income from employment, so that responses are collected in categories rather than "exact" amounts, which results in more representative responses.

We have achieved a response rate of up to 55% by these methods, which is better than our PASSHE peers. Comparison of demographic characteristics (e.g., race/ethnicity, full-part time status, first-time versus transfer admittance) between the population (i.e., the full list of graduates) and the sample respondents shows the two groups to be comparable, except on gender: males in the sample are under-represented compared to their proportion in the population.

EPP Response: Analysis of results. The College of Education Employer Survey is not disaggregated by program. Beginning in 2019, employer surveys are sent out every spring semester to the previous academic year's completers. The employment list, including the principal's email address and the completer(s) they employ, is downloaded from the Pennsylvania Department of Education (PDE) website. Millersville did not have an assessment coordinator in the Associate Dean position until 2018, when the current Associate Dean position was created. The Employer Survey launched in spring 2019 is not disaggregated by program; an adjustment to the survey launched for spring 2020 will make disaggregated responses by program possible. 156 principals were initially invited to take the Employer Survey in spring 2019; PDE supplied the emails for principals at schools where our Millersville graduates were hired, and the final audience size was 164 principals. 30 surveys were started and 30 were completed, a return rate of 22%.
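Once the spring 2020 adjustment is in place, the response-rate calculation and the per-program disaggregation can be produced directly from the survey export; a minimal sketch in Python follows, with hypothetical counts and program labels (not the spring 2019 figures above).

# Minimal sketch (hypothetical data): compute an employer-survey response rate
# and disaggregate completed responses by the completer's program.
from collections import Counter

invited = 150  # hypothetical audience size
completed_programs = [  # hypothetical program label attached to each completed survey
    "EMEE", "EMEE", "Art", "Biology", "EMEE", "Math", "EMEE", "English",
]

response_rate = len(completed_programs) / invited
print(f"Completed: {len(completed_programs)} of {invited} invited ({response_rate:.1%})")

for program, count in Counter(completed_programs).most_common():
    print(f"{program}: {count} completed surveys")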

There was a 9% rate of "Neither Agree nor Disagree" responses to questions. These responses do not supply any useful information for our questions. With a purposive sample (principals of novice teachers only), we expected this option to be selected rarely. The next time the survey is launched, this response option will be deleted.

Additionally, there is no room on the survey for comments. The initial assumption was that principals are busy and would not take the time to make comments, but the lack of a comment field limited our results for program improvement analysis. The next survey launched will have room for comments.

Even though the survey from spring 2019 was not disaggregated by program, some implications for program improvement are evident. Please see the table from the College of Education Employer Survey:

Survey Item | Responders/% of response | Agreed or Strongly Agreed | Neither Agree nor Disagree | Implications for change

Survey item: Millersville's graduates/teachers facilitate student learning of content through the integration of technology.
Responses: 16/95%; 84% agreed or strongly agreed; 15.79% (3 responders) neither agreed nor disagreed. While this percentage represents only 3 responders in a low total N, it is worth considering.
Implications for change: This weakness in our preparation has shown up consistently in surveys. This response has prompted changes in course requirements to use technology in courses, including remote instruction, reflective lesson planning, use of data for student progress, and other skills learned through the COVID-19 shutdown in the field placements. Additionally, the shutdown of face-to-face instruction led us to use what we learned to launch a Virtual Conference providing current in-service teachers a venue to learn more about online teaching. The conference will take place June 30, 2020. We will have details and data to share during the site visit.

Survey item: Millersville's graduates/teachers present the content to students in challenging, clear ways.
Responses: 13/68%; 16/84% agreed or strongly agreed; 11% (2 responders) neither agreed nor disagreed. While this percentage represents only 2 responders in a low total N, it is worth considering.
Implications for change: Graduates are not applying differing instructional strategies for all students. The implication for the programs is to review planning data from the clinical experiences and to require differing instructional strategies (linked to integration and use of technologies) in all course requirements.

Survey item: When planning instruction, Millersville's graduates/teachers consider the school, family, and community contexts.
Responses: 15/79%; 15/79% agreed or strongly agreed; 21% (4 responders) neither agreed nor disagreed. While this percentage represents only 4 responders in a low total N, it is worth considering.
Implications for change: Community relations must be stressed in all clinical practice and course work. Even though students are usually not "allowed" to communicate with community families, instruction in all program courses must stress this skill. School PSY has strong survey responses for its program graduates in this area. It is proposed that the School PSY program share its strategies for teaching candidates this skill so graduates can practice it in their classrooms after graduation.

Survey item: Millersville's graduates/teachers assess student learning to gauge their instructional impact on the P-12 learner.
Responses: 15/79%; 15/79% agreed or strongly agreed; 21% (4 responders) neither agreed nor disagreed. While this percentage represents only 4 responders in a low total N, it is worth considering.
Implications for change: Reflective practices need to be strengthened so novice teachers understand how to use data to gauge impact on their students. This weakness also appeared in the PVAAS data from the state. PVAAS data from last year brought this to the attention of the Assessment Committee and led to the unit assessment review to determine the strength of our assessments. Action was stalled amid the COVID-19 shutdown.

The University launched the Alumni Job Satisfaction and Job Placement survey ("AJPS"), which we have administered annually in the fall semester since 2013, asking questions of the previous year's graduates. We added supplemental questions for education department graduates to the survey in Fall 2018, and it has gone out twice, to 2017-18 and 2018-19 graduates. The 2017-18 reports were completed last year and can be accessed by the lead site visitor in Nov. 2020. The 2018-19 results were not back in time for the submission of this addendum. The AJPS is disaggregated by program response. Our Early, Middle, and Exceptional Education program (EMEE, College of Education) has data. The programs Education shares with other colleges for content preparation (content course work in the College of Arts, Humanities, and Social Sciences and the College of Science and Technology for grades 7-12 content areas in Art, English, Biology, Chemistry, and Math) are included in the 2017-18 results. The complete data for the AJPS can be found at https://www.millersville.edu/iea/assessment/alumnijobplacement/alumni-satisfaction-and-job-placement.php (if the site asks for a password, it is "jsam95").

Overall, results from the Alumni Satisfaction Survey are generally positive regarding the alumni's perception of their preparedness based on the EPP's program and shared programs. Specifically, satisfaction with the alumni's overall Millersville experience increased steadily from 59% in 2012-13 to 100% in 2014-15.

Program | Quality Rating | # Respondents | Percentage

EMEE: Poor 1 (1.4%); Fair 1 (1.4%); Good 33 (44.6%); Excellent 39 (52.7%); Total 74 (100.0%)
Biology: Poor 0 (0.0%); Fair 2 (9.1%); Good 14 (63.6%); Excellent 6 (27.3%); Total 22 (100.0%)
History (SS): Poor 0 (0.0%); Fair 2 (11.1%); Good 8 (44.4%); Excellent 8 (44.4%); Total 18 (100.0%)
Chemistry: Poor 0 (0.0%); Fair 0 (0.0%); Good 1 (6.7%); Excellent 14 (93.3%); Total 15 (100.0%)
Math: Poor 0 (0.0%); Fair 1 (4.2%); Good 8 (33.3%); Excellent 15 (62.5%); Total 24 (100.0%)

Upon completion of the program, graduates reported high overall satisfaction and high levels of satisfaction within the areas of Quality of Instruction, Courses in Area of Specialization, and Clinical Experiences. The percentage of candidates indicating they feel their preparation is excellent or good is high for every content area.

Weaknesses are evident in Math, where 8 candidates (33%) rated the program quality only as good. This rating might have a lot to do with competence in the Math curriculum. The student test scores in Math are not high for education students who would be included in this department rating. Many Math education students drop out of the program due to difficulty passing courses with the required grade above C.

The EMEE program alumni had a high rate of feeling not sufficiently prepared to use technology effectively to collect data and improve teaching and learning: 58% of respondents indicated they were not prepared in the utilization of technology in any aspect of their teaching. This response has prompted changes in course requirements to use technology in courses, including remote instruction, reflective lesson planning, use of data for student progress, and other skills learned through the COVID-19 shutdown in the field placements. Additionally, the shutdown of face-to-face instruction led us to use what we learned to launch a Virtual Conference providing current in-service teachers a venue to learn more about online teaching. The conference will take place June 30, 2020. We will have details and data to share during the site visit.

The COVID-19 experience provided an excellent venue for learning about modeling and delivering online instruction and pinpointed a weakness in our graduates. Another result of the extensive remote instructional model used during the pandemic is increased training for our faculty. 80% of our College of Education faculty took part in an extended spring break week to pursue training provided by the University IT department on the use of technology resources for every aspect of delivering online instruction. Examples included use of the D2L learning platform and use of technology resources outside of the university, such as the National Board approved tool Atlas. Student teachers who have now graduated will benefit from the abundance of online planning practice they received in spring 2020. We expect our alumni survey to reflect a decrease in the percentage of novice teachers who think they are not prepared to use technology for instruction, data-driven decisions, and data collection.

Title: 4.4

A. Evidence in need of verification or corroboration

B. Excerpt from SSR to be clarified or confirmed

C. Questions for EPP concerning additional evidence, data, and/or interviews

(1) What are the response rates for these surveys?

(2) When the data are disaggregated, what efforts can be made for individual program improvement?

AFI: The EPP provided limited evidence of completer satisfaction. (4.4) Response rates and data from the completer satisfaction survey are not disaggregated by program.

4. Preliminary recommendations for new stipulations including a rationale for each

EPP Response:

When survey data are disaggregated, some efforts can be made for individual program improvement. For example, if an employer notes that a new teacher's plans are not complete and do not contain long- and short-term goals and objectives, Millersville could invite former students, now teachers, to attend online workshops (via Zoom or WebEx) on long- and short-term planning.

The COVID pandemic offered our faculty an opportunity to receive extensive technology training in the delivery of instruction, including assessment data collection, making data-driven decisions, and planning.

COVID-19 prompted an immediate response from Millersville's faculty to address adequate online teaching for P-12 students. During an extended spring break, 80% of our faculty participated in training for remote instruction delivery. The IT department made individual appointments so each faculty member could be offered training unique to their needs. When employer data are disaggregated by program, we expect our Employer Survey to reflect an increase in the percentage of novice teachers who are prepared to use technology for instruction, data-driven decisions, and data collection. We also expect employers to feel Millersville novice teachers are more confident and able to handle diverse learning conditions effectively. Using programs like Atlas, our graduates were able to learn to adjust to extreme teaching conditions and to differentiate learning through effective long- and short-term planning. The COOPs and supervisors reported working with our candidates remotely on many planning and reflective practice activities. A quote from a student teacher that will assist this novice teacher with the first year and beyond: "I have assisted my host teacher, along with other members on the 5th grade team, by providing meaningful feedback to students, assisting students and staff members with technology complications, maintaining the positive relationship through online communication, tracking online participation in all subjects and creating enrichment and review activities for a wide range of learners." (https://blogs.millersville.edu/news/2020/04/27/millersville-student-teachers-get-creative-with-online-learning/)

Millersville is thinking proactively about novice and experienced teachers in the field by offering a Virtual Conference titled Supporting Learners in an Online Environment. The virtual conference is in response to partner feedback identifying an area of need for in-service and novice teachers. The conference will offer 7 strands presided over by MU faculty, school partners, COOP/mentor teachers, and school administrators. This kind of conference offers graduates of our program continued technology services and professional development. The lack of online teaching experience is an area the Employer Survey could identify by program.

Satisfaction of Completers 4.4 The provider demonstrates, using measures that result in valid and reliable data, that program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.

EPP Response: The Vice President for Institutional Research has this response to measures that result in valid and reliable data from the Alumni Satisfaction and Job Placement survey. Some of the "classical" issues of reliability and validity do not apply to the Alumni Satisfaction and Job Placement annual update reports and data collection. For the most part, we are not trying to understand or characterize abstract concepts or traits; we are merely trying to describe concrete behaviors (e.g., employment, continuing education). Millersville is not conducting statistical tests of multiple-test or multiple-form reliability, internal consistency, or other aspects of measurement theory. The closest thing to an abstraction that we measure is the respondent's self-reported satisfaction with their major program and with Millersville overall, which we take at face value.

The most significant threat to validity in the survey context, which we do take very seriously, is sampling bias. Our response to this is two-fold: we make a concerted attempt to maximize the response rate, and we adhere to as many best-practice procedures as we can manage, in order to make our sample as representative of the full population of graduates as possible. Given the voluntary, self-selected nature of the survey, we are aware of the systematic threats of “volunteer bias” and “social desirability bias” in the composition of our sample. Therefore, we do the following:

• Use multiple methods: online survey, phone survey, passive commercial social-media scan.
• Individual online invitations sent from department chairs rather than a "faceless" bureaucrat.
• Multiple, staggered invitation times and days of the week.
• Phone callers use randomized patterns and a "neutral" script.
• Multiple sources for invitation email addresses, including the respondent's own contact details supplied to the Alumni Association and the Senior Exit Survey at graduation.
• Multiple sources ("triangulation") for key questions (such as employer, industry employed, and continuing education institution).
• Wording of sensitive questions, such as income from employment, so that responses are collected in categories rather than "exact" amounts, which results in more representative responses.

We have achieved a response rate of up to 55% by these methods, which is better than our PASSHE peers. Comparison of demographic characteristics (e.g., race/ethnicity, full-part time status, first-time versus transfer admittance) between the population (i.e., the full list of graduates) and the sample respondents shows the two groups to be comparable, except on gender: males in the sample are under-represented compared to their proportion in the population.
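A comparison of this kind between the full population of graduates and the survey respondents can be run as a simple proportion check; a minimal sketch in Python follows, with hypothetical group shares (not Millersville's actual figures).

# Minimal sketch (hypothetical proportions): flag respondent groups that appear
# under-represented relative to the full population of graduates.
population_shares = {"female": 0.70, "male": 0.30, "transfer": 0.20}  # hypothetical
sample_shares     = {"female": 0.78, "male": 0.22, "transfer": 0.19}  # hypothetical

TOLERANCE = 0.05  # flag gaps larger than 5 percentage points

for group, pop in population_shares.items():
    samp = sample_shares.get(group, 0.0)
    note = " <- under-represented in sample" if samp - pop < -TOLERANCE else ""
    print(f"{group}: population {pop:.0%}, sample {samp:.0%}{note}")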

Alumni satisfaction and job placement annual updates of recent graduates by year and program are in the attachment titled Alumni Job Satisfaction and Job Placement. This report is prepared annually by the Office of Research and Assessment for the Council of Trustees. It summarizes graduation rates, average student loan debt, alumni location, bachelor's degree recipient goals at Commencement, job placement rates of bachelor's degree recipients six to ten months out, and overall satisfaction with quality of education. Some information in this report is made available at the department and school level, pending response rates. The reports included in the attachment for this addendum are for programs that have teacher candidates. The lead site visitor can verify all survey data on the website during the Nov. 2020 visit. The Excel workbook tables in the attachment Alumni Satisfaction and Job Placement (tab titled Employment) provide data on where our candidates were employed. For example, the table shows 82% of EMEE graduates were employed throughout the state of PA. Eastern Lancaster County School District and Eastern Lebanon County School District are districts with high needs due to poverty, lack of resources, and STEM needs that are greater than those of neighboring school districts such as the wealthy Penn Manor School District. In 2018-19, 85% of baccalaureate degree recipients rated the quality of their education experience in their major as good or excellent, and 96% were employed six to ten months after graduation (attachment Alumni Job Satisfaction and Job Placement, within attachment see Annual Job Placement and Satisfaction 18-19). In 2017-18, 95% were employed six to ten months after graduation and 92% rated the quality of education in the major good or excellent.

The table below shares the satisfaction of alumni regarding their job preparation for programs that had responders. The data show that EMEE majors, whose total enrollment of 381 is the largest of our programs, find jobs across Pennsylvania (PA) at a high percentage. The list of schools at which they are employed varies across all district needs. The Biology data indicate that students who study Biology for teacher certification have less of a chance of getting a job in PA and in their major. These data correspond with the data collected through EPP-created assessments for student teaching: the program has very low enrollment, and not many who start out in the certification program complete the program for teaching. On the MU Danielson, only 3 candidates were enrolled and finished in Biology for the three cycles of data reported. The same is true for Chemistry certification seekers, where no candidate finished the initial program for the 3 cycles of data collected. EMEE = Early, Middle, and Exceptional Education Program

Program | Year | N = Total Participants | Employment related to Major | Total from program employed in PA | Satisfied with Preparation (Strongly Agree and Agree replies)

EMEE      | 16-17 | 238 | 98% | 82% | 93%
EMEE      | 17-18 | 258 | 97% | 84% | 89%
Art       | 16-17 |  58 | 75% | 88% | 83%
Art       | 17-18 | 251 | 99% | 83% | 84%
Biology   | 16-17 | 142 | 20% | 72% | 88%
Biology   | 17-18 | 165 | 85% | 62% | 88%
Chemistry | 16-17 |  65 | 81% | 77% | 98%
Chemistry | 17-18 |  41 | 79% | 63% | 100%
English   | 16-17 |  81 | 42% | 82% | 63%
English   | 17-18 |  58 | 82% | 71% | 65%
History   | 16-17 |  59 | 59% | 79% | 86%
History   | 17-18 |  52 | 50% | 78% | 88%
Math      | 16-17 |  47 | 82% | 68% | 94%
Math      | 17-18 |  64 | 86% | 70% | 96%
Music     | 16-17 |  27 | 80% | 92% | 87%
Music     | 17-18 |  67 | 74% | 77% | 92%


Standard 5

EPP response underlined Attachments highlighted

5.1 A. 1 Examples of Annual SLO reports:

EPP Response: Selected SLO reports are included as an attachment, President's Annual Report and SLO Additional Reports. Additional SLO reports will be available for the lead site visitor in Nov. 2020.

5.1 A. 2 Demo of Dashboards

EPP Response: Millersville will be proud to share the dashboards during the site visit in Nov. 2020.

B. 1. The SSR states that data collected through the online assessment systems can be disaggregated by candidate major and by semester. For what other dimensions can the system disaggregate the data (e.g. over time, by race/ethnicity, gender, etc.)?

EPP Response: Data can be collected over time by course. Other demographics are not collected through the PEU assessment system. Race/ethnicity, gender, and retention rates are collected through the institutional dashboards via the Banner system. These data are available to all departments and programs.
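A minimal sketch in Python of how PEU assessment records could be disaggregated over time by course and semester follows; the field layout and rows are hypothetical placeholders, not an actual PEU export.

# Minimal sketch (hypothetical export rows): disaggregate PEU assessment scores
# by course and semester to examine trends over time.
from collections import defaultdict
from statistics import mean

records = [  # (course, semester, rubric score) -- hypothetical values
    ("EDFN 211", "Fall 2018", 2), ("EDFN 211", "Fall 2018", 3),
    ("EDFN 211", "Spring 2019", 3), ("EDFN 241", "Spring 2019", 2),
    ("EDFN 241", "Spring 2019", 3),
]

scores_by_group = defaultdict(list)
for course, semester, score in records:
    scores_by_group[(course, semester)].append(score)

for (course, semester), scores in sorted(scores_by_group.items()):
    print(f"{course} | {semester}: mean {mean(scores):.2f} (n={len(scores)})")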

B. 2 In what ways, other than the process associated with the Annual Student Learning Outcomes Assessment Reports, are systems operations and data reviewed, and what is the frequency of these reviews?

EPP Response: The College of Education and Human Services and the School of Social Work (EDHS) convenes the College Council (CC) twice monthly (every week during the COVID shut down). The CC includes the EDHS Dean, EDHS department chairs, EDHS Office Manager, and EDHS Associate Dean. The CC authors the President’s Annual Report annually (College council meeting agendas and minutes can be shared during the site visit in Nov 2020). The CC shares the data dashboards and collects data for scheduling of courses, assignment of faculty, regulation of the Teacher Education Policies, and all other college activities. The current focus of the CC is the re-opening of EDHS for the fall.

The Pennsylvania State Department of Education requires an annual report using the data collected from all programs, and a Major Review of all programs occurs on a 7-year cycle. CAEP requires an annual report for all initial and advanced teacher certification programs. The Assessment Committee meets regularly to review all unit and program assessment data. Millersville is a member of Middle States, which performs peer evaluation and accreditation of public and private universities and colleges, and therefore collects and reviews systems operations and data annually.

5.2A 1 Describe the process for review of assessment data, how frequently it is done, provide supporting documentation to verify (e.g. meeting minutes with list of attendees, report outcomes from these reviews)

AFI: The EPP provided limited evidence that program changes and modifications are supported or linked back to evidence/data.


5.2.2 Assessment data input into the Banner system allows studies to be conducted on validity and reliability and interrater reliability for all significant assessments. Please explain how the system is used for these purposes.

EPP Response: As mentioned for B. 2, the College Council (CC) disseminates data and how they are used for faculty assignment, program changes, course scheduling, and other EDHS policy matters. For example, at a recent CC meeting the enrollment data for each program were reviewed (this dashboard will be shared during the site visit in Nov 2020, as will minutes of College Council meetings if requested). The CC discussed recruitment measures to increase enrollment, ethnicity data for each program, increases and decreases in recruitment data, and recruitment efforts across the college. An action item from this meeting was to identify ways that recruitment efforts matching the University's EPIICC values will be achieved. At the last CC meeting, EPIICC values were again addressed when discussing course enrollment and scheduling for the re-opening of the fall semester after the COVID shutdown in spring 2020. The action item was to send a memo to the Associate Provost citing concerns about changing times for course delivery. Adherence to already scheduled times, even though plans for fall re-opening include multi-modality course delivery as an option, agrees with the University's EPIICC values of keeping compassion for students as a priority. A quote from Dean Drake's message of July 2, 2020, to the EDHS faculty and Associate Provost Jim Delle shows the use of data to make changes and inform faculty after the most recent CC meeting:

Earlier today we had another very lively College Council; we have been meeting almost every week since the onset of COVID-19. The vast majority of today’s meeting was devoted to preparations for fall. Thanks to input from you and the hard work of the department chairs, we were able to provide Jim Delle with an updated list of course delivery plans for fall. I trust that this dataset has been shared with you by your chair, as I have requested. You will need those data for the next set of tasks. If you need to see the latest data, your chair can provide a copy from our College Council Microsoft Teams site.

As you know, courses have been designated as either Online/Remote (OL) or Face-to-Face (F2F). OL courses are those that will be taught entirely online/remotely – that is, there will be no on-campus elements of the courses. F2F courses will require students to come to campus for a class meeting, even if only one time. As a university, we have strived to provide as many courses as possible in a F2F format for first-year residential students. In our college, this means courses like WELL 175, PSYC 100, UNIV 103, ERCH 110, and SPED 101.

Our next steps follow. Please expect specific instructions from your chair and then work with them to provide the kinds of data we need next. If you have questions, please reach to your chair and/or me.

1. “Determine whether OL courses are synchronous or asynchronous. If synchronous, when and how often will your courses meet synchronously. My sincerest hope is that you will meet synchronous classes during class days and times when those courses were originally offered in a traditional face-to-face manner on campus (as in our original non-pandemic schedule). To alter the days and times for synchronous instruction from those original days and times will cause tremendous upheaval in student schedules requiring herculean efforts to make the changes. If the pre-pandemic course schedule was a compact between the university and our students with regard to when courses would be offered, to change those days and times now would be a violation of that compact.”


Development of our internal Banner PEU assessment system met the criteria for keeping abreast of changing assessment technology. Assessment data input into the Banner system not only allows us to review the overall strengths of programs using individual assessments, it also allows us to assess the validity and utility of major program assessments.

Assessment of the validity and utility of major assessment data takes place in all programs, both in the assessment development process and in the interpretation of data. Assessment data input into the Banner system allows programs to conduct comparative studies of validity, reliability, and interrater reliability for all significant assessments. Unit assessment system data are collected in a uniform way across programs for the unit. For example, all programs collect student teaching data at the 7.5- and 15-week time periods with the same MU Adapted Danielson assessment (see attachment 3 cycles of data for EPP created Assessment). Unique assessment components required for individual program needs are collected independently of the unit system. Data are collected over semesters and academic years, used for Data Day review and programmatic changes, and shared with programs. An example of changes across a unit assessment is the Professional Behaviors assessment. The faculty and partners wanted to change the assessment to monitor observed behaviors that demonstrate professionalism, not thoughts. The thinking behind the Professional Behaviors component Demonstrates Commitment to Becoming a Professional is that, to be evaluated as proficient, a candidate is not expected to just "believe" or "think" about becoming a professional; the candidate is required to demonstrate becoming a professional through actions such as accepting feedback and working to make improvements, arriving on time, dressing professionally, communicating with professional language, and attending professional meetings as offered.

5.2 C 1: It was indicated that the Banner system for collecting assessment data allows the EPP to conduct studies of the validity, reliability, and interrater reliability of all major assessments. What specific data is available to support this statement?

EPP Response: Cooperating teachers and supervisors enter data for the MU Danielson two times a semester for the student teachers. We ran inter-rater reliability analyses using the data from the Banner system, which contains the PEU assessment system. Banner automatically feeds course instructors, sections, faculty assigned to enter data into the PEU system, and the semester the courses are offered into the PEU data collection system. Using these data, the unit can compare the ratings given by COOPs and supervisors. See attachment 15-week Danielson Percentages COOP versus SUPER.
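A minimal sketch in Python of the kind of COOP-versus-supervisor comparison described above follows; the paired ratings are hypothetical placeholders rather than actual MU Adapted Danielson data, and simple percent agreement is used here (a statistic such as Cohen's kappa could be substituted).

# Minimal sketch (hypothetical ratings): agreement between cooperating-teacher
# (COOP) and supervisor ratings of the same student teachers at 15 weeks.
coop_ratings       = [2, 2, 1, 3, 2, 2]  # hypothetical paired ratings
supervisor_ratings = [2, 1, 1, 3, 2, 3]

pairs = list(zip(coop_ratings, supervisor_ratings))
exact = sum(1 for c, s in pairs if c == s) / len(pairs)
within_one = sum(1 for c, s in pairs if abs(c - s) <= 1) / len(pairs)

print(f"Exact agreement: {exact:.0%}")
print(f"Agreement within one rating point: {within_one:.0%}")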

Reliability is about the consistency of a measure, and validity is about the accuracy of a measure. We have reviewed the report data collected by the Banner system (PEU assessment data) annually for the last 8 years. The comparative analysis of the data done at Data Days shows us the data are consistent in measuring the performance of our candidates. Clear, usable assessment criteria contribute to the openness and accountability of the whole process. The context, tasks, and behaviors desired are specified so that assessment can be repeated and used for different individuals and produce reliable data. Explicit criteria also counter criticisms of subjectivity. The EPP-created Professional Behaviors assessment was most recently reviewed in March 2019 by faculty from all departments, led by a school partner, to examine the assessment criteria and interrater reliability (see Standard 1 evidence for reliability and validity). The PEU system can be reviewed by the review team site leader in Nov. 2020.

5.2 C 2 What steps has the EPP taken to ensure the EPP-created assessments and surveys used in the quality assurance system score at the minimal level of sufficiency as defined by the CAEP assessment rubric?

EPP Response: Please see Standard 1, Task: Unit Assessments A 1 for a narrative and a data table. The CAEP sufficiency level and specific elements are aligned with the assessment committee meeting dates and attendees in a matrix matching the CAEP sufficiency evaluation framework with meeting topics and action items. Meeting minutes, whose topics include specific items from the work of the assessment committee, are posted in Millersville University's online education platform D2L as shared data for faculty. Additional data are included in B. #2, addressing the process used for unit assessment evaluation and revision. Additional assessment committee topics, dates, and members present are evidence of revision of the PEU unit assessments and address Administration and Purpose, Content of Assessment, and Scoring. Evidence that the Cooperating Teacher Survey, Student Teacher (completer) Survey, and the Early Field Diversity Survey meet the minimal level of sufficiency as defined by the CAEP assessment rubric is included in the table below titled Cooperating Teacher Survey, Student Teacher (completer) Survey, and the Early Field Diversity Survey. The surveys are launched through the Qualtrics survey tool, which establishes a confidence level for the data from each survey.

EPP Response: Ensuring the surveys include a representative or purposive sample depends on reviewing each survey annually before launching. Purposive sampling, by definition, is useful when a researcher is looking to investigate a phenomenon or trend as it compares to what is considered typical or average for members of a population. Survey recipients are taken from course rosters stored in the Banner system.

Advisory councils, faculty, and the Office of Institutional Research are often asked to review the surveys to be sure the questions are not leading and address a need for information; an example comes from the Music program supervisors' review. Qualtrics is the primary tool for launching surveys; it reviews the quality of each question and offers edits to ensure questions are phrased correctly to obtain the data desired. For example, in Nursing the survey showed technology use was a weakness. This weakness led to a review of the various programs being used by school districts to see whether the Department of Nursing could adopt a technology tool used by the schools (e.g., iPads) for candidates to gain some experience prior to the practicum. This action strengthened the surveys' purposeful sampling, as surveyed candidates were placed in school districts; the review helps ensure the sample is representative of the information needed.

Cooperating teachers provide triangulation of data from K-12 experts on the quality of our candidates. Furthermore, the Danielson evaluation is well understood by our cooperating teachers since it is the same evaluation that is used for their own evaluation. In the Spring of 2014, 88% of our cooperating teachers had received training from the IU or their district on the use of the Danielson instrument. We believe that this percentage is now nearing 100% with the continued full implementation of the teacher evaluation system in K-12. Therefore, these ratings represent external partner evaluations using a high-quality instrument that they have been trained on.

Qualtrics computes a confidence level for the Early Field Survey, the Student Teacher Survey, and the COOP Survey; it is at 95%. The two surveys have been launched since 2014 and have been reviewed for consistency in data day review.
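As an illustration of what a stated confidence level implies for survey data, the sketch below computes the margin of error for a survey proportion at a 95% confidence level; the response and population counts are hypothetical, and the exact calculation Qualtrics performs may differ.

    # Illustrative sketch only: margin of error for a survey proportion at a
    # 95% confidence level, with a finite population correction.
    import math

    def margin_of_error(sample_n, population_n, z=1.96, p=0.5):
        se = math.sqrt(p * (1 - p) / sample_n)                           # standard error
        fpc = math.sqrt((population_n - sample_n) / (population_n - 1))  # finite population correction
        return z * se * fpc

    # Hypothetical numbers: 60 completed surveys from a population of 150 candidates
    print(f"Margin of error: {margin_of_error(60, 150):.1%}")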

Table: Cooperating Teacher Survey, Student Teacher (completer) Survey, and the Early Field Diversity Survey

CAEP Sufficient level 1. ADMINISTRATION AND PURPOSE (informs relevancy)

CAEP Sufficient Level 2. CONTENT OF ASSESSMENT (informs relevancy)

CAEP Sufficient Level 3. SCORING (informs reliability and actionability)

CAEP Sufficient Level 4. Data Reliability

CAEP Sufficient Level 5. Data Validity

CAEP Sufficient Level 6. Survey Content

CAEP Sufficient Level 7. Survey Data Quality

Early Field Survey

a. The point or points when the assessment is administered during the preparation program are explicit. b. The purpose of the assessment and its use in candidate monitoring or decisions on progression are specified and appropriate. c. Instructions provided to candidates (or respondents to surveys) about what they are expected to do are informative and unambiguous.

e. Evaluation categories or assessment tasks are aligned with CAEP, InTASC, national/professional and state standards.

a. Indicators assess explicitly identified aspects of CAEP, InTASC, national/professional and state standards. c. Indicators unambiguously describe the proficiencies to be evaluated. e. Most indicators (at least those comprising 80% of the total score) require observers to judge consequential attributes of candidate proficiencies in the standards.

a. The basis for judging candidate performance is well defined. d. Feedback provided to candidates is actionable—it is directly related to the preparation program and can be used for program improvement as well as for feedback to the candidate. e. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms. [NOTE: If a less actionable term is used such as “engaged,” criteria are provided to define the use of the term in the context of the category or indicator.]

b. Training of scorers and checking on inter-rater agreement and reliability are documented.

d. The EPP details its current process or plans for analyzing and interpreting results from the assessment. e. The described steps meet accepted research standards for establishing the validity of data from an assessment.

a. Questions or topics are explicitly aligned with aspects of the EPP’s mission and also CAEP, InTASC, national/professional, and state standards. b. Individual items have a single subject; language is unambiguous. c. Leading questions are avoided. d. Items are stated in terms of behaviors or practices instead of opinions, whenever possible. e. Surveys of dispositions make clear to candidates how the survey is related to effective teaching.

a. Scaled choices are qualitatively defined using specific criteria aligned with key attributes. b. Feedback provided to the EPP is actionable. c. EPP provides evidence that questions are piloted to determine that candidates interpret them as intended and modifications are made if called for.


Student Teacher (completer) Survey (data collected each semester)

a. The point or points when the assessment is administered during the preparation program are explicit. b. The purpose of the assessment and its use in candidate monitoring or decisions on progression are specified and appropriate. c. Instructions provided to candidates (or respondents to surveys) about what they are expected to do are informative and unambiguous.

e. Evaluation categories or assessment tasks are aligned with CAEP, InTASC, national/professional and state standards.

a. Indicators assess explicitly identified aspects of CAEP, InTASC, national/professional and state standards. c. Indicators unambiguously describe the proficiencies to be evaluated. e. Most indicators (at least those comprising 80% of the total score) require observers to judge consequential attributes of candidate proficiencies in the standards.

a. The basis for judging candidate performance is well defined. d. Feedback provided to candidates is actionable—it is directly related to the preparation program and can be used for program improvement as well as for feedback to the candidate. e. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms

b. Training of scorers and checking on inter-rater agreement and reliability are documented.

d. The EPP details its current process or plans for analyzing and interpreting results from the assessment. e. The described steps meet accepted research standards for establishing the validity of data from an assessment.

a. A description or plan is provided that details steps the EPP has taken or is taking to ensure the validity of the assessment and its use. b. The plan details the types of validity that are under investigation or have been established (e.g., construct, content, concurrent, predictive, etc.) and how they were established. c. If the assessment is new or revised, a pilot was conducted. d. The EPP details its current process or plans for analyzing and interpreting results from the assessment. e. The described steps meet accepted research standards for establishing the validity of data from an assessment.

a. Scaled choices are qualitatively defined using specific criteria aligned with key attributes. b. Feedback provided to the EPP is actionable. c. EPP provides evidence that questions are piloted to determine that candidates interpret them as intended and modifications are made if called for.


5.3 A. 1 Evidence to verify Data Day meetings are “well attended by PEU faculty and invited school partners”?

The EPP provided limited evidence that program changes and modifications are supported or linked back to evidence/data.

5.3 B 1 Evidence of changes supported by data, monitored, and results of the changes trending toward improvement. Please provide a separate file.

EPP Response: A data day folder is in attachments titled: Assessment Committee Meetings/Data Day. Attendance, dates of meetings, and actions taken are indicated in the reporting of the meeting findings. The PEU Assessment Committee and the Associate Dean are charged with evaluation of assessments and data that are used at the unit level. The development of our current student teaching instrument was a response to an analysis by the Assessment Committee of the content validity of our old student teaching instrument.

Cooperating Teacher Survey (data collected each semester)

a. The point or points when the assessment is administered during the preparation program are explicit. b. The purpose of the assessment and its use in candidate monitoring or decisions on progression are specified and appropriate. c. Instructions provided to candidates (or respondents to surveys) about what they are expected to do are informative and unambiguous.

e. Evaluation categories or assessment tasks are aligned with CAEP, InTASC, national/professional and state standards.

a. Indicators assess explicitly identified aspects of CAEP, InTASC, national/professional and state standards. c. Indicators unambiguously describe the proficiencies to be evaluated. e. Most indicators (at least those comprising 80% of the total score) require observers to judge consequential attributes of candidate proficiencies in the standards.

a. The basis for judging candidate performance is well defined. d. Feedback provided to candidates is actionable—it is directly related to the preparation program and can be used for program improvement as well as for feedback to the candidate. e. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms.

b. Training of scorers and checking on inter-rater agreement and reliability are documented.

d. The EPP details its current process or plans for analyzing and interpreting results from the assessment. e. The described steps meet accepted research standards for establishing the validity of data from an assessment.

a. Questions or topics are explicitly aligned with aspects of the EPP’s mission and also CAEP, InTASC, national/professional, and state standards. b. Individual items have a single subject; language is unambiguous. c. Leading questions are avoided. d. Items are stated in terms of behaviors or practices instead of opinions, whenever possible. e. Surveys of dispositions make clear to candidates how the survey is related to effective teaching.

a. Scaled choices are qualitatively defined using specific criteria aligned with key attributes. b. Feedback provided to the EPP is actionable. c. EPP provides evidence that questions are piloted to determine that candidates interpret them as intended and modifications are made if called for.


“Data Day” meetings are the primary method used by the assessment committee for comprehensively evaluating the validity and utility of data. Data Day meetings are held annually and are open to all PEU faculty and stakeholders. Unit data are shared with faculty at these meetings. These meetings provide faculty a chance to analyze the meaning and validity of multiple measures. Faculty typically ask questions about the statistical significance of data, the meaning of one data set in relation to complementary measures, and the meaning of qualitative data when compared to quantitative measures with the same focus. Stakeholders are asked for feedback and assistance in data analysis aimed at programmatic improvements. These meetings have led to the development of new survey instruments for cooperating teachers and to the revision of existing surveys of graduates. Additionally, the Professional Behaviors policy and rubric were developed in response to COOP and supervisor scores on the MU Adapted Danielson. The scores showed a weakness in our candidates’ abilities to demonstrate professional behaviors. Data Day attachment, Jan. 26, 2018.

EPP Response: A data day folder is in attachments titled: Assessment Committee Meetings/Data Day

5.4 C 1. Link for where the outcome and impact measures and their trends are posted on the EPP website. Please explain other ways these are shared publicly.

Stipulation: The EPP provided no evidence that measures of completer impact and their trends are widely shared or used for program improvement.

EPP Response: The Millersville website shares numerous links for measures of completer impact as listed below. A good example is the website titled Becoming a Teacher, https://www.millersville.edu/education/become-a-teacher/index.php

This site shares three years of Title II data. A review of these data led the programs to focus more on composition and writing skills. The overall writing scores were the lowest of all the basic skills scores. Our programs added more writing-enriched designated courses such as EDUC 424 Diagnostic Reading (W), SPED 311 Design/Implement Instruction (W), and Early Childhood 315 (W). Another data-driven, partner-driven decision was to include an urban placement requirement in each Foundations course that is required for all majors. Our partner, Dr. Rau, the Superintendent of the School District of Lancaster, voiced concerns about the development and education of her 6th grade boys. That led to Dr. Miriam Witmer being awarded an external grant for 1/4 release time for both Fall and Spring to create and coordinate a mentoring program for 6th grade Black boys in the School District of Lancaster. Dr. Witmer brings this experience and knowledge into her work with Millersville candidates in the courses she teaches. Data are not in as yet, but hopefully Lancaster will see a decrease in 6th grade Black male dropouts.

Links for the outcome and impact measures and trends are posted on the EPP website.

Title II reports: https://www.millersville.edu/education/titleii.php

Graduation and Persistence Rates: https://www.millersville.edu/computerscience/curriculum/enrollment-and-graduation-rates.php

Fact book with admissions data, department analysis, tuition and fees, degrees by program and Major: https://www.millersville.edu/iea/ir/factbooks/1718/index.php

Student teacher: https://www.millersville.edu/studentteaching/index.php

Employment rate: page 8 Student Body: https://www.millersville.edu/catalogs/graduate/index.pdf


Employment: https://www.millersville.edu/hr/employment/

Admission policies are published in university catalogs, including criteria for entrance into teacher preparation programs, https://www.millersville.edu/programs/. This information is also regularly distributed during advisement meetings.

Incoming first-year and transfer candidates receive information about policies in their programs during orientations, through web sites designed to inform students using videos, wikis, and narrative explanations, and through other email communications and social media. Although accessing the D2L system requires a password (a demo will be held for the site visit Nov. 2020 if requested), this site: https://www.millersville.edu/search-results.php?query=D2L shows the many resources available to students. During the COVID-19 shutdown, new student meetings were held via Zoom. Although the attendance was not as good as at the face-to-face recruitment meetings, 116 students enrolled and 80 attended. During the recruiting event, potential students come and visit with College Deans, faculty, and staff from the entire University. Lunch sessions are held for our grades 7-12 potential students, who attend their content areas for the breakout sessions. For example, a Social Studies certification seeker will attend the College of Arts, Humanities and Social Sciences main session and a lunch session with education faculty members for their program in Social Studies education.

All teacher candidates in secondary preparation programs are assigned an academic co-advisor in education to assist in the process of advisement. Academic class schedules include designations for Advanced Professional Studies (APS) courses and teacher candidates' records include information as to their compliance with entrance into their programs.

All candidates have access to a wide range of resources and services to ensure their success at MU. The university web site is the primary source of quick-access information including academic calendars, catalogs, publications, policies, services, etc. https://www.millersville.edu/

Candidates receive direct access to information through advisement meetings. Every candidate is assigned an advisor upon entering MU and must meet with the advisor at least once per semester in order to acquire a pin number needed for class registration. This system ensures that candidates make contact with faculty who can provide accurate guidance and support.

5.4 C 2. See attachment President’s Annual Report SLO Additional Reports. Describe how annual reporting measures are used in your assessment system. Provide evidence to show analysis of trends, comparisons with benchmarks, identification of changes made in your preparation curricula and experiences, how/when/with whom results are shared, resource allocations affected by your uses of the information, and future directions.

EPP Response: Please see attachment President’s Annual Report and SLO Additional Reports (a new system for storing and reporting SLO reports has just started; the site, Nuventive, will be available to the lead site visitor for the Nov. 2020 visit). There are two aspects of our system that assure regular and systematic use of data. First, each department is required to produce annual reports including an analysis of current programs and plans for improvement (the SLO Report).

1. All Student Learning Outcomes (SLOs) for the program. Along with a listing of all SLOs, each SLO should include:

a. how each SLO was measured.


b. when each SLO was measured (e.g. year 1, year 2, year 3, year 4, and/or year 5); c. the results from each SLO assessment; and d. any actions taken based on the results from the assessment.

2. Curricular Map. For concepts, theories, and skills introduced, reinforced, or applied in each required or elective course, indicate:

a. which SLO(s) is/are introduced in the course. b. which SLO(s) is/are reinforced in the course. c. which SLO(s) is/are applied in the course; and d. which courses had data collected for program assessment.

3. Student Learning Outcomes Assessment Plan, Results, and Use. Overview of the plan for assessing student learning and the results. Include how the department has incorporated the results of the assessment back into the curriculum or department i.e. closing the loop on identified issues. At a minimum, the following questions should be answered:

a. Are students meeting the program’s learning outcomes at the planned level? i. If not, what should be changed to achieve the desired results? ii. If the learning outcomes are met, are there specific efforts that can be attributed to the students’ success?

5. Support for the SLOs supported by departments other than the department hosting the program. Please include:

a. department name that provides support for the program. b. course name and identifier, if applicable; and c. description of the support provided.

In each report, "evidence must be presented...that assessment results have been analyzed and have been or will be used to improve candidate performance and strengthen the program." This description should not link improvements to individual assessments; rather, it should summarize principal findings from the evidence, the faculty's interpretation of those findings, and resulting changes made in (or planned for) the program as a result. Second, the Assessment Committee and Assessment Coordinator are charged with regular analysis and distribution of data. The Assessment Committee sponsors regular "Data Day" meetings for unit faculty and external partners. The goal of these meetings is to use data for improvement of courses, programs, and clinical experiences, particularly across the unit. The processes described above assure that our approach to these criteria is systematic. Evidence of implementation of the criteria can best be seen through concrete examples. While all assessment can be used by candidates for improvement, several specific examples highlight how we make this purpose explicit. First, the instruments assessing impact on student learning include reflective pieces where candidates are asked to self-evaluate and to make plans for improvement. They are also asked to constantly reflect on their teaching and student learning. Second, the Millersville student teaching instrument is used twice during a candidate's placement. The rating scale used at mid-placement provides formative rather than summative criteria (e.g., "Reasonable progress evidenced" vs. "Exemplary: candidate consistently and thoroughly demonstrates"). A third example is the Professional Development Plan used with Special Education candidates. Candidates self-assess professional dispositions, develop a plan for improvement, collect data, and do a post analysis of their improvement.

5.5 A 1. How is survey data shared with stakeholders through the D2L platform?

EPP Response: We apologize for this unclear statement about the D2L platform. Survey data are shared with stakeholders through email. The email goes to cooperating teachers (COOPs) and adjuncts and requests their feedback and comments. Standard 2 of the addendum report describes how COOP survey comments contribute to field services improvements, such as devising an online training for new cooperating teachers. The survey results are emailed to cooperating teachers and student teachers, and placed on the D2L platform for faculty to access and share. Faculty meetings focus on survey data and analyze results to improve supervision and obtain information about placements and cooperating teachers, such as referrals for very successful COOPs and exceptional placement information that provides our candidates with placements that will broaden their experiences and deepen their content knowledge.

Please access this site: https://www.millersville.edu/education/peu/index.php through the Millersville home pages. This site does not require a password. Stakeholders can access the Professional Education Assessment system and view videos of how the system records data. Stakeholders can see what kind of data is available through the PEU system and what kinds of reports we use internally. Department chairs have access to all reports in the D2L system for the purpose of data driven decision making. Chairs can easily share report data with stakeholders, compare data results with other programs with stakeholders, and report to stakeholders in advisory council meetings.

5.5 A 2. Specific evidence of diverse stakeholder involvement in decision making, program evaluation, and selection and implementation of changes for improvement. Although examples of stakeholder input considered in some program change decisions are provided, there is limited evidence to document that stakeholders are involved in the decision making.

AFI: The EPP provided limited evidence that program changes and modifications are supported or linked back to evidence/data. There is limited evidence that the EPP analyzes assessment data, identifies trends and patterns within and across preparation programs, investigates differences, and uses data for continuous improvement.

EPP Response: As described earlier, the Assessment Coordinator regularly collects, compiles, summarizes, and analyzes unit data including Praxis data, student teaching data, and surveys of graduates. Each program has at least one faculty member responsible for collecting data on major program assessments. Unit assessment data are reported publicly in the PEU Assessment Committee and in PEU "Data Day" meetings. Departments share programmatic assessment data with faculty within the department and with the Assessment Coordinator. Reports on individual candidates are also available to appropriate faculty for purposes of advisement. Assessment data and resulting curricular and programmatic changes are reported in the annual reports and in the Major Review. A sampling of data-driven changes is below. There are other examples noted in this Addendum and in the attachments referenced in this bulleted list.

• We hold a “Data Day” meeting each semester. Because of the growing body of data that is now available to us, we try to organize these around a theme that is common to all programs. We invite our external school partners (cooperating teachers, supervisors, principals, and superintendents) to join us for the discussion. Often the format of these meetings is to have roundtable discussions, and then to report-out to the participants or to share the results later via the Assessment Committee. Groups of faculty have identified multiple areas for possible change at the unit level. See attachment Assessment Committee Meetings and Data Days.

• With advice from our advisory council, we created the Principals' Preparatory Inventory (PPI), a full-day assessment whereby first-year program students work through tasks common in a principal's day; program faculty and the coordinator utilize the results to inform curricular decisions at the program, course, and individual level. The 360-hour principal internship also serves the same purpose, although this component of the program, along with its evaluation component, occurs at the end of the students' program. Professional Development Schools conducted workshops with all partners and collaboratively revised assignments and assessments. Attachment PDS Partner and Faculty Meetings.

• Enrollment data is a chosen focus for a Data Day in spring 2021. Recruitment plans will be reviewed, and strategies will be discussed. In past Data Day meetings, faculty received data regarding enrollments in their programs and collectively defined a public relations focus for future recruitment efforts.

• Many of the areas of needed improvement can be conceptualized as the integration of theory with practice. One suggestion was to develop an earlier field experience for secondary education students. The secondary students have two urban placements now during the Foundations classes for initial and PB cert programs.

• One area of needed improvement was preparing educators to integrate literacy skills throughout the curriculum, to support English Language Learners, and to support students with special needs. We have developed two new courses at the secondary level to address this, and a related suggestion has been made that we must have a shift in thinking so that we all consider ourselves to be ELL, SPED, and/or Diversity teachers, not just program teachers. For example, PreK-4 teachers have inclusive classrooms; therefore they need content knowledge in ELL, SPED, and Diversity issues to plan and deliver instruction to their grade level group. These teachers also must have ready knowledge of resources for referrals if the need arises.

• The Foundations bloc field experience has, for many years, required an urban placement. Data indicated that placements are dwindling at a rapid pace. Despite these difficulties, a commitment was made to keep the urban placement to maintain diversity in Field Experience. Please see standard 2 table with meetings with school partners.

• The Applied Engineering, Safety, and Technology (AEST) program created or revised 12 courses in response to data collection on the needs of majors. Programmatic changes are not directly impacting other programs; however, they are being shared in case there is a potential benefit for other programs.

• School Psychology has developed exit interview data to improve the program. One interest for Data Days is to examine the data for its benefits, learn more about it, and see if it can be extended to other programs effectively.

A.1 Phase-in responses and Phase-in Plan

Title: Plan across program areas for candidates to demonstrate proficiencies

A. Evidence in need of verification or corroboration

(1) matrix indicating generic professional skills addressed in all advanced program areas as well as proficiencies (to be) assessed in program areas.

B. Excerpt from Self-Study Report (SSR) to be clarified or confirmed

(1) Each advanced program area has evaluation tools that allow both Millersville faculty and practitioner-mentors to evaluate the knowledge, skills, and dispositions of candidates and their ability to implement best practices.

EPP Response: Please see table titled as below in A.2.1

C. Questions for EPP concerning additional evidence, data, and/or interviews

(1) Is there a plan to collect data on candidate learning in all programs with valid and reliable instruments? (2) How do all program areas know that state approval standards are met?

EPP Response: All program areas ‘know’ that Pennsylvania Department of Education (PDE) approval standards are met because they completed an alignment matrix showing how evidence is collected; the attachment State Standard Alignment with Programs is included in plan A.1.1 A. Additionally, submitting program data for state approval requires every teacher certification program to provide a matrix aligning all course work and requirements to the PDE state standards; all Millersville programs have been through PDE Major Review and been approved (PDE approval letter in SSR component 5.3, attachment ADV Programs Major Review).

There are six areas in which advanced candidates must show competence according to the CAEP handbook. The CAEP handbook also states that at least three of the six must be addressed in the ADV plan for A.1. Please see the attachment titled A.1.1 Proficiency Matrix for evidence for the six areas aligned with ADV programs. Included in the plan below are: A.1.1 A Supporting appropriate applications of technology for their field of specialization; A.1.1 B Application of professional dispositions, laws and policies, codes of ethics and professional standards appropriate to their field of specialization; and A.1.1 C Application of Data Literacy: use of research and understanding of qualitative, quantitative, and/or mixed methods research methodologies.

Plan for the attachment titled A.1.1 Proficiency Matrix. Attachments are highlighted.

Exhibit A.1.1 A  Applications of technology for their field of specialization

Requirement

Collaborative Meetings with Partners A.2.1 (including how many times they meet); Content Knowledge of ADV Candidates used in the field A.2.1, A.2.2; Evidence of monitoring of candidates and providing support A.1.1; Evidence submitted to State Major Review and posted in the D2L learning management system for annual data retreats and program department meetings.

RELATIONSHIP TO STANDARD OR COMPONENT

Explicit link of the intended data/evidence to the standard or component it is meant to inform

All ADV programs are required by the Pennsylvania State Department of Education (PDE) licensure program approval to name courses and provide a description of how the program integrates technologies

A description of the content and objective of the data/evidence collection

To address the standard on applications of technology, our programs have detailed descriptions and course names provided in the attachments titled ADV Program Integration of Technology and A.1.1 Proficiency Matrix. Some programs, like Nursing, have a survey question regarding the use of technology, but not all programs have this item on their program survey. Updating the present ADV surveys, which have been in use since 2016, will provide an evaluation of ADV candidates' use of technology. An example of program improvement to strive for is the change made in the School Nurse program. The survey of graduates revealed a weakness in using technology programs. The ADV program added a course, NURS 560, including objectives and goals for the course to assess use of technologies. Using state standards as a foundation for survey response items is a good way to align all survey data to required standards. State standards are: IIB.4 Incorporate technology into instruction appropriately; IIB.11 Integrate technology and other resources appropriate to prepare students for higher education, full citizenship, and the workforce. Attachment State Standards Aligned with Programs.

Tags for Data evidence A 1.1

TIMELINE AND RESOURCES

Strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit.

May-July 2020: Committee selected to review present survey and develop question regarding technology use. August 2020: Assessment Committee review of changes in survey. September 2020: launch Qualtrics survey. October-Nov. 2020: collect data and begin data analysis. Nov. 2020: share survey with site leader during visit. Dec.-January 2021: Programs note any program improvements that are needed. February 2021: submit data and any program changes to assessment committee.

A description of the personnel, technology, and other resources available;

Assessment committee members will be asked to review changes to the survey draft and provide feedback. D2L and the PEU assessment system through Banner. Alumni and graduate survey through Millersville’s Office of Institutional Planning and Research. Qualtrics.

DATA QUALITY

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

ADV programs have an established pattern of surveying graduates and partners and have established readability, content, and written quality of questions for validity. A thorough review of the added questions' validity will keep the overall validity of the survey intact. Qualtrics is normally selected as the survey tool, and its tools do a crosstabs analysis to provide a confidence level. The ADV programs have a list of contacts on file.

Steps that will be taken to attain a representative response

A concerted attempt to maximize the response rate will be employed, and we adhere to as many best-practice procedures as can be managed in order to make our sample as representative of the full population of graduates as possible. Updates of these representatives occur annually as candidates or sites change.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Data will be reviewed annually for PDE annual reporting. Assessment committee data days held annually will review survey data implications for programmatic changes.

Exhibit A.1.1 B  Application of professional dispositions, laws and policies, codes of ethics and professional standards appropriate to their field of specialization

Requirement

RELATIONSHIP TO STANDARD OR COMPONENT

Explicit link of the intended data/evidence to the standard or component it is meant to inform

All ADV programs have narrative explanations and some evidence of candidates' integrity, ethical behavior, and professional conduct. See attachments ADV Programs Survey and Analysis and A.1.1 Proficiency Matrix (column Professional Responsibilities) for course listings and assessments used by the ADV programs. Also, the attached quantitative data collection is not consistent or complete among all ADV programs; Rdg Specialist, Principal, Supervisor, and School Psy have data available from the SPA reports.

A description of the content and objective of the data/evidence collection

ADV programs not included in data collection through the PEU assessment system will be added. Another option is to change the Employer Survey currently used, which has established CAEP level 3 or above on the CAEP assessment rubric, to add additional survey queries for ADV programs. The list of contacts used presently would be updated to include ADV program employers.

Tags for Data evidence A.1.1

TIMELINE AND RESOURCES

Strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit.

The Associate Dean will establish an assessment collection report for ADV programs currently not collecting data through the PEU unit system. If ADV programs are collecting data but not reporting it through the PEU, the data collection process will be clarified and a system for collection of the data beyond a narrative explanation will be established. D2L and PEU assessment system through Banner; alumni and graduate survey through Millersville’s Office of Institutional Planning and Research. August 2020: communicate with ADV programs to review the data collection process for quantitative professional behaviors. September 2020: Assessment Committee meeting to establish data collection through the PEU. November 2020: collect data through the PEU for all ADV programs. December-January 2021: review data and assess need for programmatic changes.

A description of the personnel, technology, and other resources available;

Assessment Committee will discuss data collection with ADV program representatives and establish a connection with the Professional Behaviors assessment currently used by the Professional Education Unit.

DATA QUALITY

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Unit assessment used for Professional Behaviors has established reliability, internal consistency, and content inter-rater statistics. Training of scorers and checking on inter-rater agreement and reliability are documented.

Steps that will be taken to attain a representative response

The assessment committee monitors all unit assessments annually. Professional Behaviors data are collected a minimum of twice in an academic year and monitored by the department chairs. Updates of these representatives occur annually as candidates or sites change.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Data review and comparison of unit assessments at annual Data Day review for program improvement. Assessment committee review each semester produces plan for program improvement. Share data collection with stakeholders for program improvement.

Exhibit A.1.1 C  Application of Data Literacy, application of content knowledge in the field. Also see the table titled Collaborative Meetings with Partners A.2.1 (evidence submitted to State Department Major Review and shared in Banner, D2L EDHS assessment files).

Requirement

RELATIONSHIP TO STANDARD OR COMPONENT

Explicit link of the intended data/evidence to the standard or component it is meant to inform

In the attachment A.1.1 Proficiency Matrix, in the first column, all ADV programs provide evidence of the Application of Data Literacy by listing courses where candidates demonstrate the ability to read, work with, analyze, and argue with data. The second column of the attachment, Use/Understanding of research methods, is a continuation of the courses and evidence collected that demonstrate ADV candidates are taught and assessed on skills involving reading and understanding data.

A description of the content and objective of the data/evidence collection

The attachment Alignment to State Standards includes the standard, 1B.1 Recognize and implement the major concepts, principles, theories, and research related to adolescent cognitive, social, sexual, emotional, and moral development. Using this standard as part of supervisor, candidate, and alumni surveys would add quantitative data for all ADV programs.

Tags for Data evidence A.1.1

TIMELINE AND RESOURCES

Strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit.

Adding a survey response item to supervisor, candidate, and alumni surveys would provide data that could be compared to other quantitative data collected through the PEU assessment system. D2L and PEU assessment system through Banner. Alumni and graduate survey through Millersville’s Office of Institutional Planning and Research.

A description of the personnel, technology, and other resources available;

The Assessment Committee will discuss survey content and data collection with ADV program representatives and establish a connection with surveys already used in the ADV program to add survey response designed to collect data reflecting candidate use of data to inform instruction and data literacy skills.

DATA QUALITY

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Surveys for ADV programs have been utilized since 2015. The surveys have established reliability, internal consistency, and content inter-rater statistics. Training of scorers and checking on inter-rater agreement and reliability are documented.

Steps that will be taken to attain a representative response

Survey replies and launching will be monitored closely with follow-up via email or phone calls. Graduate assistants will get in touch with those on the contact list.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Regular assessment committee meetings provide opportunities within each semester for data review. Annual data review sessions offer an opportunity for comparison of data between programs. Data review provides the programs with ways the data imply program changes.

The EPP provides some limited data measuring candidate knowledge and skills. (A.1.2) Not all programs have evidence of proprietary licensure examinations or other content specific data.

EPP Response: Please see the table Knowledge of Content Area Other than the Proprietary Licensure Exam, table A.2.2 B.

Table: ADV Program Proprietary Assessment, Praxis 2018-2019

ADV Program | Praxis Test | Content Knowledge Pass Rate | # Test Takers
Counselor | 5421 | 100% | 8
Rdg Spec | 5301 | 100% | 6
Prin | 5412 | 100% | 4
Nurse | * | * | * (The Pennsylvania State Department of Education does not require a proprietary license exam due to the requirement of a nursing license for entry into the program. All School Nurse candidates must hold a Bachelor of Science in Nursing (BSN) degree and current licensure as a registered nurse in Pennsylvania.)
Sch Psy | 5402 | 100% | 15
Supervisor | 5412 | 100% | 3

The EPP provides no consistent data indicating measurement of candidate knowledge and skills in the proficiency areas indicated in CAEP handbook.

What evidence is available of candidate application of content knowledge in the field?

EPP Response: See Exhibit A1.1 C Application of Data Literacy, application of content knowledge in the field. Also see table titled Collaborative Meetings with Partners A. 2.1 (Evidence submitted to State Department Major Review and shared in Banner, D2L EDHS assessment files)

Title: A.2.2

A. Evidence in need of verification or corroboration

B. Excerpt from SSR to be clarified or confirmed

C. Questions for EPP concerning additional evidence, data, and/or interviews (1) What is the EPP's response to component A.2.1?

EPP Response: There are six areas in which advanced candidates must show competence according to the CAEP handbook. The CAEP handbook also states that at least three of the six must be addressed in the ADV plan for A.1. Please see the attachment titled A.1.1 Proficiency Matrix for evidence for the six areas aligned with ADV programs. Included in the plan below are: A.2.1 A Use/Understanding of research methods; A.2.1 B Use of data analysis/evidence to develop supportive school environments; and A.2.1 C (see table below).

Exhibit A.2.1 A  Use/Understanding of research methods

Requirement

RELATIONSHIP TO STANDARD OR COMPONENT

Explicit link of the intended data/evidence to the standard or component it is meant to inform

ADV programs by their nature are focused on application of research and data driven decisions. ADV program course and clinical experiences focus attention on action research, case studies, literature reviews, presentations, and papers.

A description of the content and objective of the data/evidence collection

Evidence collection already in place: Rdg Spec: Oral Presentation, Theoretical Orientation Paper. Counselor: Practicum, projects. Principal and Supervisor: Cycles of Clinical Supervision, Vision paper and presentation, Curriculum Revision. Sch Psy: Comprehensive Psychological Report; Applied Behavioral Analysis Project; Content Based Exams; Ecological Assessment Report; Laws Project. Nurse: health needs assessment assignment; clinical journal; case study exam.

Tags for Data evidence A.2.1

TIMELINE AND RESOURCES

Strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit.

Emphasize and share stakeholder (advisory council) contributions regarding candidate understanding of the use of research methods by publishing survey data and analysis in D2L. Continue with present evidence collection and use of research, with program adjustments recognized in program or unit data review. Principal and supervisor updates are an example. ELCC requirements resulted in graduate survey edits that included a stakeholder-suggested combination of in-course internship: since all principal candidates are required to experience the twelve-month cycle of the school, candidates should schedule EDLD 798 and EDLD 799 and the in-course internship in EDLD 614 with their leadership mentor so they are able to fulfill PDE’s twelve-month requirement. PEU assessment data and assessment definition.

A description of the personnel, technology, and other resources available;

Annual assessment committee, program faculty, and stakeholder review of data results from assessments. Access to the PEU assessment system through D2L. Advisory Council meetings. D2L and PEU assessment system through Banner. Alumni and graduate survey through Millersville’s Office of Institutional Planning and Research.

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Documented surveys’ readability, content, question quality, and comprehensiveness assure level 3 of CAEP rubric.

DATA QUALITY

Steps that will be taken to attain a representative response

Contact list in use is current and updated as of spring 2019.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Emphasis and sharing of stakeholder input for programmatic changes. Stakeholder evaluation of action and research at sites shared at assessment committee meeting data review each semester to evaluate effectiveness of research methods.

Exhibit A.2.1 B  Use of data analysis/evidence to develop supportive school environments

Annual review of data and sharing of results with partners. Site partners do the evaluation and assessment of candidates and assist in monitoring candidates who need support, so they are constantly informed of candidate performance and able to provide feedback each semester. An example of a program modification: ELCC requirements resulted in graduate survey edits that included a stakeholder-suggested combination of in-course internship; since all principal candidates are required to experience the twelve-month cycle of the school, candidates should schedule EDLD 798 and EDLD 799 and the in-course internship in EDLD 614 with their leadership mentor so they are able to fulfill PDE’s twelve-month requirement.


Requirement

RELATIONSHIP TO STANDARD OR COMPONENT

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Specific data providing evidence of how the ADV provider and schools work together. Gather more specific site personnel (stakeholder) feedback that guides the co-construction and shared responsibility.

A description of the content and objective of the data/evidence collection

Share data results from the state Major Review and survey input via D2L. Collaboration with ADV programs on the effectiveness of sharing data with school environments. Evidence of data collection already in place: Rdg Spec: prep and delivery of instructional strategies; Counselor: collect and analyze various types of program data and work to present a core guidance curriculum based on identified needs of stakeholders; Sch Psy: Content Based Exams, Professional Development Workshop/Presentation, Child-Psychoeducational Evaluation Report, Comprehensive FBA Report and Intervention, ELL Assessment and Recommendations Report, Therapy Evaluation, Content Paper; Principal: EDLD 610 Theory and Organizational Behavior, 614 School Community Relations, evidence: project-based learning/case study; Supervisor: EDSU 700 Functions of Supervision, 701 Administrative Aspects of Supervision, 799 Applied Supervision Internship; Nurse: clinical standards, dispositions.

Tags for Data evidence A.2.1

TIMELINE AND RESOURCES

Strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit.

September 2020: arrange meetings with site stakeholders to establish sharing of data gathering; gather information about the COVID shutdown's impact on school environments and on ADV program practicums and internships. October 2020: devise a plan for sharing data evidence that will support school environments. November 2020: interviews about how the use of data and analysis provides support for school environments.

A description of the personnel, technology, and other resources available;

ADV program faculty and site supervisors. ADV program faculty and stakeholders attend the Assessment Committee annual review of unit data. Banner system, including D2L and the PEU assessment system. Alumni and graduate survey through Millersville’s Office of Institutional Planning and Research.

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Review of Survey tools in use now that have documented evidence of surveys’ readability, content, question quality, and comprehensiveness. This step ensures survey validity. Review and evaluation of any editing of survey or data collection instrument annually.

DATA QUALITY

Steps that will be taken to attain a representative response

Representative purposive samples are in use at the current time. Updates of representatives occur annually as candidates or sites change.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Action research impact published through sharing sessions. Advisory council meeting minutes shared on D2L site.

Exhibit A.2.1 C  Application of Data Literacy. Application of data literacy for candidates is well-documented in the attachment A.1.1 Proficiency Matrix.

Requirement

RELATIONSHIP TO STANDARD OR COMPONENT

Explicit link of the intended data/evidence to the standard or component it is meant to inform

ADV program candidates are graduate or Post Graduate candidates who have earned teaching or professional licenses that provide evidence of experiences in the application of theory and use of data. Increased data literacy or the ability to read, understand, create and communicate data as information is required of ADV candidates through application.

A description of the content and objective of the data/evidence collection

Addressing this component requires collecting data from our program completers and their site evaluators. The collection of employer and alumni data and its application to program completion and to clinical field work using collected data that is read, understood, and communicated.

Tags for Data evidence A.2.1

TIMELINE AND RESOURCES

Strategies, steps, and a schedule for collection through full implementation, and an indication of what is to be available by the time of the site visit.

ADV courses collect and use data in practicums, internships, projects, and case studies. Emphasis on specific stakeholder data from surveys that provide evidence stakeholders are contributing to the shared responsibility and co-construction for programmatic change. Discussion with ADV program faculty for a strategy to collect additional specific input from stakeholders; i.e., this statement from School Counseling speaks of faculty input but not stakeholder (site supervisor) input: faculty teaching this practicum course specifically address collaborative elements of professional practice, and students more thoroughly address these competencies than has been the case before the introduction of the 100-hour practicum experience. For the site visit: SOL report data for ADV programs, other data sources that supplement surveys, and faculty interviews.

A description of the personnel, technology, and other resources available;

Aug.-Sept. 2020: review of current data gathering tools, review of the SOL reports. October 2020: ADV program faculty interviews. January 2021: review collection strategies and edit surveys. May 2021: review data and discuss relevance of data collection.

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Survey tools in use now have documented evidence of surveys’ readability, content, question quality, and comprehensiveness. This step ensures survey validity.

DATA QUALITY

Steps that will be taken to attain a representative response

Review the current list of sample representatives, established using the contact lists of candidates and site supervisors already in use. Follow-up calls and emails to assure response. Updates of representatives occur annually as candidates or sites change.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Department representation at assessment committee data share events.

(1) What evidence is available that clinical experiences are developmental in nature?

EPP Response: Please see the table Clinical experiences are developmental in nature, ADV programs.

EPP Response: PDE guidelines for field/clinical experiences define the clinical experiences as four stages of field experience, including student teaching. Each stage is progressively more intensive and requires the candidate to gradually assume more responsibility. Stages 1 and 2 are Observation/Exploration; they are not separately defined in the ADV programs, as the candidates are graduates with documented experiences in observation and exploration, evidenced by an earned bachelor's degree, licensure as a nurse, teaching experience (Reading Specialist, Principal and Supervisor), or a psychology bachelor's or master's degree. Stage 3 is Pre-student Teaching or Internship, and Stage 4 is Student Teaching or Internship.

Clinical experiences are developmental in nature ADV programs.

Table columns: ADV Program (with description of the experience), # of hours, Supervision, Assessment or Product.

Stage 1, 2

Counselor: Pre-Internship Portfolio, reflections of specific observations, interviews with educational professionals and other activities that acclimate them to the world of school, various stakeholder groups, and the profession of school counseling. Hours: 60. Supervision: Counseling Faculty. Assessment or Product: Portfolio, interview notes, professional association membership.

Rdg Spec: Classroom-Based Exploration - TRY IT OUT! Candidates not teaching would observe for a half-day in an elementary, middle, or high school classroom and 1. briefly describe the lessons observed; 2. discuss the things observed in the lessons, in the classroom, or in the school that seem to “fit” with the concepts we are reading and talking about in this course; 3. discuss the things you saw in the lessons, in the classroom, or in the school that seemed NOT to “fit” with the concepts we are reading and talking about in the course. Hours: 30. Supervision: Reading Faculty. Assessment or Product: A readability study, writing activities, and a research/project-based possibility.

Prin/Supervisor: EDLD 614, 5-year improvement plan under a problem-based learning experience model, vision statement. Hours: 60. Supervision: Course Faculty. Assessment or Product: Improvement plan, vision statement.

Nurse: Field experience in a public-school urban classroom. Hours: 16. Supervision: Course faculty supervisor. Assessment or Product: The mentor teacher rates the teacher candidates in the following categories: 1) Communicates Professionally, 2) Demonstrates Professional Growth, 3) Demonstrates Professional Relationships, 4) Exhibits Attributes Suitable to the Profession, 5) Displays Responsible and Ethical Behavior, 6) Academic Readiness, 7) Field Evaluation: Professional Competencies, 8) Field Evaluation: Instructional Competencies, and 9) Field Evaluation: Dispositions.

Sch Psy: The Pre-practicum encompasses Stages one and two of field experiences. Students produce a portfolio that includes reflections of specific observations, interviews with educational professionals and other activities that acclimate them to the world of school and its stakeholders. Twice a year, students submit their Portfolios, which are reviewed and evaluated by faculty using a rubric. In their portfolio, students collect artifacts including samples of psychological reports and workshop attendance. Assessment or Product: Portfolio Review meeting, Work Samples.

Stag

e 3

Counselor Practicum: candidate's weekly writings describing progress towards goals, logs, and field supervisor evaluation,

100 Shared School Site Supervisor/Faculty

Weekly logs, Supervisor Eval

Rdg Spec Coaching cycle, observation of a literacy lesson using an observation tool of choice, three methodological recommendations formulated following the observation, formulation of a single recommendation to be discussed, a pre-coaching discussion with the classroom teacher, a demonstration lesson or co-teaching, and a follow-up discussion.

20 School site and MU faculty supervisor

Demonstration lesson or co-teaching and follow-up notes from teacher and supervisor

Prin/ Supervisor

The internship experience is conducted over two semesters, EDSU 700, policy proposal,

180 School Site Assigned Supervisor

Policy proposal, supervision rubric completed by site supervisor and course faculty, 5-year Problem Based project

Nurse Combined with stage 4 Combined with stage 4

Combined with stage 4

Combined with stage 4

Sch Psy Practicum in School Psychology Evaluate the student’s performance based on where they are in their training. It will become a part of the student’s record and may be considered in assigning grades for the course. In your evaluation please consider the student’s work and involvement in university assignments, professional development, agency involvement, and school psychology opportunities provided by the school district.

120 School Psy faculty Site supervisor assessment

Stag

e 4 Counselor Internship: presentation of data collection

projects/core competency project and field supervisor evaluation

420 School Site Assigned Supervisor

Presentations, Competency Project, Supervisor Evaluations

Rdg Spec Coaching Practicum is equivalent to student teaching

160 Supervisor course instructor

Supervisor Evaluation

Prin/Supervisor The internship experience is conducted over two semesters, EDSU 700. six cycles of clinical supervision. They select three teachers, one each at the elementary, middle, and high school levels, and observe each teacher twice.

180 On site administrator, faculty supervisor

Rubric, Clinical Supervision Internship, supervisor evaluations submitted to the PEU assessment system. Rubric

Nurse Clinical assignment school health office of a public school

100 School supervisor, program supervisor

Supervisor evaluations

Sch Psy Two methods of Assessment, a “Case Study” and an “Exit Assessment” are conducted by Faculty during internship. The case study is reviewed at various points during the Internship to monitor and support each candidate’s progress. It is evaluated at the end of internship. The “Exit assessment,” is based on the Case Study and is conducted by faculty at the end of the internship.

1200 Sch Psy faculty and site supervisor

Case study Exit Assessment

(2) What evidence is available of candidates' application of content knowledge in the field?

The EPP provides some limited data measuring candidate knowledge and skills. Not all programs have evidence of proprietary licensure examinations or other content specific data.

EPP Response: Please see table below and plan A.2.1 A

(4) What evidence besides grades is available for candidate content knowledge? More than just grades is expected for content knowledge assessments.

EPP Response: Please see the table below; ADV candidates take the Praxis Content Tests. The Principal and Supervisor content tests are the same, as is some of the course work. The Supervisor license/certification is earned while in the Educational Leadership program. This course of study provides a master's degree in education, a K-12 Principal certificate, and a K-12 Supervisor of Curriculum and Instruction certificate. Students holding a master's degree may pursue the principal certificate and supervisory certificate separately. This program has been accredited by the National Council for the Accreditation of Teacher Education and the Educational Leadership Constituent Council.

ADV Program A.2.2 B: Proprietary Assessment (Praxis) 2018-2019

ADV Program | Praxis Test | Content Knowledge Pass Rate | # Test Takers
Counselor | 5421 | 100% | 8
Rdg Spec | 5301 | 100% | 6
Prin | 5412 | 100% | 4
Nurse | No test | * | *
Sch Psy | 5402 | 100% | 15
Supervisor | 5412 | 100% | 3

School Nurse candidates must hold a Bachelor of Science in Nursing (BSN) degree and current licensure as a registered nurse in Pennsylvania. The Pennsylvania State Department of Education does not require a proprietary license exam due to the requirement of a Nursing license for entry into the program.

Knowledge of Content area other than the proprietary licensure exam: see Exhibit A1.1 C Application of Data Literacy, application of content knowledge in the field. Also see table titled Collaborative Meetings with Partners A. 2.1 (Evidence submitted to State Department Major Review and shared in Banner, D2L EDHS assessment files): The Supervisor license/certification is earned while in the Educational Leadership program. This course of study provides a Master’s degree in education, a K-12 Principal certificate, and a K-12 Supervisor of Curriculum and Instruction certificate. Students holding a master’s degree may pursue the principal certificate and supervisory certificate separately. This program has been accredited by the National Council for the Accreditation of Teacher Education and the Educational Leadership Constituent Council.

Detailed data reports are too large to upload into this addendum. The AIMS SPA reports can be accessed for data and analysis as indicated in the table below. Other specific data details can be shared at site visit in Nov. 2020.

Knowledge of Content area other than the proprietary licensure exam table A.2.2 B

ADV Program/Assessment Title

Analysis

RDED 621: Foundations of Reading Field-Based Investigations Assessment of Content Knowledge in Reading Education.

Analysis of Data Findings: Summary results for the 21 candidates assessed on this measure during Fall 2017 and Summer 2018 (sections 50A & 50B) indicate that one candidate performed at a developing level; three performed at a proficient level; and 17 students demonstrated exemplary performance. All 21 students performed satisfactorily, with a mean range of 3.27 to 4.0. Detailed data analysis is in the AIMS SPA report.

Principal/Supervisor EDSU 799 (see data for this course under SPED 799, A.2.2 question 3)

Practicum Completion: All completed activities require documentation. Evidence of completion will be submitted to the MU university supervisor in the form of an organized, well-labeled notebook. All work should be completed and submitted prior to the final presentation of your program portfolio. A log of all hours in the internships and in previous school-based experiences should be kept, with date and outcome. Your onsite supervisor will complete an evaluation of your internship. Your university supervisor will submit your Pass/Fail grade and will complete the assessment of your notebook according to the attached rubric. In the Clinical Supervision Practicum, nearly one hundred percent of program candidates perform at the exemplary or proficient levels. This Practicum is designed to provide ongoing MU supervision through the internship in addition to the on-site administrator supervision. Detailed data are contained in the AIMS SPA report.

Sch Psy Practicum Supervisor Evaluation

The overall ratings for school psychology competencies are 3.61 for the 2017 (n=9) cohort, and 3.75 (n=16) for the 2018 cohort. The scores for both groups show that many students fell within the range of 3.00 to 4.00 and indicate proficient or exemplary demonstration of the competencies being evaluated. Table 3.1 (a) shows that levels of attainment are seen in all areas assessed for the 2017 and 2018 cohorts respectively: Professional Dispositions (3.47; 3.56), school psychology competencies (3.31; 3.42), Basic work requirements (3.52; 3.71), and response to supervision (3.71; 3.80). Detailed data analysis in AIMS SPA report

Nurse NURS 560: School Nursing Clinical Performance Outcomes Graduates Survey

The Post-Bacc SN Certification students who have completed the program were queried as to "how well the program prepared them for the requirements of the position." The Qualtrics survey tool was used to query 37 individuals who completed the SN Cert program from 2010-2015. There was a 51% response rate, with a final total of 19 surveys returned. The question "How well did it prepare you to work with..." queried the following categories: in the role of CSN; in NASN Standards of Practice; working with diverse students with special health needs; working with English Language Learners; and working with the new technology in the specialty. Overall, responses rated the MU School Nurse Cert program from extremely well to moderately well in the five categories, although most responses were between extremely and very well. The category with a lower rating was the technology in the specialty. Here 15 of the 19 responses rated the program very to moderately well, with one rating of 'not well at all.' This was a time when school districts were acquiring electronic school health records. There are many electronic record packages, and each school district has a different one. This makes it extremely difficult to teach one type when it might not be the one used by a given school district. For the past two years, this has not been as much of a problem, since many students have some experience prior to the course, NURS 560, if they have substituted in a district or worked as a school health assistant. The basic understanding of electronic records is reviewed during the legal aspects of school nursing (NURS 560), but the specifics of how each program works cannot be recreated in the classroom. Students are encouraged in the practicum to gain as much experience as possible in the setting with the CSN Site Supervisor. Detailed data will be shared with the site leader in Nov 2020.

Counselor Fortunately, the recent inclusion of a 100-hour practicum experience has provided a meaningful, interactive and supervised experience to address these areas through multiple pedagogical and professional channels. Faculty teaching this practicum course specifically address collaborative elements of professional practice, and students more thoroughly address these competencies than has been the case before the introduction of the 100-hour practicum experience. Finally, these areas are addressed in core competency projects which students submit as a capstone to the internship experience. Analyzing this perception data will allow faculty to be more strategic about including opportunities to develop these collaborative skills during the development, implementation, and evaluation of the core competency project.

Scroll down for continuation of table

Collaborative Meetings with Partners A.2.1 (including how many times they meet); Content Knowledge of ADV Candidates used in the field A.2.1, A.2.2; Evidence of monitoring of candidates and providing support A.1.1. Evidence submitted to State Major Review and posted in the D2L learning management system for annual data retreats and program department meetings.

Counseling

Outcomes from Collaboration Meetings with Site Partners,

During practicum and the year prior to internship, students in the program are required to develop goals that are specific to their strengths and needs. These goals are co-created in collaboration with both the university faculty member and the site supervisor. During the University Field Supervisor meetings, progress towards internship and practicum goals is discussed and evaluated. In addition, the Field Supervisor Evaluation is used as a discussion tool to determine the extent to which the student is demonstrating mastery of competencies. Areas in need of improvement are addressed through specific remediation plans. These plans are monitored multiple times throughout the year to ensure that the student has made adequate growth. If goals are not met, alternative options are discussed, such as an extended internship, adjustment of internship site and/or supervisor, etc. The site supervisor has access to all forms used for evaluation for feedback, input, or changes. University faculty collaborate with site supervisors regularly throughout the semester through a number of means, including face-to-face meetings on site, face-to-face meetings off site, phone calls, emails, and other forms of computer-mediated communication.

Impact on Student Learning/Application of Content in the Field for ADV Program Candidates

While the role of the professional school counselor as defined by professional organizations and associations (e.g., ASCA, PSCA, etc.) is partially to support the academic mission of the school, the PSC is variably tasked with attending to students' career development and personal/social needs, which have indirect effects on classroom student achievement. Thus, the goals that candidates create during practicum and internship do not necessarily directly address classroom student achievement. However, these goals are co-constructed with field supervisors, and management agreements are signed by administration to ensure that the broader goals reflect the academic and general mission of the school district. After creating goals based on needs assessments (and other on-site data collection mechanisms), the candidates, in collaboration with university faculty and the site supervisor, work to establish evidence-based practices (EBPs) to meet the specific needs of the group. Candidates work to develop, implement, and evaluate the interventions by exploring process, perception, and outcome data.

Principal Outcomes from Collaboration Meetings with Site Partners, including how often meetings occur

The internship experiences are individualized to a candidate's current needs in collaboration with the placement's needs. This enhances the candidates' understanding of the supervisor's role, which truly is to provide what's needed for the teachers of a school to create better learning environments for students. Excerpt from the handbook: The MU supervisor will meet with you to review 1) your EDLD 798 and EDLD 799 timeline for completion and 2) your proposal for meeting the guidelines for EDLD 798 and EDLD 799. These guidelines and rubrics are in your Leadership Candidate Manual. The supervisor will meet with you and your on-site cooperating mentor at least once on-site at the beginning of the internship and one to two other times at the university or on-site, to review your progress and also to discuss your completed work. These visits vary depending on whether you are doing the internship in the Fall or Spring semester or during a shortened Summer term. Therefore, each semester there is an initial meeting, a final assessment meeting between the university supervisor and site supervisor, and one or two other meetings.

Impact on Student Learning/Application of Content in the Field for ADV Program Candidates

Through projects focused on curriculum revision specifically, classroom student achievement around the PA Core standards gains the attention of the intern candidates. The supervisor then evaluates the curriculum revision product produced by the intern candidate.

Reading Specialist Outcomes from Collaboration Meetings with Site Partners, including how often meetings occur

The clinical faculty within the program have built collaborative partnerships with school and district administrators, principals, classroom teachers, and with specialized literacy professionals who recommend students for the practicum experience based upon need as indicated by assessment data obtained. Candidates interact with the students' families and the children's teachers and/or specialized literacy professionals to inform and extend their experiences with appropriate planning, assessment, and targeted instruction. The practicum experience includes various experiences related to adult learning and leadership in which candidates engage in continuous learning on 1) how to lead and engage adults in professional development, 2) data-based decision-making, and 3) improving literacy practices and outcomes for all learners by participating in professional learning communities and serving as a resource. Interns are formally evaluated each time that they are directly observed, typically on three separate occasions. They are evaluated toward the conclusion of the practicum experience using the clinician performance rubric, a summative evaluation. Additionally, candidates are evaluated each time that they have the responsibility for initiating a small group literature response activity, and upon their delivery of a staff development session of their own design.

Impact on Student Learning/Application of Content in the Field for ADV Program Candidates

The faculty field site supervisor evaluates the reading specialist candidates on multiple measures in order to document student achievement. These include direct observation of performance, captured by an observational tool, the evaluation of the case report completed on each participant, and the completion of the clinician performance rubric. Interns conference with their faculty supervisor 1) to review their interpretation of the assessment data obtained, 2) following each formal observation, and 3) as necessary to discuss each section of the summative case report that they prepare for each individual student experiencing difficulties and for whom they have the responsibility for instruction.

Collaborative Meetings with Partners 2.1: EPP Response: see table above for how many times collaborative meetings occur and the content of those meetings.

Evidence of Content Knowledge in the Field A.1.1 C: EPP Response: Please see table above titled Collaborative Meetings with Partners A. 2.1 (including how many times they meet) Content Knowledge of ADV Candidates used in the field A.2.1, A.2.2. C

Sch Nurse Outcomes from Collaboration Meetings with Site Partners, including how often meetings occur

There is always good collaboration between the CSN Preceptor and the supervising faculty. Beyond the two visits, phone calls or emails are exchanged about how the student is doing or about issues with the student that need to be addressed for a successful practicum. Additional visits are made to the onsite Practicum location if needed. The SN Cert students are met 1-2 times in the field, as is the student's CSN Preceptor. A review of the competencies and assignments is completed, as well as a discussion of the status of the rotation.

Impact on Student Learning/Application of Content in the Field for ADV Program Candidates

The CSN Site Supervisor is with the SN Cert student in the health office when he/she assesses any school student going there for a health need. The CSN Site Supervisor sees the assessment and interventions provided to the school youth in the health office. The formal summative evaluation done by the CSN Site Supervisor is the 'Dispositions Form,' but when the Faculty site supervisor visits, verbal input is provided to the faculty member as to the SN Cert student's performance in communication (all age levels and parents), assessment, appropriate interventions, documentation, teaching, and follow-up. The Faculty site supervisor uses this information in the midterm evaluation of progress meeting.

School Psy Outcomes from Collaboration Meetings with Site Partners, including how often meetings occur

During practicum and the year prior to internship, students in the program are required to develop goals that are specific to their strengths and needs. During the University Field Supervisor visits, progress towards internship goals is discussed and evaluated. In addition, the Field Supervisor Evaluation is used as a discussion tool to determine the extent to which the student is demonstrating mastery of competencies. Areas in need of improvement are addressed through specific remediation plans. These plans are monitored multiple times throughout the year to ensure that the student has made adequate growth. In the event that goals are not met, alternative options are discussed, such as an extended internship, adjustment of internship site and/or supervisor, etc. The Faculty Field Supervisor meets with interns as a group once every other week during each of the two semesters that comprise the internship. In addition, the faculty member meets with each intern individually twice a semester.

Impact on Student Learning/Application of Content in the Field for ADV Program Candidates

Field Supervisors evaluate School Psychology interns during their 1,200-hour internship on activities and competencies relevant to the roles and functions of School Psychologists. These competencies are based on the standards for training and practice set forth by the National Association of School Psychologists (NASP). Candidates in the School Psychology program are required to develop and implement evidence-based interventions with children presenting academic and behavioral challenges across multiple tiers. The successful implementation and results of these interventions are facilitated and overseen by the Site Supervisor and evaluated by the Faculty Field Supervisor.

Supervisor Outcomes from Collaboration Meetings with Site Partners, including how often meetings occur

The internship experiences are individualized to a candidate's current needs in collaboration with the placement's needs. This enhances the candidates' understanding of the supervisor's role, which truly is to provide what's needed for the teachers of a school to create better learning environments for students. Excerpt from the handbook: The MU supervisor will meet with you to review 1) your 799 Part I and 799 Part II timeline for completion and 2) your proposal for meeting the guidelines for EDSU 799 Part I and/or EDSU 799 Part II. These guidelines and rubrics are in your Supervisory Candidate Manual. The supervisor will meet with you and your on-site cooperating mentor at least once on-site at the beginning of the internship and one to two other times, at the university or on-site, to review your progress and also to discuss your completed work. These visits vary depending on whether you are doing the internship in the Fall or Spring semester or during a shortened Summer term. Therefore, each semester there is an initial meeting, a final assessment meeting between the university supervisor and site supervisor, and one or two other meetings.

Impact on Student Learning/Application of Content in the Field for ADV Program Candidates

Through projects focused on curriculum revision specifically, classroom student achievement around the PA Core standards gains the attention of the intern candidates. The supervisor then evaluates the curriculum revision product produced by the intern candidate.

There was no evidence provided that the EPP maintains mutually beneficial, functioning partnerships (A.2.2). EPP Response: The table above documents our ADV programs' interaction with stakeholders and the impact on the partners and their students.

EPP Response: Please see evidence of mutually beneficial, functioning partnerships cited in the tables for A.2.1 A and A.2.2 C. The files in this attachment describe the process for collecting Local Education Agency (LEA) feedback on their collaboration with program faculty and how the feedback is used to improve the program. Usually the LEA feedback is collected through surveys or advisory council queries. The ELCC survey indicates the data collected are for NCATE. This error is corrected for survey data launched in 2020, as indicated by the data included in the attachment using ELCC standards. Additional evidence that the EPP maintains mutually beneficial, functioning partnerships can be seen in the attachments titled ADV Program Survey and Analysis and Content Testing ADV Programs other than the proprietary licensure exam table, and in A.2.2 question number 1: Clinical experiences are developmental in nature.

(3) What data are available for the SPED 799 assessments and the mentor assessments?

EPP Response: The assessment name and course name are EDSU 799, Special Education Supervisor. We apologize for including a misleading assessment name. Please see requested data below.

ASSESSMENT FREQUENCY REPORT

Portfolio Project Field Component 799-Obj3: a. Induction b. Field Component 799-Obj5: Internship/ Active Research

Cumulative Semesters

MAJOR(S): PMCERT Supervisory Certification in Special Education

Semester | Frequency Total | Exemplary (4) | Meets Proficiency (3) | Partial Failure to Meet Proficiency (2) | Unsatisfactory (1) | No Score (0) | Average Score
Spring 2020 | 2 | 0% | 100% | 0% | 0% | 0% | 3
Fall 2019 | 1 | 100% | 0% | 0% | 0% | 0% | 4
Spring 2019 | 1 | 100% | 0% | 0% | 0% | 0% | 4

ASSESSMENT FREQUENCY REPORT

Portfolio Project: Component 681-Obj7. Special Education Administration

Cumulative Semesters

MAJOR(S): MED Leadership for Teaching and Learning, PMCERT Supervisory Certification in Special Education

Semester | Frequency Total | Exemplary (4) | Meets Proficiency (3) | Partial Failure to Meet Proficiency (2) | Unsatisfactory (1) | No Score (0) | Average Score
Fall 2013 | 1 | 0% | 100% | 0% | 0% | 0% | 3
Summer 2 2013 | 3 | 0% | 100% | 0% | 0% | 0% | 3
Spring 2013 | 1 | 100% | 0% | 0% | 0% | 0% | 4

Exhibit A.2.1 A Partnerships for clinical preparation can follow a range of forms, participants, and functions. Please see the table titled Outcomes of Collaboration Meetings for the process already in place. Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Mutually agreeable expectations between site and ADV programs for entry, preparation, and exit; ensure that theory and practice are linked; maintain coherence across clinical and academic components of preparation; and share accountability for advanced program candidate outcomes.

A description of the content and objective of the data/evidence collection

Data and evidence are shared in D2L in state major review files for all unit faculty to access during unit data retreats

Tags for data/evidence

A2.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time of the site visit

Meeting agendas and minutes are collected and shared on the M drive. November 2020: share with the site team the created sites and the past year's evidence of meetings with site partners that resulted in programmatic changes. December 2020-January 2021: review sites and current evidence uploaded, and garner feedback from partners. May 2021: provide evidence of partner input and possible program changes.

A description of the personnel, technology, and other resources available;

D2L site uploading of current evidence by graduate assistants. The Associate Dean of EDHS acts as consultant for evidence gathering and review. The Assessment Committee provides feedback and sharing of initial program use of feedback from partner meetings (i.e., PDS annual meetings).

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Assessment committee member review, random selection of work group members for this purpose, provide feedback on partner co-construction and shared responsibility evidence gathering. Pilot of survey or any other form that may be developed to assure readability, content, question quality, and comprehensiveness.

Steps that will be taken to attain a representative response

Purposeful sampling using registration of candidates in ADV programs gathered through Banner system.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual review in departments as recorded in minutes and agendas. Attendance records are kept. Agendas or minutes must include site partner input into entry, monitoring, and exit practices.

Exhibit A.2.2 B: Please see the table included above, titled Knowledge of Content area other than the proprietary licensure exam. Exhibit C A.2.2: evidence that diversity of admitted and completing candidates is given explicit attention across advanced programs. Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Admitted and completing candidates represent diversity of surrounding area, placement demographics, and specialist area shortages as reported by state department. During internships and practicums candidates are assessed and monitored across all ADV programs. Please see table Outcomes of Collaboration meetings in A.2.1

A description of the content and objective of the data/evidence collection

Monitoring of diverse candidates is given explicit attention across all programs as part of the monitoring of ADV candidates at their sites. During the University Field Supervisor meetings, progress towards internship and practicum goals are discussed and evaluated. In addition, the Field Supervisor Evaluation is used as a discussion tool to determine the extent to which the student is demonstrating mastery of competencies. Areas in need of improvement are addressed through specific remediation plans. These plans are monitored multiple times throughout the year to ensure that the student has made adequate growth. In the event that goals are not met, alternative options are discussed, such as an extended internship, adjustment of internship site and/or supervisor, etc.

Tags for data/evidence

A.2.2

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time of the site visit

July 2020-December 2020: continue to collect data through the supervisor meetings. September 2020: post remediation plans on the D2L site developed for ADV programs. November 2020: share remediation plans with the site visit team.

A description of the personnel, technology, and other resources available;

Graduate assistants or faculty to upload remediation plans currently used or ones developed to D2L site for ADV remediation plans.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Remediation plans currently used have established readability, content, question quality, and comprehensiveness, as they have been in use and reviewed by site partners and ADV faculty for more than 3 semesters.

Steps that will be taken to attain a representative response

ADV candidates with active registration status in Banner, D2L courses, will be accessed.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual data review within departments and unit data retreat review.

Exhibit A.2.2 C Knowledge of content applied in the field Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Culminating experiences in which candidates demonstrate their proficiencies through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action) that are characteristic of their professional specialization as detailed in component A.1.1.

A description of the content and objective of the data/evidence collection

Each ADV program has intentional outcomes for application of content knowledge in the field. The outcomes are included in the state Major review posted in the D2L learning management system for unit data review. Also please see table in this standard titled Content Knowledge of ADV Candidates used in the field

Tags for data/evidence

A.1.1, A.2.2

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time of the site visit

September 2020-November 2020: all major review files will be reviewed, updated, and filed in D2L for annual data review. November 2020: major review files shared with the site visit team. April 2021: SLO reports, due annually, completed and filed with the University Office of Institutional Research. May 2021: review annual alumni and employer survey data collected by the University Office of Institutional Research.

A description of the personnel, technology, and other resources available;

Graduate assistants to review data in Major Review files. ADV faculty modify any collaborative measures with site partners.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

State major review queries are a proprietary assessment measure. All major review files are revised and evaluated by three groups of independent reviewers and verified by the state department of education.

Steps that will be taken to attain a representative response

All ADV candidates with an active registration in course work as verified through Banner and D2L learning management systems will be included.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual review and data retreats with program modifications.

A.3 and A.4 Phase-in responses and Phase-in Plan

EPP Response: How do we plan to reach a 20% response rate?

• Use multiple methods: online survey, phone survey, passive commercial social-media scan
• Individual online invitations sent from department chairs rather than a "faceless" bureaucrat
• Multiple, staggered invitation times
• Communicate the importance of confidentiality
• Make your call-to-action very clear
• Pay careful attention to design
• Launch survey at same time annually

In addition to this table please see attachment titled ADV Content testing ADV Programs knowledge linked to standards by program and attachment titled A.1.1 Proficiency Matrix (CAEP six essential components with evidence collected by each ADV program):

ADV Program/SPA

Rdg Specialist ILA/IRA

Alignment of the Assessment with Specific SPA Standards: IRA 1.1, 1.2, 1.3, 2.1, 2.2, 2.3, 3.1, 3.2, 3.3, 3.4, 6.1, 6.2, 6.3. The licensure examination, Praxis II, assesses elements of each ILA standard. The Praxis exam clearly requires candidates to understand the theoretical and evidence-based foundations of reading and writing processes and instruction (Foundational Knowledge, Standard 1), the application of that knowledge in designing curriculum and instruction to support student learning (Standard 2), the application of one's foundational knowledge in diagnosing and assessing appropriately (Standard 3), and the need to recognize and facilitate professional learning and leadership as a career-long effort (Standard 6). During the 2015-2016, 2016-2017, and 2017-2018 academic years, 20 students enrolled within the program scored at or above the criterion for licensure established by the PDE, a score of 164, for a 100% pass rate each year.

Sch Psy NASP

NASP Domains are reflected in the Praxis II exam. The scores obtained, and the average percentages obtained by students in the different categories tested by Praxis II for three reported cohorts, show that one hundred percent (100%) of the students have passed the Praxis II on the first attempt. In 2016, one hundred percent (100%) of the interns (n=10) passed the Praxis exam with an average performance of 174 (range of 160-192). One hundred percent (100%) of the 2017 interns (n=10) passed the Praxis with an average performance of 178 (range = 170-190). In 2018 (n=9), one hundred percent (100%) of the students who took the exam passed, with an average performance of 175 (range = 168-183).

Sch Nurse NCBSN

Candidates take the National Council Licensure Examination-Registered Nurse (NCLEX-RN) exam, which is administered by the National Council of State Boards of Nursing (NCBSN). All candidates accepted into the School Nurse program must pass the test and hold a Registered Nurse license. The RN license and baccalaureate degree are verified by Millersville University's graduate admissions.

Sch Counselor NCBSN

Data from 2013-2019 reveal school counseling candidates had a pass rate above 90% on the Praxis content-specific exams. The school counseling program provides in-depth content knowledge and practical clinical experiences which deepen our candidates' understanding of children and how to provide a delivery of services that is above average. Our category scores also exceed the state and national averages. For example, in 2018-19 the average score for Millersville students was 180; the average for the state of Pennsylvania was 177, and the national average was 172. ETS recommends against interpreting subscale scores and using them for decision-making because of the lower reliability of these scores. The category scores outpace the national and state percentages and reflect the strength of our candidates on content tests.

Principal ELCC

The Praxis exam is an overall examination of a candidate's ability to be an effective school-level administrator. As such, it meets all ELCC standards. Specifically, the assessment provides information about the candidate in regard to: 1. facilitating the development of a vision of learning, 2. applying best practice and designing comprehensive professional growth, 3. managing organization and resources, 4. collaborating, 5. acting ethically, 6. responding to the larger social context, and 7. completing an internship. This test is intended to assess a candidate's knowledge of the functions of an administrator or supervisor, including background information needed to complete these functions. The examination is intended primarily for those who are candidates for a Master's degree or who already possess a Master's degree and are seeking first appointments as administrators or supervisors. This assessment instrument reflects the most current research and professional judgment and experience of educators across the country. The test is designed to capture what is essential about the role of school leader – what makes the difference in whether a school community can provide experiences that ensure all students' success. The assessment covers six content areas: Vision and Goals (20% of total test content), Teaching and Learning (20% of the total), Managing Organization Systems and Safety (10%), Collaborating with Key Stakeholders (15%), Ethics and Integrity (15%), and The Education System (10%).

Supervisor: ADV supervisor candidates are a subset of the Educational Leadership program, where candidates seek certification as a Supervisor, Principal, or Superintendent. There have been no candidates completing this program for the past 7 years. The Supervisor program requires the same courses as the Principal program. The 100% pass rate among first-time test-takers for ADV supervisor candidates from the Principal program is strong external evidence of the strength of the program. Average scaled scores for Millersville students are above both national and state averages for each of the last three years. Sub-scores for modules are above national averages for every category and every year except module 6, The Educational System. We are particularly proud that our students score substantially above state and national averages for the modules on Leadership and Vision, Managing Organizational Systems, and Ethics and Integrity. On the Ethics and Integrity module, for example, Millersville students averaged 94, 85, and 73 for the last three years compared with national averages of 75, 70, and 70. Considering the growing recognition of the importance of and need for direct instruction on ethics, this is very positive evidence for candidates from our program.

Please scroll down for continuation of plan

Please see the table above for analysis and alignment of content tests and National Assessment Associations for each ADV program. Also access the attachment titled Content Testing ADV Programs: Content test knowledge linked to standards by program.

ADV Program | Proprietary Assessment (Praxis) 2018-2019 | Content Knowledge Pass Rate | # Test Takers | Proprietary Assessment (Praxis) 2019-2020 | Content Knowledge Pass Rate | # Test Takers
Counselor | 5421 | 100% | 8 | 5421 | 100% | 8
Rdg Spec | 5301 | 100% | 16 | 5301 | 100% | 6
Prin | 5412 | 100% | 4 | 5412 | 92.86% | 5
Nurse | No PA state test such as Praxis required. School nurse candidates must take the National Council Licensure Examination-Registered Nurse (NCLEX-RN) exam that is administered by the National Council of State Boards of Nursing (NCBSN). | | | | |
Sch Psy | 5402 | 100% | 15 | 5402 | 100% | 8
Supervisor | 5412 | 100% | 3 | 5412 | 100% | 2

ADV Phase-in Plans for standards A.3 and A.4

Exhibit A.3.1 A Recruitment of candidates into advanced programs. Monitoring of either employment outlooks for advanced program areas or progress for candidate recruitment into program areas. Use of technology and stakeholders to increase co-construction of recruitment efforts. Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Evidence of adjustments to EPP recruitment practices based on self-monitoring of ADV program progress or success. ADV program data have not been consistently collected by the unit except for SPA and PDE state major review. During practicum and the year prior to internship, students in the program are required to develop goals that are specific to their strengths and needs. These goals are co-created in collaboration with both the university faculty member and the site supervisor. During the University Field Supervisor meetings, progress towards internship and practicum goals is discussed and evaluated. In addition, the Field Supervisor Evaluation is used as a discussion tool to determine the extent to which the student is demonstrating mastery of competencies. Areas in need of improvement are addressed through specific remediation plans. These plans are monitored multiple times throughout the year to ensure the student has made adequate growth. If goals are not met, alternative options are discussed, such as an extended internship, adjustment of internship site and/or supervisor, etc.

A description of the content and objective of the data/evidence collection

Student progress is evaluated in internships and practicums for all enrolled ADV students using a remediation plan, see above. This plan is not standardized across all ADV programs but incorporates the evaluation of both the academic and clinical performance of the candidate. The currently used remediation plans are created alongside site coordinators and supervisors. Any feedback that suggests programmatic changes will become part of the annual review, or affirmation, of the current programmatic requirements and will be included in unit data retreats. We plan to collaborate across all ADV programs to make the remediation plan uniform for all ADV programs. Data-driven program modifications, with included citations for site partners, posted on ADV program web pages will inform potential candidates of how site partners increase the skill of candidates through support and guidance. Publicizing programmatic changes made using the feedback from sites in districts with teacher shortages or other identified needs potentially increases recruitment potential for ADV programs. Additionally, a collaboration among the ADV program faculty and coordinators to create a uniform remediation plan will increase communication and the intent of purposeful support for candidates across all programs. The Associate Dean of EDHS plans to gather the ADV program faculty in September 2020 to collaboratively create a Professional Development Plan like the one used in the INT programs. Please see attachment Sample PDP Plans. (SCROLL down for continuation of plan) We plan to increase the Hybrid/distance options advertised in recruiting for course delivery. Making this option more publicized will attract a broad range of diverse individuals who are not necessarily close in geographic proximity to the campus.

ADV programs will continue to make efforts at state conferences to recruit diverse candidates to their graduate programs. Data will be shared along with program improvements. ADV programs will continue to offer courses on site in school districts with underrepresented populations as a recruitment and best-practices tool.

Tags for data/evidence

A.3.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time of the site visit

Beginning fall 2019, the Millersville main web pages have been re-designed to be a recruiting tool. Pages have intentionally provided information for prospective students as a way of recruiting. In the future we plan for the ADV programs to continue to include specific program requirements for admission to the program (i.e., for Sch Nurse a copy of the Nursing License is required), information about licensure and how to obtain a PDE license, a course descriptions link, and the professional organizations the programs' candidates use to their advantage for employment: https://www.millersville.edu/psychology/graduate-programs-webpages/index.php. September-October 2020: we plan to include language in responses to individual student inquiries about flexible program options; ADV programs will continue and increase efforts by program faculty to engage practitioners in a discussion about the needs of the field (teacher shortage or ADV program positions such as Counselor); increased data will be posted in D2L for advanced program tracking. November 2020: the relevant program policies and evaluation forms are provided as supporting documentation; we also plan to provide a sample Performance Improvement Plan, which will provide evidence of how support is offered for students who do not demonstrate adequate progress; continue practices in place in the attachment titled ADV Recruitment and ADV Program Support for Graduates.

A description of the personnel, technology, and other resources available;

Millersville’s web page developers for content of program pages; IT personnel. Graduate assistants monitor recruitment information on program pages. Site partners’ survey includes developing survey questions to receive input into recruitment and monitoring of student progress. Dean’s office support of attendance at conferences for faculty.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Since ADV programs have an established pattern of surveying graduates and partners, the surveys have established readability, content, and written quality of questions that support validity. A thorough review of the validity of any added questions will maintain the overall validity of the survey. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms. Qualtrics is normally selected as the survey tool, and its tools perform a crosstabs analysis to provide a confidence level. The ADV programs have a list of contacts on file through the Banner system and Alumni office.

Steps that will be taken to attain a representative response

A concerted attempt to maximize the response rate is planned, and we adhere to as many best-practice procedures as can be managed to make our sample as representative of the full population of graduates as possible. ADV programs plan to utilize emails, follow-up phone calls, texting, and social media. A counter installed on the program page to monitor the number of times the page is accessed would indicate page popularity for recruits and track increases or decreases in visitors to the page.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Data will be reviewed annually for PDE annual reporting and SLO reports. Recruitment data will be collected and compared to the previous year's numbers to show an increase or decrease in recruits. Monitoring of D2L posting of remediation plans is planned.

Exhibit A. 3.1 B Evidence that diversity of admitted and completing candidates is given explicit attention across advanced programs Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Please see the tab marked Admission Requirements for ADV programs in the ADV Recruitment Excel workbook. All ADV programs have access to diverse student data across the campus through the Graduate Admissions office. Additionally, the Graduate Admissions Office has an assigned staff member for recruiting ADV program candidates. Once admitted, candidates are assigned an advisor for their program and are required to come to advising sessions when candidates are completing core competency projects, portfolios, and preparations for field placements. During the University Field Supervisor meetings, progress towards internship and practicum goals is discussed and evaluated. In addition, the Field Supervisor Evaluation is used as a discussion tool to determine the extent to which the student is demonstrating mastery of competencies. Areas in need of improvement are addressed through specific remediation plans. These plans are monitored multiple times throughout the year to ensure the student has made adequate growth. If goals are not met, alternative options are discussed, such as an extended internship, adjustment of internship site and/or supervisor, etc. The remediation plan data will be transcribed in a format so data can be used across all PEU advanced programs.

A description of the content and objective of the data/evidence collection

To address this standard component, we plan to continue to use the remediation plans as a support system to evaluate student progress. We plan to develop a uniform remediation plan so evaluation data from internships and practicums can be used when the ADV programs are conducting an annual review process developed by faculty and coordinators of the programs. The standardized form will incorporate evaluation of both academic progress and candidate professional behaviors. The annual review will provide students with feedback regarding their performance and progress in relation to academic and professional expectations of the program. It will also allow program faculty to initiate a plan for supporting students who are not demonstrating adequate performance and progress.

Tags for data/evidence

A.3.1

SCROLL down for continuation of plan

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time of the site visit

September 2020: meet with ADV program coordinators and faculty to devise an annual assessment form and to modify the currently used remediation plan; decide which assessments will be included on the annual data collection form to offer the best information for students and program improvement; develop the process for ADV candidates who elect to miss a semester or not attend advising. October 2020: present a modified remediation plan or new plan to the Assessment Committee for feedback; edit the data collection plan to reflect feedback; assess whether using a modified PDP currently used in INT programs is acceptable. November 2020: discuss plans for annual data collection, or what is currently in place, during the site visit; share data collected from the past, if any; the Graduate Admissions recruitment staff member will be interviewed during the site visit. December 2020: begin to collect data from the fall semester and plan for spring 2021 data collection. January 2021: include the support and data collection plan in advising information to students and faculty; cite input of site supervisors/partners. May 2021: complete the remediation data assessment form and review. September 2021: share in the Data Day unit review for unit improvement and on the web page.

A description of the personnel, technology, and other resources available;

PEU assessment system technologies: record data using the technology system, including site supervisor access; review of annual data during the Data Day review. D2L learning management system: may be used by ADV faculty or advisors. Assessment Committee members: feedback on the form for annual data review. Department coordinators and faculty: annual review of data. INT faculty and the PDP plan currently used. Tests for reliability and validity already completed and posted in D2L.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

A selected ADV program will share its form and data collection with the assessment committee for feedback on the validity of the data collected. Access D2L for the reliability, validity, and interrater tests completed on the INT PDP plan, which meets level 3 of the CAEP assessment rubric.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Data will be reviewed at each annual Data Day assessment retreat with follow up plans for program improvement

SCROLL down for continuation of plan

Exhibit A.3.1 C

Monitoring of employment outlooks for advanced program areas or progress for candidate recruitment into program areas.

Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

ADV programs plan to monitor state census knowledge to fully understand regions of greatest teacher shortages in the state. We plan to utilize state teacher employment data identified by initial (INT) programs to identify addresses of employment opportunities in schools, districts, and/or regions.

A description of the content and objective of the data/evidence collection

To meet component 3.1 of this standard, state-published employment data will be reviewed. Additionally, employment surveys and the alumni survey data will be analyzed to determine placement of ADV candidates in areas of high teacher shortage. Site placements for ADV candidates in areas of high teacher shortage will be monitored and adjusted for a potential rise in employment in areas of teacher shortage. ADV programs will include advertisement of any tuition breaks afforded to those employed in high-teacher-shortage or state-determined high-need school districts.

Tags for evidence/data A.3.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time of the site visit

August-October 2020: begin review of state employment data and the Millersville University alumni survey data launched over the last two years to determine a baseline of where ADV candidates are employed. November 2020: discuss findings of the data review with the site team during the site visit. December 2020-January 2021: add supplemental questions to the University-launched alumni survey to ask about employment goals; advertise high-needs school district incentives during advisement and recruitment early in the ADV program; collaborate with Career Services and Experiential Learning (Internships) to publicize high-needs school district employment incentives.

A description of the personnel, technology, and other resources available;

Career Services and Experiential Learning (Internships) publication of high-needs employment incentives for teacher education. State employment data to set a benchmark for employed ADV candidates. Annual alumni and job services data through the office of institutional resources and planning.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

State data publication is proprietary survey data which meets level 3 or above of the CAEP assessment rubric.

Steps that will be taken to attain a representative response

A representative sample will be guided using the Banner course registration system for names of ADV graduates. The graduates will be found using the state reporting of hires in school districts.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Based on the areas where employed graduates are located, as furnished through the state department of education, program modifications will be made to meet the needs of state teacher shortages. If more graduates are employed in areas of greatest need, citing program changes through the ADV program web pages will provide evidence of continuous improvement to potential students and visitors to the web pages for ADV programs.

Exhibit A. 3.4 A High standard for content knowledge in the field of specialization

Relationship to Standard or Component

Requirement: Explicit link of the intended data/evidence to the standard or component it is meant to inform

A more intensive effort to invite and include ADV programs will be made in the annual review retreat, Data Days, to offer data demonstrating ongoing monitoring of candidate performance with respect to academic performance and professional behaviors, including remediation plans completed by site supervisors during clinical experiences. Part of the plan for meeting the 3.4 components is to offer evidence of supporting candidates who demonstrate inadequate progress or performance in one or more areas of assessment. ADV programs may consider use of the Professional Development Plans (PDP) currently implemented in the initial (INT) cert programs. Plans will be posted on D2L to inform ADV faculty.

A description of the content and objective of the data/evidence collection

To address this standard element, we plan to collect annual data that will provide monitoring of academics and professional behaviors. Data from the content tests taken at completion of the ADV program and the GPA used for admission to the ADV programs indicate a high level of content knowledge. See the table included at the beginning of this plan.

Tags for data/evidence

A.3.4

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September 2020 Review annual data collection currently used for ADV programs (PDP in initial programs) or develop a tool for gathering data. October 2020 Review evidence of support for candidates not making adequate progress; post plans on D2L. December 2020-January 2021 Collect data for candidates not making adequate progress; share in ADV department meetings for feedback. January-May 2021 Implement support plans, or continue plans already in use, and collect data. May 2021 Present data at the annual review retreat.

A description of the personnel, technology, and other resources available;

ADV program faculty for review and submission of data from practicum and internships, including remediation plans. D2L to post remediation plans. Graduate assistants to update program web pages. PDP plans and tests for reliability and validity of plans used in initial programs.

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Documented validity and reliability for the PDP plans in use, if that is the path the ADV programs choose to pursue. If a new plan is developed, it will be piloted with feedback from randomly selected faculty or students. Data will be collected, and the Assessment Committee or randomly selected faculty members will provide feedback to ensure the plan's readability, content, question quality, and comprehensiveness. Because ADV programs have an established pattern of surveying graduates and partners, the readability, content, and written quality of existing questions support established validity. A thorough review of the validity of any added questions will maintain the overall validity of the survey. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms.

Data Quality

Steps that will be taken to attain a representative response

Purposeful sampling will be used. Candidates not performing well will be identified through advising sessions for ADV programs. Registered candidates will be acquired through the Banner and D2L systems.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual review of data in ADV program meetings and at EDHS unit annual data review retreats. Review of SLO reports submitted through University system and review by college deans for use at ADV department meetings to improve programs.

Exhibit A.3.4 B Data literacy and research-driven decision making

Relationship to Standard or Component

Requirement: Application of data literacy is a focus of ADV programs because the candidates are pursuing a graduate license; developing and using research are required components of ADV programs and are assessed by evidence as indicated in the attachment titled A.1.1 Major Review ADV.

Explicit link of the intended data/evidence to the standard or component it is meant to inform

A description of the content and objective of the data/evidence collection

To address this standard component, a needs review of how each program meets data literacy and research-driven decision making was a first step; see the attachment titled A.1.1 Major Review ADV in this addendum. The next step is publishing the data so the unit can benefit. Our plan is to include the assessment data in the PEU system or D2L system for annual program review. Because ADV candidates are specialist degree candidates, they are involved in the practical application of action research that they develop on site. The site, the candidates, and the ADV programs then use the data results to provide evidence of data-driven decisions made at ADV sites.

Tags for data/evidence

A.3.4

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September 2020 Review annual data collection currently used for ADV programs (PDP in initial programs) or develop a tool for gathering data. October 2020 Review evidence of support for candidates not making adequate progress; post plans on D2L. November 2020 ADV coordinators/faculty will share remediation plans and respond to interview questions during the site visit. December 2020-January 2021 Collect data for candidates not making adequate progress; share in ADV department meetings for feedback. January-May 2021 Implement support plans, or continue plans already in use, and collect data. May 2021 Present data at the annual review retreat; review SLO submissions.

A description of the personnel, technology, and other resources available;

Graduate assistant use of Nuventive for SLO reports. Associate Dean and Assessment Committee members offer feedback and share INT program tools for development of an annual assessment tool. D2L learning management system for posting data.

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Since ADV programs have an established pattern of surveying graduates and partners, the readability, content, and written quality of the existing questions support established validity. A thorough review of the validity of any added questions will maintain the overall validity of the survey. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms.

Data Quality

Use of a proprietary tool already in place will inform reliability and validity. Piloting the tool and gathering feedback from randomly selected individuals will also inform reliability and validity. Training of scorers and checking of inter-rater agreement and reliability will be documented. Evaluation categories and assessment tasks are aligned with CAEP, InTASC, national/professional, and state standards.
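
Where inter-rater agreement is documented, percent agreement and Cohen's kappa are common summary statistics. The following is a minimal sketch only, not the EPP's adopted procedure; the rating labels and the two scorers' data are hypothetical placeholders.

# Minimal sketch of an inter-rater agreement check for two scorers rating the
# same candidates on a rubric. Ratings and labels below are hypothetical.
from collections import Counter

scorer_a = ["basic", "proficient", "proficient", "basic", "distinguished"]
scorer_b = ["basic", "proficient", "basic", "basic", "distinguished"]

n = len(scorer_a)
observed = sum(a == b for a, b in zip(scorer_a, scorer_b)) / n  # percent agreement

# Cohen's kappa corrects the observed agreement for agreement expected by chance.
counts_a, counts_b = Counter(scorer_a), Counter(scorer_b)
expected = sum(counts_a[label] * counts_b.get(label, 0) for label in counts_a) / (n * n)
kappa = (observed - expected) / (1 - expected)

print(f"Percent agreement: {observed:.2f}  Cohen's kappa: {kappa:.2f}")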

Steps that will be taken to attain a representative response

Purposeful sampling will be used. Candidates not performing well will be identified through advising sessions for ADV programs. Registered candidates will be acquired through the Banner and D2L systems.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual review of data in ADV program meetings and at EDHS unit annual data review retreats. Review of SLO reports submitted through University system and review by college deans for use at ADV department meetings to improve programs

Exhibit A.3.4 C Applications of technology

Relationship to Standard or Component

Requirement: Data to support that ADV programs effectively prepare teachers to integrate technology into their curriculum, planning, and teaching and to differentiate instruction.

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Multiple courses in the ADV programs integrate criteria related to technology in assignments and in assessments of candidate work but data across programs is not collected. Collection of data using technology from courses will sufficiently show that candidates in all advanced programs attain a high standard of specialized competency in this identified proficiency. Data collection from courses, beyond grades, will provide monitoring of students through transition points.

A description of the content and objective of the data/evidence collection

Collected data will provide a basis for review of candidate progress and areas for program improvement. Training of scorers and checking of inter-rater agreement and reliability will be documented. Evaluation categories and assessment tasks are aligned with CAEP, InTASC, national/professional, and state standards. Repeated use of the PEU data assessment system will provide consistency of results. Review of the collected data by the Assessment Committee will provide feedback on quality of content, comprehensiveness, and question quality. Review during the annual assessment retreat will inform reliability and validity.

Tags for data/evidence

A.3.4

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

ADV program courses that assess integrated technologies will add an assessment to the PEU assessment system or develop a data review mechanism. Included will be a plan for a candidate support system. ADV programs will design a matrix that includes point of assessment (transition point), evidence collected, and data. August 2020 Discuss data collection and align with coursework already in place that integrates technologies. September 2020 Complete the matrix or PEU assessment. October 2020 Share the matrix with the Assessment Committee.

A description of the personnel, technology, and other resources available;

ADV faculty to assess technologies in use and develop the matrix. D2L, Banner, and PEU assessment systems, with IT personnel as needed. Associate Dean of EDHS to make PEU adjustments. Assessment Committee members for review and feedback.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Coursework already in use has established reliability and validity through multiple iterations of the assessment. Inter-rater reliability has been established over long-term use of the assessment. ADV programs and courses will be chosen to pilot data collection and compare results with previous collections.

Steps that will be taken to attain a representative response

Purposeful sampling and attainment of a representative sample will be achieved through registration for courses, Banner, and D2L.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Review of data during the annual Assessment Committee data review. Programmatic review within departments annually. SLO annual report.

Exhibit A.3.4 D Applications of dispositions, laws, codes of ethics, and professional standards appropriate for the field of specialization Requirement: Demonstrated Professional Behaviors for ADV programs have not been assessed as part of the unit data review.

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

To address these standard components, we have developed a plan to gather the needed data from our program completers. ADV professional behaviors assessment will be included in the PEU system and be reviewed annually with all unit professional behaviors data. ADV professional behaviors data will indicate weaknesses in the professional behaviors demonstrated in the field.

A description of the content and objective of the data/evidence collection

ADV faculty and the Associate Dean will modify the current PEU assessment system to include an ADV professional behaviors assessment, or ADV faculty will develop a system to share current professional behaviors data collection with the unit through the Assessment Committee annual data review. Recognized programmatic improvements will be posted in the D2L Assessment Committee "Course."

Tags for data/evidence

A.3.4

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

August-September 2020 ADV faculty will develop an assessment with sub-scoring criteria aligned to InTASC and CAEP standards and elements, disaggregated by ADV program. October 2020 Final review of the assessment and loading into the PEU system. November 2020 Assessment available for review by the site visit team.
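
Disaggregating scores by program is a small computational step once the assessment data are exported; the following is a minimal sketch only, assuming a CSV export. The file name and column names (program, criterion, score) are hypothetical placeholders, not the PEU system's actual export format.

# Minimal sketch: summarize professional behaviors rubric scores by ADV program
# and criterion from a hypothetical CSV export (columns: program, criterion, score).
import csv
from collections import defaultdict

scores = defaultdict(list)
with open("adv_professional_behaviors.csv", newline="") as f:
    for row in csv.DictReader(f):
        scores[(row["program"], row["criterion"])].append(float(row["score"]))

for (program, criterion), values in sorted(scores.items()):
    mean = sum(values) / len(values)
    print(f"{program} | {criterion}: n={len(values)}  mean={mean:.2f}")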

A description of the personnel, technology, and other resources available;

PEU system (Banner). Associate Dean for PEU modification. ADV program faculty.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Because ADV programs have an established pattern of surveying graduates and partners, the readability, content, and written quality of existing questions support established validity. A thorough review of the validity of any added questions will maintain the overall validity of the survey. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms.

Steps that will be taken to attain a representative response

Consistent review of PEU assessments by Assessment Committee by semester. Annual sharing of assessment data in ADV program meetings. Alumni services can provide list of graduates’ emails.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual review of data at programmatic meetings, SLO report sharing, and ADV programs' participation at annual unit data retreats.

Exhibit A.4.1 A Data literacy and research-driven decision making Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

We plan to determine advanced completers' performance in years 1-3 post-exit by completing a common observation scoring rubric to judge case study data. Pre- and post-assessments will be included in the case studies.

A description of the content and objective of the data/evidence collection

Using a scoring rubric, like those developed by the Network for Educator Effectiveness, we plan to meet the elements of this standard through annual data collection and review. The Assessment Committee will contact school districts to ask about sharing evaluation data on completers in years 1-3. PVAAS data will be reviewed and used in conjunction with other measures of novice teachers' impact on P-12 students.

Tags for data/evidence

A. 4.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September 2020 A common observation rubric will be developed by the Assessment Committee; the rubric will be distributed to selected school partner faculty for review and feedback. October 2020 The Assessment Committee will finalize the rubric, case study, or teacher work sample as the way to collect data on novice teachers; Affiliation Agreement edits will be reviewed. November 2020 Rubric and focused plan for data collection shared with the site review team. December 2020 Collect one sample for one semester as a pilot. January 2021 Formalize affiliation agreements for collection of data. February-May 2021 Collect case study and observation rubric evaluation data and review; modify tools as needed. August 2021-May 2022 Implement the plan and collect pre- and post-data; review at the unit data retreat.

A description of the personnel, technology, and other resources available;

Assessment Committee and faculty. School district principals and MU graduates. Qualtrics survey or MU technologies for data collection; D2L. State PVAAS system.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Assessment Committee members and randomly chosen school principals will review completed surveys and provide feedback on the surveys' readability, content, question quality, and comprehensiveness. This step supports survey validity. Because ADV programs have an established pattern of surveying graduates and partners, the readability, content, and written quality of existing questions support established validity. A thorough review of the validity of any added questions will maintain the overall validity of the survey. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms.

Steps that will be taken to attain a representative response

Convenience sample of selected first- through third-year teachers from various teaching fields. Alumni services can provide a list of graduates' emails. Employment information supplied by the state department will be utilized.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Annual review of data at data review retreats for plans for program improvement. Develop Pre and Post data collection process.

Exhibit A.4.1 B Culminating experiences in which candidates demonstrate their proficiencies, through problem-based tasks or research (e.g., qualitative, quantitative, mixed methods, action) that are characteristic of their professional specialization Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

In order to meet CAEP Advanced standard A.4.1, the College of Education's ADV programs plan to utilize multiple measures to determine employers' satisfaction with completers' preparation and whether completers reach employment milestones such as promotion and retention. This will be achieved through separate measures such as employer satisfaction surveys as well as an EPP-created database that tracks the employment, retention, and promotion of completers.

A description of the content and objective of the data/evidence collection

Currently the ADV programs, School Nurse, School Psychology, Counselor, Reading Specialist, Principal, and Supervisor, launch program surveys for graduates. Many of the candidates in our ADV programs have reached three years' experience in the public schools and are seeking specialist degrees. In the case of School Nurse, the initial certification offered is for school nurse and includes PDE-required coursework. In the Reading Specialist program, our online opportunities and requirements are offered for candidates' professional specialization beyond their initial teaching license. Please see the attachment titled ADV Program Support for Grads for evidence of support for graduates disaggregated by program. We plan to update the quality of our plans by applying CAEP minimum standards to our current surveys.

Tags for data/evidence

A.4.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September-October 2020 Review and update surveys to meet CAEP minimum standards. November 2020 Share progress on updated surveys with the site team. December 2020 Complete the process of survey updates; discuss expansion to reach more employers; utilize the university alumni and job placement survey data; reach out to the Office of Institutional Research and Planning to devise a plan to assist in ADV program response to the annual survey. April-June 2021 Launch the ADV department survey and gather data from a random sample. July 2021 Launch the survey to all ADV graduates.

A description of the personnel, technology, and other resources available;

Office of Institutional Research and Planning staff for ideas to gather more survey data through increased response. Graduate assistants to review surveys. ADV faculty and Associate Dean of EDHS to edit for CAEP sufficiency.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

The CAEP level 3 assessment rubric will be used to modify current surveys. A close match to the CAEP rubric, and inclusion of the rubric elements on the survey, will assure survey respondents that the accreditation agency's expectations informed survey development. A random sample of graduates will complete a pilot survey to assure readability, content, question quality, and comprehensiveness. This step ensures survey validity. Proficiency level attributes are defined in actionable, performance-based, or observable behavior terms.

Steps that will be taken to attain a representative response

Representative sample will be acquired through active advising lists supplied by Banner. Alumni services can provide list of graduates’ emails.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.

Exhibit A.4.1 C Application of technology Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Graduates with 1-3 years of experience will be asked to participate in completing case studies and evaluation rubrics. Lists of employed graduates will be gathered from state reporting of employment data so employers can be contacted. Additionally, the rubric used for evaluation will be shared with potential case study participants. Graduates demonstrating weaknesses on the evaluation rubric will be offered online professional development through EDHS.

A description of the content and objective of the data/evidence collection

This standard element is addressed through online professional development using the graduate's classroom environment as the curriculum for practicing the professional development skills taught in the online offering. The Assessment Committee will develop a survey to assess the success of the online professional development offered to graduates.

Tags for data/evidence

A. 4.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September 2020 A common observation rubric will be developed by the Assessment Committee; the rubric will be distributed to selected school partner faculty for review and feedback. October 2020 The Assessment Committee will finalize the rubric, case study, or teacher work sample as the way to collect data on novice teachers; Affiliation Agreement edits will be reviewed; develop a survey for input from participants in case studies. November 2020 Rubric and focused plan for data collection shared with the site review team. December 2020 Collect one sample for one semester as a pilot. January 2021 Formalize affiliation agreements for collection of data. February-May 2021 Collect case study and observation rubric evaluation data and review; modify tools as needed. August 2021-May 2022 Implement the plan and collect pre- and post-data; review at the unit data retreat.

A description of the personnel, technology, and other resources available;

Qualtrics (or other survey tool). Assessment Committee members for feedback. Graduate assistants and student workers. Associate Dean's office. PEU assessment system.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Randomly selected individuals will be asked to review the two surveys and provide feedback on the surveys’ readability, content, question quality, and comprehensiveness. The CAEP rubric will be used to develop or edit the planned survey.

Steps that will be taken to attain a representative response

A representative sample will be taken from registration through the Banner system. Advisor lists with active ADV candidates will be used for cross-checking the representative sample. Alumni services can provide a list of graduates' emails.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.

Exhibit A.4.1 D Use of data analysis/evidence to develop supportive school environments. Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Novice teachers in years 1-3 from MU will be assessed with a Common Observation Rubric (COBS) evaluation of studies conducted in selected partner schools. The Program Impact Studies will be included in the Affiliation Agreements (AA) signed by superintendents to establish partnerships between MU and local education agencies (LEAs). Rubric results will be shared with school partners for program improvement and for feedback about assessment criteria for candidates before graduation.

A description of the content and objective of the data/evidence collection

To address this standard component, data collection in the form of action research, a portfolio collection (similar to a teacher work sample), a school plan, or case studies, together with an observation rubric to evaluate the case study, will be collected from selected program candidates.

Tags for data/evidence

A.4.1

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September-November 2020 Develop a collaborative evaluation form for use by all ADV programs. Use of remediation plans will assist in identifying potential case study participants. Utilize site supervisors to select case study candidates and procedures. Potentially a teacher work sample, school goals and vision (Principal and Supervisor), or an action research plan may prove to be more beneficial to ADV candidates than a case study or teacher work sample. November 2020 Share with the site visit team progress in developing an observation tool and the selection of potential 'case study' participants for the pilot.

Exhibit A.4.2 A Monitoring of either employment outlooks for advanced program areas or progress for candidate recruitment into program areas. Data will be collected through the EDHS-launched employer survey disaggregated by ADV and INT programs. Additionally, the annually launched alumni and job satisfaction survey data are available to programs. Requirement:

Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

The initial graduate studies included in A.4.2 C of this plan use a convenience sample of selected first- through third-year teachers from various teaching fields. The planned studies can consist of a case study, action research project, school plans (visions, goals), or portfolios, according to the needs of the ADV candidates' job requirements. For example, in School Psychology, case studies are useful tools to assess the success of our graduates and at the same time are useful for the place of employment, be it a school or a clinical setting other than a school.

A description of the content and objective of the data/evidence collection

The plan for ADV programs is to develop a collaborative rubric that can evaluate the study completed by the ADV program candidate and approved by the site supervisor. The remediation plan presently completed by site supervisors and faculty could be a springboard for assessing graduates' employment milestones. We plan to collect PVAAS data from the state department if it is provided for ADV programs; at present, PVAAS data are collected on P-12 students and not on graduates of ADV programs. Our plan for ADV programs such as Principal/Supervisor must include input from the district-level administrators who evaluate new administrators. Our plan will depend on a collaborative effort between partners and ADV program faculty. District and state data will be mined for outlooks on potential jobs, to guide recruitment efforts or to identify districts experiencing shortages of ADV program graduates, e.g., counselors.

Tags for A.4.2

December 2020 Train reviewers for the 'case study' data collection tool. February 2021 Begin the process of data collection and review for the pilot. June 2021 Launch the survey to all ADV candidates not participating in the pilot; follow up on the University-launched alumni survey.

A description of the personnel, technology, and other resources available;

Graduate assistants/student workers to follow up via phone calls, emails, and social media after survey launch for a better response return. Banner system to get a representative sample. ADV faculty for review and collaborative efforts to refine graduates' support. Site supervisor input in development of the rubric, survey, and 'case study'. Review for program improvement.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Randomly selected individuals, or the Assessment Committee members, will be asked to review the two surveys and provide feedback on the surveys’ readability, content, question quality, and comprehensiveness. The CAEP rubric will be used to develop or edit the planned survey.

Steps that will be taken to attain a representative response

Banner system will supply active candidates registered in program. Alumni will supply emails for graduates.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

October-November 2020 Review of employment data and verification of employment site percentages matching clinical placements. November 2020 Share alumni data and job satisfaction data with the site review team. December 2020-April 2021 Modify the ADV program survey and alumni survey; launch the graduate survey. May-June 2021 Collect and review data; share with the unit and department.

A description of the personnel, technology, and other resources available;

Graduate assistants/student workers to follow up via phone calls, emails, and social media after survey launch for a better response return. Banner system to get a representative sample. ADV faculty for review and collaborative efforts to refine graduates' support. Site supervisor input in development of the rubric, survey, and 'case study'. Review for program improvement.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Randomly selected individuals will be asked to review the two surveys and provide feedback on the surveys’ readability, content, question quality, and comprehensiveness. The CAEP rubric will be used to develop or edit the planned survey.

Steps that will be taken to attain a representative response

Banner system will supply active candidates registered in program. Alumni will supply emails for graduates.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.

Exhibit A.4.2 B Evidence that diversity of admitted and completing candidates is given explicit attention across advanced programs Requirement: Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Data from ADV program graduates' surveys currently provide evidence that candidates are satisfied with their preparation. The survey data are not consistent for all programs, and we plan to collaborate to assure all ADV program graduates receive a survey and follow-up calls and communications. The studies conducted to monitor our novice graduates will also be utilized to monitor graduates' satisfaction with their skills as they begin their employment. The collection of data from a case study, action research project, school plan (visions, goals), or portfolios, according to the needs of the ADV candidates' job requirements, will provide an opportunity to evaluate the skills needed to complete such research. Collection of state and district data will increase our knowledge of the satisfaction of our graduates concerning their preparation for their employment.

A description of the content and objective of the data/evidence collection

To meet this standard element, we will need data focused on the satisfaction of our graduates. The annual alumni and job placement survey launched by the University will provide some data, but our plan is to increase responses to that survey for our ADV programs. We also plan to continue launching ADV program surveys. Also needed to meet the elements of this standard are the diversity percentages of ADV program graduates. ADV programs include a wide variety of clinical sites, many self-chosen by candidates. The clinical sites of graduates from ADV programs will need to be compared to where those graduates are ultimately employed. The current belief of the ADV programs is that a great majority of the graduates obtain jobs in the same place as the clinical placement, but this will have to be confirmed through data collection.
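
Confirming that belief is largely a matter of cross-referencing placement and employment records; the following is a minimal sketch only. The file and column names (placements.csv, employment.csv, candidate_id, clinical_site, employer) are hypothetical placeholders rather than actual EPP data sources.

# Minimal sketch: estimate how often graduates are employed at their clinical
# placement site, using two hypothetical CSV exports.
import csv

def load(path, key, value):
    # Build a simple candidate_id -> site/employer lookup from a CSV export.
    with open(path, newline="") as f:
        return {row[key]: row[value] for row in csv.DictReader(f)}

placements = load("placements.csv", "candidate_id", "clinical_site")
employment = load("employment.csv", "candidate_id", "employer")

tracked = [cid for cid in placements if cid in employment]
matched = [cid for cid in tracked if employment[cid] == placements[cid]]

if tracked:
    rate = len(matched) / len(tracked)
    print(f"{len(matched)} of {len(tracked)} tracked graduates ({rate:.0%}) "
          f"are employed at their clinical placement site")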

Tags for data/evidence

A.4.2

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

October-November 2020 Review of employment data and verification of employment site percentages matching clinical placements. November 2020 Share alumni data and job satisfaction data with the site review team. December 2020-April 2021 Modify the ADV program survey and alumni survey; launch the graduate survey. May-June 2021 Collect and review data; share with the unit and department.

A description of the personnel, technology, and other resources available;

Graduate assistants/student workers to follow up via phone calls, emails, and social media after survey launch for a better response return. Banner system to get a representative sample. ADV faculty for review and collaborative efforts to refine graduates' support. Site supervisor input in development of the rubric, survey, and 'case study'. Review for program improvement.

Data Quality Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Assessment Committee members, ADV program faculty colleagues or randomly selected individuals will be asked to review the two surveys and provide feedback on the surveys’ readability, content, question quality, and comprehensiveness. The CAEP rubric will be used to develop or edit the planned survey.

Steps that will be taken to attain a representative response

Banner system will supply active candidates registered in program. Alumni will supply emails for graduates.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.

Exhibit A.4.2 C Data literacy and research-driven decision making by graduates on the job

Requirement: Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Our plan for collecting data on our graduates' use of data literacy and research-driven decision making will coordinate with the monitoring of our ADV program graduates in years 1-3. As they complete the planned ADV graduate studies, a case study, action research project, school plan (visions, goals), or portfolio, ADV graduates will be demonstrating their knowledge of data literacy and research-driven decision making. The planned collection of this type of after-graduation monitoring has the added benefit of providing a way for employers to monitor ADV graduates' ability to make school/program changes as novice employees.

A description of the content and objective of the data/evidence collection

As seen in attachment ADV Program Support of Graduates, some monitoring of data literacy and research-driven decision making is in progress. Our plan is to increase the consistency of the assessment of our graduates and develop a storage area to make more consistent use of the data for programmatic improvements.

Tags for data/evidence

A. 4.2 The provider demonstrates that advanced program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September-October 2020 Review current survey data and questions from ADV program surveys and university-launched surveys; see attachment ADV Programs Survey and Analysis. Early November 2020 Share with the site team what the review of current practices provided in the way of data. Late November-December 2020 Edit survey data to meet CAEP sufficiency requirements and revise surveys as needed.

A description of the personnel, technology, and other resources available;

Qualtrics for current survey data completed by some ADV programs. Office of Institutional Research and Planning for annual survey data and possible editing of the ADV program survey. ADV program coordinators and Associate Dean of EDHS to review, edit, and assess CAEP sufficiency.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Assessment Committee members, ADV program faculty colleagues or randomly selected individuals will be asked to review the two surveys and provide feedback on the surveys’ readability, content, question quality, and comprehensiveness. The CAEP rubric will be used to develop or edit the planned survey.

Steps that will be taken to attain a representative response

Banner system will supply active candidates registered in program. Alumni will supply emails for graduates.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.

Exhibit A.4.2 Completer applications of dispositions, laws, codes of ethics, and professional standards appropriate for the field of specialization, demonstrating their perception of their preparation as relevant and effective for obtaining and retaining employment.

Requirement: Relationship to Standard or Component

Explicit link of the intended data/evidence to the standard or component it is meant to inform

Program completers perceive their preparation as relevant and effective for their employment. This component relates to the collection of assessments of ADV program graduates in years 1-3 of employment through a case study, action research project, school plan (visions, goals), or portfolio, in which ADV graduates demonstrate their knowledge of data literacy and research-driven decision making. These factors are the foundation for successful employment.

A description of the content and objective of the data/evidence collection

To meet the elements of this standard, the data collected through a case study, action research project, school plan (visions, goals), or portfolio, in which ADV graduates demonstrate their knowledge of data literacy and research-driven decision making, will describe the impact of ADV program candidates on the population they serve; depending on the ADV program, this may not be a P-12 school setting. Our plan for data collection for A.4.4 is to include data collected at the state and school district levels. The annual alumni survey launched by the Office of Institutional Research and Planning includes data on satisfaction with ADV program preparation, but only in quantitative form. A short-answer qualitative addition to the survey would provide more input for programmatic improvement of ADV program preparation. The remediation plan developed during internship and practicum could be utilized as a baseline to track the growth of graduates as they gain more experience as employed educators. The plan can be updated with skills developed after graduation. Discussions among ADV faculty will determine the feasibility of interviews, via Zoom or other technology, to gather graduates' feedback regarding their job preparation.

Tags for data/evidence

A.4.2 The provider demonstrates that advanced program completers perceive their preparation as relevant to the responsibilities they confront on the job, and that the preparation was effective.

Timeline and Resources

Strategies, steps, and a schedule for collection through full implementation, and indication of what is to be available by the time the site visit

September-October 2020 Review current survey data and questions from ADV program surveys and university-launched surveys; see attachment ADV Programs Survey and Analysis. Early November 2020 Share with the site team what the review of current practices provided in the way of data. Late November-December 2020 Edit survey data to meet CAEP sufficiency requirements and revise surveys as needed.

A description of the personnel, technology, and other resources available;

Qualtrics for current survey data completed by some ADV programs. Office of Institutional Research and Planning for annual survey data and possible editing of the ADV program survey. ADV program coordinators and Associate Dean of EDHS to review, edit, and assess CAEP sufficiency.

Data Quality

Description of procedures to ensure that surveys and assessments reach level 3 or above on the CAEP assessment rubric;

Assessment Committee members, ADV program faculty colleagues or randomly selected individuals will be asked to review the two surveys and provide feedback on the surveys’ readability, content, question quality, and comprehensiveness. The CAEP rubric will be used to develop or edit the planned survey. Development of questions will be founded on

Steps that will be taken to attain a representative response

Banner system will supply active candidates registered in program. Alumni will supply emails for graduates.

Steps to analyze and interpret the findings and make use of them for continuous improvement.

Survey results will be shared annually with site supervisors, unit data retreats and program meetings. Program modifications through partner input will be advertised to new candidates and graduates through web pages.