
Standard 2 Narrative

Standard 2: Assessment System and Unit Evaluation

The Unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.

2.1 Assessment System and Unit Evaluation

How does the unit use its assessment system to improve candidate performance, program quality, and unit operations?

In 2008, units at FSU collaboratively identified an electronic Assessment System (AS) to guide data collection and analysis for improving program quality. The EPP designed an Assessment Plan 2.4.a, using the CF themes, to guide the assessment process. Taskstream has been implemented to support the AS through the submission and evaluation of assessments at all Transition Points (TPs) for all programs. Taskstream's Learning Achievement Tools (LAT) platform allows programs and departments to post, collect, assess, and monitor candidate performance across transition points. The platform enables the EPP to export data results and evaluate outcomes and key evidences, which are reviewed using standards-based rubrics created within LAT 2.4.a5. Rubrics and disposition instruments are aligned to the CF, which is aligned to state standards 2.4.za.

Programs were revised and submitted to the DPI for review in 2009. With program approval, the LAT workspace was developed to reflect the EPP's CF I.5.c and four TPs. Key evidences for initial programs include a unit plan, a content project, a case study, a leadership project, the Certification of Teaching Capacity (CTC) form, and a portfolio. At the advanced level, key evidences include the leadership and collaboration project, the action research paper, and the dissertation process. The AS and the six capstone evidences were piloted in fall 2010, with changes made using the pilot data.

The Assessment Committee (AC) monitors the assessment process. During an AC meeting in summer 2012, faculty groups (including representatives from CAS) engaged in a face validity exercise of required evidences 2.4.a4. This process was instituted to address possible bias in the assessment practice. External partners serve as second evaluators of key evidences to strengthen inter-rater reliability 2.4.c. At the initial level, the CTC form is scored by the university supervisor, cooperating teacher, school principal, and candidate, a measure of external validity. Since 2012, the AC has continued to review findings, assessment processes, and unit operations. The assessment process is guided by a strategically designed feedback loop 2.4.d that demonstrates the annual collection, analysis, sharing, and use of data to support program changes. Data reviewed include Student Learning Outcomes (SLOs) results 2.4.zb; TP I, II, III, and IV assessment results 2.4.a3; and candidates' success in meeting the 22 PTS competencies 2.4.zc, as reflected in sample evidences. Recommendations on unit operations begin with the faculty, are presented and vetted at different levels, and are endorsed by the TEC. Changes are monitored 2.4.g.

All programs have admission and program completion criteria 2.4.b and use data from key assessments that are comprehensive and fully integrated within the curriculum as reflected by TPs in the Assessment Plan 2.4.h. Key assessments are formative and summative and are evaluated using rubrics posted to Taskstream. Candidates receive clear directions, guides, and training on each assessment through manuals, syllabi, and assessment-specific directions. Candidates who do not successfully progress through the TPs receive a Corrective Action Plan (CAP) 2.4.i that outlines a prescriptive course of action, with support and resources, to improve performance.

The EPP maps courses in the curriculum to professional standards, identifying where candidates are introduced to "emerging" and "developing" level material and where they are assessed for "proficient" level knowledge and skills 2.4.j. Syllabi state the timing and structure of key assessments. This information is communicated to pre-candidates and candidates during instructional time, Majors' Meetings 2.4.k, and Dean's Forums 2.4.l each semester.

The Early Disposition Inventory has six measures and is used during the early field experiences required in EDUC 211/SPED 480 2.4.m. The SOE Disposition Inventory (DI) is aligned with the seven themes of the CF and contains multiple, specific dispositional indicators. The SOE DI is administered at TPs I, II, and III: at TP I as a candidate self-assessment; at TP II by the team conducting the admission to student teaching interview (initial) or advancement to candidacy (advanced); and at TP III by the team conducting the exit portfolio evaluation (initial and MSA), the team evaluating the Product of Learning (M.Ed.), or the Dissertation Committee (Ed.D.).

The EPP uses surveys as a source of data to document program effectiveness and to support program changes. Surveys are administered electronically using Qualtrics. Pre-candidates and candidates complete a Satisfaction Survey to provide feedback on the efficiency and effectiveness of advisement and instructional processes 2.4.n. At the conclusion of clinical practice, candidates complete a Student Teacher Exit Survey on Internship Placement 2.4.o. Each semester, FSU administers a Graduating Senior Survey to all initial completers, and data are provided to units to drive discussion and program changes 2.4.p. Since DPI no longer disseminates a survey to graduates, employers, or mentors, the EPP developed its own Alumni "Job Readiness" Survey 2.4.q, administered in the spring of each academic year effective 2012. The EPP also administers an Employer Survey to assess the job readiness of all graduates and completers 2.4.r. Surveys seek data on graduates and completers through three years in the field. Data from the EPP's alumni and employer surveys on satisfaction and preparedness attributable to preparation at FSU, professional behaviors and dispositions, and indicators related to the CF are distributed regularly to stakeholders. The data are also made available to pre-candidates and candidates, with results posted to the SOE webpage. UNC General Administration (UNCGA) developed a recent graduate survey aligned to the NC Professional Teaching Standards (PTS) 2.4.s, which was field tested in spring 2014 with results reported to each institution. This survey will be administered annually.

Beginning in 2010 and biennially since, UNCGA has collaborated with the Carolina Institute for Public Policy to collect teacher effectiveness/value-added data on recent completers from the NC institutions. The EPP shares these data with all involved in teacher preparation, including the FSU Board of Trustees, and posts reports on the SOE website at the Program Quality Indicators link. For example, the 2012 teacher effectiveness report revealed that secondary education social studies completers were not effective in the classroom. The EPP contacted these completers, asked them to identify areas of need, and invited them to the 2013 Excellence in Teaching Conference (EITC) for an all-day workshop to address the expressed needs. UNCGA also collects effectiveness data on MSA completers; however, a low response rate did not provide usable information in 2012. UNCGA is putting new processes in place to collect valid data on the effectiveness of MSA graduates.

Teachers' effectiveness in implementing the course of study in the classroom is monitored through the North Carolina Educator Evaluation System, which is aligned to the PTS. Beginning in spring 2012, the EPP has received teacher effectiveness/value-added data from DPI that detail the success of graduates with up to three years of experience, as measured on PTS 1-5 by the principal or assistant principal, with "proficient" as the state-required minimum rating. Standard 6 measures a teacher's impact on student growth 2.4.t. Spring 2013 results show that 87% of recent completers had a positive impact on student growth, up from 80% reported in the inaugural year. The 2013 state average on student growth was 75%, down from 80%. The data validate the positive impact of EPP completers on the children they teach. Since employer survey responses on teacher effectiveness have been limited, the EPP treats the standards 1-5 LEA evaluations of beginning teachers with 1-3 years in the classroom as evidence of employer feedback on teachers' readiness. Results are used to enhance program activities. For example, the 2012 standard 5 data showed completers were not reflecting on pedagogy. The EPP developed prompts to encourage reflection 2.4.u and documented a slight rating increase from 90% (the lowest rating on all standards that year) to 92% in 2013. The 2013 standards 1-5 data show that 89% and 88% of completers earned at least a proficient rating on standards 3 (content) and 4 (pedagogy), respectively. The EPP invited recent completers, current methods candidates, and student teachers to the 2014 EITC to participate in professional development sessions designed to address content and pedagogy, with presentations from partner LEAs, SOE, and CAS faculty. The EPP will continue to monitor ratings to support programs and candidates.

The EPP hosts a Majors' Meeting and a Dean's Forum each semester to share data and allow candidates and pre-candidates to voice concerns or suggestions. Additionally, the EPP adheres to FSU's required use of the Student Complaint Form 2.4.e; 2.4.f, posted on multiple FSU websites (department, SOE, Academic Affairs, and Student Affairs). Grade appeals are an institutionalized process with guidelines posted in catalogs and shared with pre-candidates and candidates. The appeal process for advanced candidates is detailed in the Graduate Admissions Manual 2.4.v, which is implemented by a Graduate Admissions Committee 2.4.w that submits recommendations to the SOE Dean for approval and submission to The Graduate School (GS). The GS Appeals Committee 2.4.x, of which the Dean is a member, reviews advanced candidates' appeals from the CAS, SBE, or SOE. The GS communicates the appeal decision to candidates.

2.2.b: Continuous Improvement

• Summarize activities and changes based on data that have led to continuous improvement of candidate performance and program quality

• Discuss plans for sustaining and enhancing performance through continuous improvement as articulated in this standard

Since the NCATE visit in 2007, the SOE has engaged in prescriptive activities informed by the assessment process and has made several systematic, data-based changes that have led to continuous improvement in candidate performance and overall quality in the EPP. The following is a summarized list of the data-informed changes in the EPP, followed by the EPP's plan for sustaining and enhancing performance through continuous improvement in this standard.

• In 2008, the SOE modified its assessment system based on feedback on the effectiveness of data collection, analysis, and sharing. Adjustments included adding a standing Assessment Committee (AC), securing an assessment coordinator, and purchasing a robust data management system to systematize the identification of key artifacts for initial, advanced, and school professional programs, as well as refining TPs to collect and organize trend data. The AC meets monthly.

• The annual Assessment Retreat, established in 2012, supports continuous improvement efforts: the AC reviews data and data collection efforts and analyzes program quality. Membership varies, and participants make recommendations based on findings related to candidate Knowledge, Skills, and Dispositions (KSD). During the retreat, administrators, faculty, and other internal and external stakeholders review the SOE assessment system processes, data collection, and findings, and make recommendations to improve the assessment system and the EPP's evaluation policies and procedures.

• In 2009, the SOE revised programs and assessments using DPI guidelines to promote program quality and assurance. Initial licensure and MSA programs were revised, aligned with the new professional teaching and school executive standards, and piloted in fall 2010. Common assessment evidences with uniform rubrics are in place to afford reliable data. The SOE Disposition Inventory (DI) is a uniform instrument developed to assess candidate dispositions at multiple transition points. The SOE DI was approved in 2011 and provides data for analyses across program levels. During the pilot year, a default rating was placed in Taskstream, and faculty made changes only for serious deficits or outstanding dispositions. Scoring measures were applied at each transition point in fall 2012.

• Feedback on pre-candidates' dispositions during early field experience, from LEA partners and from faculty, led to refinement of the requirements for admission to initial licensure programs. Since 2012, the EPP has required an admission interview for TE and ST, which is evaluated using a uniform rubric and includes questions on the CF.

• DPI provided completer and employer satisfaction survey results to the EPP until 2010. Recognizing the need for data on completer and employer satisfaction to determine effectiveness, the EPP designed its own satisfaction surveys, which were administered in 2012 to completers with less than three years' experience and to employers of those completers. Feedback from employers was limited, so discussions focused on the best time in the academic calendar, and the best process, for submitting the request in order to generate a better response rate. The 2014 surveys were administered in an individualized manner to principals, with a cover letter naming the teacher on whom feedback was being requested. Responses increased from 11 in 2012 and 8 in 2013 to 54 in 2014.

• Not all candidates progress smoothly through each TP or assessment. Consequently, in 2009 the EPP constructed a Corrective Action Plan (CAP) that outlines steps toward proficiency to provide guidance to each such candidate.

• Effective fall 2012, the EPP instituted a Criminal Background Check (CBC) to be utilized prior to completion of experiential learning. Methods course experiential learning requires a full background check completed through the FSU Office of Legal Affairs. Results for candidates with criminal concerns are reported to the Office of Teacher Education (OTE). In preparation for student teaching, the CBC is completed through the HR office of the county where the capstone experience is assigned. Initial field experience requires a self-reporting process using a Self-Disclosure Form; acknowledging a criminal offense requires a full background check. The CBC provides insight into activities that could prove unacceptable for educators.

• To address pre-candidates with GPAs below 2.5, the EPP launched a Teacher Education Admissions Appeals Committee (TEAAC), a sub-committee of the TEC. A policy statement placed in the fall 2013 catalog indicates that pre-candidates with a GPA below 2.5 for a semester are placed on academic probation. After two consecutive semesters with a GPA below 2.5, a pre-candidate is placed on academic suspension from the SOE and must complete an appeal to continue enrollment in education core courses; alternatively, the pre-candidate may change to a major that accepts a 2.00 GPA. Suspensions take effect in fall 2014.

• The 2011-2012 NSSE Survey results indicated that candidates were not receiving prompt feedback from faculty. The EPP utilized the option in Taskstream for candidates to submit a draft for review and feedback, and faculty members are expected to provide feedback within two weeks, as stated in syllabi. The NSSE measure of receiving prompt feedback from faculty rose to 3.74 in 2013 from 2.95 in 2012. The EPP requires draft submissions of assessments so that feedback on appropriate completion can maximize candidates' success.

• To ensure that candidates understand expectations and that assessment procedures match the requirements, directions for the evidences have been revised since the 2010-2011 piloting of assessments, especially for the elementary education program.

• In 2013, the faculty re-examined the MSA program's vision-for-learning competency to provide greater opportunity for application as an administrator. As a result, candidates participate in discussion and self-reflection assignments to aid them in developing a vision for learning.

• During the MSA revision process, the faculty used partnership feedback and comments to align activities and program SLOs in order to provide candidates with skills and experiences deemed important by LEAs. In the MSA program, candidates now have more opportunities to use simulated data sets provided through state-mandated data systems.

Additional programmatic changes made since the 2007 accreditation visit include:

• In 2010, the Health Education and Physical Education programs were combined to form the Health/Physical Education Program so that one licensed staff member can serve dual roles and meet the needs of LEAs.

• In fall 2011, the Department of Health and Physical Education was merged with the Department of Middle Grades, Secondary, and Special Education to form the Department of Middle Grades, Secondary, and Specialized Subjects (MSSS).

Low enrollment resulted in the discontinuation of the following programs I.5.j: BS in Social Sciences (Secondary Education) (fall 2011); BS in Marketing Education (fall 2011); M.Ed. in English Education (fall 2011); BS in Spanish Education (fall 2011); and BS degrees in mathematics education, science education, and English education (fall 2012). A BS degree program plan in Secondary Education in mathematics, science, and history/social studies will be submitted through the approval process and will be housed in the SOE. A BA in English with an Education/Licensure Concentration and a Spanish LO program were approved in spring 2014.

Leadership changes include the approval to establish and hire a Director of Teacher Education Recruitment and Advisement to lead the SOE Academic Advisement and Retention Center (SOEAARC) (2008); the approval to establish, and the hiring of, a Coordinator of USTEP/PDS (2009); and the securing of an Associate Dean position to replace the Assistant Dean position (2011), with an Associate Dean hired in August 2012. The EPP hired the Wells Fargo Endowed Chair of Education (2011) to strengthen faculty and candidate scholarship and to serve as a liaison to schools and community agencies, and established the SOE Dean's Advisory Board (2009) with two supporting committees (Fundraising and Curriculum & Academics). The Fundraising Committee has hosted three consecutive scholarship fundraising banquets. The EPP has hosted the annual spring Excellence in Teaching Conference since 2009 to provide professional development to pre-service and in-service teachers, and revised the SOE Mission and Vision Statements and EPP strategic priorities and goals (revised and approved in 2010 and revisited in 2013).

The Operational Plan and Assessment Record (OPAR) process is reviewed and refined annually, led by the Provost and Vice Chancellor (VC) for Academic Affairs and the institution's Associate VC for Accreditation. The OPAR provides evidence of the EPP's continuous improvement in candidate learning and performance and provides an overall view of program and EPP quality and effectiveness. Departments and the EPP participate in a holistic evaluation of each OPAR using a campus-wide rubric 2.4.y. The 2013-2014 academic year was the second year of this evaluation process, and each department and the EPP have reflected continuous improvement in all measures of the rubric. In spring 2014, prior to the institution's review, the EPP hosted its own OPAR review, using the same rubric, to provide formative assessment.

2.3 Areas for Improvement Cited in the Action Report from the Previous Accreditation Review

Summarize activities, processes, and outcomes in addressing each of the AFIs cited for the initial and/or advanced program levels under this standard.

During the 2007 continued accreditation visit, the EPP was identified as having a Standard 2 Area for Improvement (AFI). Specifically, the AFI indicates that "the unit does not systematically collect, analyze, or use assessment data for improving candidate performance for the program in the Ed.D. for Educational Leadership." Strategies to address the AFI are detailed annually in the NCATE Part C Reports 2.4.z. In 2008, the faculty in the Department of Educational Leadership launched a Doctoral Program Advisory Board to secure feedback from constituents while building partnerships with educational agencies, higher education colleagues, and doctoral candidates 2.4.aa. Faculty members also implemented a doctoral graduate survey through which they receive feedback from recent graduates on program areas that could be strengthened and prove beneficial for enrollees and future candidates. Results led faculty to revisit program learning outcomes, which are now monitored annually through the OPAR process 2.4.ab-3.

To support candidates and guide them through the terminal degree, the dissertation chairs meet each semester to support one another and to identify common areas of struggle among their candidates 2.4.ac. Discussions resulted in an awareness that some courses were not aligned to the EPP's CF. All syllabi are now aligned to program SLOs as well as to CF, professional, and state standards 2.4.ad. The Dissertation Handbook 2.4.ae was also revised, the department now sponsors dissertation writing seminars 2.4.af, and the Doctoral Student Association was established 2.4.ag.

In 2009, the faculty continued to revisit program goals and data to verify program quality while continuing to guide candidates to meet program expectations consistently 2.4.ah. Faculty reviewed program activities to align them with 21st-century leadership learning and initiatives, based on current trends and practices. Discussion with LEA partners resulted in revised internship objectives 2.4.ai so that internship experiences are closely aligned to candidates' employment goals. During this period of review, the faculty revised the dissertation matriculation rubric and developed dissertation proposal and defense rubrics. With the assistance of the Assessment Coordinator, the faculty developed a Data Collection Form to monitor the collection of data by transition points and to provide easy, consistent sharing of information with candidates and faculty for continuity and clarity. Program requirements, as detailed on the Data Collection Form, were posted to Taskstream to facilitate the submission and evaluation of the data across all TPs, particularly TPs I, II, and III.

In 2010, during an annual review of Taskstream data, the faculty determined there was a need to enhance the alignment between the data provided and the data needed to create improvement in the program. Analysis revealed that many assignments on the Data Collection Form were relevant for formative program assessment but were not capstone assessments. The six major assessments are now submitted to Taskstream across TPs 2.4.aj. Other program requirements were removed from Taskstream but maintained in the program. Candidate survey data highlighted the need for timely feedback from dissertation chairs in order to progress to program completion. The faculty decided that each faculty member would chair no more than six dissertations at a given time. (One faculty member was chairing more than six candidates; the department allowed her to work with those previously assigned but did not assign her new candidates, to adhere to this limit.) The decision to limit the number of dissertations that a faculty member can chair is expected to result in timely feedback to candidates. Faculty determined that three weeks is a realistic period in which to provide candidates with feedback, and this expectation is posted in syllabi. The candidate persistence rate is 88.5% for 2013-2014. Since the analysis of SLOs and the alignment of standards, the department has improved its scores on its ability to document and use assessment for continuous improvement 2.4.ak.

Another major area for program improvement, as expressed by candidates, is the need for a more systematic way of preparing candidates for the dissertation process. As a result, the Graduate School (GS) collaborated with the Ed.D. program director and faculty to revise the dissertation section of the Graduate Student Handbook 2.4.al to provide greater detail about preparing the dissertation. A dissertation submission timeline is provided to candidates to guide them in preparing and submitting the dissertation 2.4.am. The department revised the Ed.D. Program Manual to reflect procedural expectations of both candidates and committee members. The department also revisited the advisory process, aligning candidates with faculty who share common research or employment interests, and now encourages candidates to ask qualified individuals outside the department to serve on the dissertation committee 2.4.an. The 2011-2012 institutional advisement survey showed that 78.6% of candidates were satisfied with the advisement they received as doctoral candidates. The fall 2013 advisement survey results reflected a satisfaction rating of 90.5% (5.43/6.00), and there is an 86% (5.16) candidate satisfaction rating on advisement for spring 2014.

Meetings with candidates, analysis of dissertation evaluations (including revisions required of candidates' documents by the Institutional Review Board (IRB) and the GS editor), comments during the dissertation oral defense, comments on early evidences, and faculty reflections on the process showed a need to assist candidates in strengthening their research skills. A Dissertation Boot Camp was implemented to provide hands-on assistance to candidates in the dissertation phase 2.4.ao. Core courses now reflect an increase in faculty-led, targeted, monthly workshops and an increase in research-based writing assignments, literature reviews, and application of research methods. Meetings with doctoral candidates each semester provide an avenue for information and feedback, and greater interaction with doctoral candidates facilitates continuous improvement of candidate performance and program quality 2.4.ap. Candidates expressed appreciation for these initiatives designed to assist them in the completion of quality dissertations 2.4.aq. In fall 2014, an analysis of the feedback from the GS editor resulted in an appeal from the department for an editor to be assigned to doctoral candidates prior to the submission of the dissertation for defense 2.4.ar.

Based on candidates' surveys, boot camps, and reflections from past years, continued efforts to strengthen the dissertation readiness of candidates resulted in a change in the Comprehensive Exam format from a common content-based assessment to an individualized, advisor-led, dissertation-focused mini-proposal with an emphasis on research methods and a literature review 2.4.as. The Dissertation Boot Camp sessions resulted in candidates from early cohorts reestablishing connections to the program. In 2012, faculty conducted a review of cohorts and determined to reclaim all-but-dissertation (ABD) candidates back into the program 2.4.at. ABDs have reenrolled, indicating the value of this activity, though some ABD candidates have not been retained. The approval of EDLE 999 as a 3-credit course with the official dissertation advisor as the professor of record was designed to reduce the number of ABDs and assure continuity of the dissertation process.

The faculty determined that admitting an annual cohort affected the time each faculty member can devote to candidates during the dissertation process (especially since ABDs were being encouraged to return to the program) and decided to conduct biennial admission. The expected ripple effect of biennial admission is increased graduation and persistence rates, while simultaneously reducing faculty course overloads emanating from teaching core content courses to two different cohorts. In fall 2013, DPI required a remodeling of all state doctoral programs that lead to the superintendent's license. The FSU Blueprint was submitted to DPI on February 14, 2014, as requested 2.4.au. All blueprints across the state will be reviewed by teams of faculty for final approval by the SBE and, once approved, will be implemented in fall 2015.

The Ed.D. faculty continue to focus on collecting, analyzing, and using assessment data for improving candidate performance and program quality. Since improvement strategies were implemented, faculty members conduct an annual review of achievement on these measures and of TP III data to strengthen the program. Data emanating from Taskstream assessments, candidates, alumni, faculty, dissertation evaluations, enrollment process reviews, and institutional evaluations continue to provide avenues for review. CIR data across six measures (program SLO achievement, advisement results, persistence rate (retention plus graduation), OPAR evaluation, SCH/FTE ratio, and teaching evaluations submitted by candidates) reflect an improvement in program results over the past three years 2.4.av.

Finally, the doctoral program has admitted candidates in cohorts, historically with more candidates admitted to the K-12 cognate than to the Higher Education (HE) cognate. Currently, as candidates are admitted, admission to each cognate 2.4.aw is closely monitored so that, even within cognates, the candidates can function as a true cohort and assist one another through the process. Recruitment procedures, primarily the responsibility of the program director, have changed to a department-wide strategy involving faculty, staff, and candidates. Fall 2013 recruitment numbers increased.
