Institutional effectiveness is the term used at Eastern Mennonite University to refer to the collection of processes and policies that together assure that the university is successful in achieving its mission and vision and meeting the needs of its stakeholders (students, employees, and the churches and communities it serves). Institutional effectiveness might just as easily be viewed as a widely held commitment to continuous improvement on the part of all areas of the university.
Planning at EMU takes place at a variety of levels within the organization. All planning simultaneously informs and is shaped by the university’s strategic plan. A formal strategic planning process is undertaken every three years and concludes with an updated three-year plan for the institution. The three-year plan is adjusted as needed between major updates. Each year, departments and programs within the university develop or revise their own multi-year plans, all of which are linked to the priorities of the university’s three-year plan.
Assessment efforts at EMU are built around student learning outcomes and operational outcomes that academic programs and departments articulate. Each year, faculty and staff review data according to their own assessment cycles. The findings from assessment work inform plans for growth and improvement. Additionally, program review cycles provide an opportunity for more in-depth review of program effectiveness, viability and future direction.
Institutional Effectiveness Policy
To ensure EMU’s commitment to the principles of continuous improvement, based on a systematic and documented process of assessing institutional performance with respect to mission in all aspects of the institution, the university as a whole and its various programs and services engage in the following processes:
- The university maintains a strategic plan that articulates priorities aligned with its mission, vision and values.
- The university annually develops an operating budget which is approved by the board of trustees.
- All operational units (i.e., programs, departments, offices, etc.) maintain annual plans that are aligned with the university’s three-year priorities.
- All academic programs, and academic and student services identify expected outcomes, assess the extent to which these outcomes are achieved, and provide evidence of seeking improvement based on analysis of assessment results.
- All administrative support services identify expected outcomes and demonstrate the extent to which the outcomes are achieved.
- The university engages in an annual review of its performance vis-à-vis the goals of the strategic plan as well as the specific goals of the various operating units.
These processes inform decision-making at all levels and provide a sound basis for budgetary decisions and resource allocations.
Guiding Principles for Institutional Effectiveness
Lest they devolve into bureaucratic exercises, EMU’s institutional effectiveness processes should be participative, flexible and responsive, relevant, and closely linked to university planning. To this end, we strive to maintain an institutional effectiveness framework that adheres to these principles:
- Sustainability - Our institutional effectiveness processes must be sustainable at the department/program and institutional levels (i.e., they can be maintained at a high level of quality with the resources available)
- Focus - Focus is important, and authentic improvement takes time; the framework should accommodate a variety of improvements that will be at a variety of points in the process at any given time (i.e., we needn’t attempt to improve all aspects of our programs simultaneously)
- A Spirit of Inquiry - The system should foster a spirit of inquiry and continuous improvement, not compliance; it should serve as the means by which we hold ourselves accountable to our own aspirations. Annual focus should be on straightforward, key questions:
- What were your outcomes for this year? (Which ones were you focusing on?)
- How were you measuring those outcomes?
- What were your goals with regard to those measures?
- How did you do at achieving your goals, and what will you do differently in the future in light of that?
- Support - The campus should be supported in its assessment work through professional development, training and one-on-one help; in the academic areas, institutional effectiveness/assessment should be tightly linked to faculty development
PACE - Planning & Assessment Cycle at EMU
PACE is the process that facilitates the implementation of the university’s institutional effectiveness policy. The PACE process relies on a web-based system, Strategic Planning Online (SPOL), that tracks planning objectives, division- and institution-level priorities, student learning outcomes, operational outcomes, and much of the data associated with assessment of these elements.
PACE - Planning
University Strategic Plan
EMU uses a three-year time horizon for its strategic plan, making minor adjustments as needed for the plan’s second and third years based on campus input and environmental considerations.
Strategic Planning Process
The three-year plan is developed via the focused work of a representative six- to eight-member task force over a six- to nine-month participatory process. The task force gathers input from key campus constituencies via surveys, focus groups and interviews. This input is combined with an environmental scan in order to develop university priorities for the next three years.
The strategic planning task force is composed as follows:
- Faculty senate representatives (x2)
- Staff advisory council representative
- Staff supervisory council representative
- Student life representative
- Student government association representative
- VP for Institutional Effectiveness & Planning, chair/facilitator
Though the specifics of the planning process will be adapted as needed for each cycle, the basic planning process may include all or some portion of the following steps:
- Initial constituent input (likely surveys and/or focus groups), including but not limited to: students (via SGA), faculty (via faculty senate), staff and administrators (via staff advisory council and staff supervisory council), parents (via parents council), alumni and donors (via targeted survey), board of trustees, and president’s second century advisory council.
- Task force synthesis and environmental scan - task force assembles constituent input, articulating key themes; members also review high-level information regarding patterns and trends in EMU’s operating environment (local, state and national trends, changes in Mennonite constituencies, etc.)
- Processing with president’s cabinet - president’s cabinet reviews key themes in focused planning session, providing task force with input to shape emerging priorities
- Task force develops preliminary priorities - based on constituent input and president’s cabinet guidance, the task force articulates preliminary priorities
- Board of trustees review of preliminary priorities - Board of trustees reviews preliminary priorities and offers guidance to further refine and shape the plan
- General campus comment period - The preliminary priorities are posted for the campus community to review and comment; feedback forwarded to task force; group discussions hosted as applicable (e.g., staff advisory council, faculty assembly, SGA, staff supervisor’s council, etc.)
- Task force finalizes strategic plan - Task force assembles all feedback and finalizes the plan’s priorities
At the conclusion of the process, the task force forwards its recommended strategic plan to the president’s cabinet for approval. Finally, the plan is approved by the board of trustees.
Strategic Plan Implementation
Upon approval, each of the plan’s priorities is assigned to a member of the president’s executive team who will serve as its point person. As point person for a priority, the executive team member will coordinate efforts across the university in support of the priority (in some cases, in conjunction with another member of the team). The point person is responsible for developing a measurement and evaluation plan for the priority (metrics, data collection plan, etc.) in collaboration with the institutional research office.
Each year the point person is responsible for preparing progress reports on the priority for the president and board of trustees. These progress reports are often incorporated into the regular (three times per year) reporting included in the docket of materials for each board of trustees meeting. At a minimum, annual reports on each of the strategic plan priorities are completed over the summer, distributed to campus early in the fall semester, inform the president’s annual report, and are included for review by the board of trustees at its November (annual general) meeting.
Three times each year, usually at the conclusion of the fall, spring and summer terms, the president’s cabinet also formally meets to evaluate progress on the priorities and to consider whether adjustments to the plan are necessary based on internal or environmental considerations. Any adjustments to the plan are shared with the campus community at the opening of the fall semester.
In the third year of the plan, the president’s cabinet convenes a new strategic planning task force to begin preparing the next three-year plan.
Academic Program and Administrative Department Planning Timeline
Each year, individual academic programs and administrative departments engage in a localized planning process that considers two academic/fiscal years simultaneously and consists of the following components:
- Fall: refine and sharpen plans for the year underway
- Spring: develop high-level plans and associated program/curriculum, staffing, and/or budget requests for the next year in conjunction with the budget process facilitated by the VP for Finance
- Spring/summer: evaluate progress on plans for the year coming to an end; use findings to refine/sharpen plans for the year beginning in fall
Program- and department-level plans are, whenever possible, directly linked to university strategic plan priorities in the planning system (SPOL). Point persons for strategic priorities are responsible for reviewing and coordinating efforts across the university linked to their priorities.
During the academic year, supervisors of academic program directors and administrative unit heads check in on a regular basis regarding the status of the planning objectives of the various academic and administrative units. These updates can be recorded in SPOL to facilitate end-of-year and strategic planning reporting.
Each summer, deans and VPs review progress reports from their academic programs and administrative departments and provide feedback to inform future direction. VP review of program/department progress reports also informs the PACE Executive Summaries (described further below) they prepare for the president and for the board of trustees annual general meeting each November.
See Appendix 1 for a list of EMU's planning units as identified for PACE planning purposes.
Annual Budget Process
Each year in December the chief financial officer kicks off the formal process for development of the university operating budget and capital expenditure priorities for the next fiscal year. Prior to this formal launch, academic programs and administrative departments can begin preparation of budget requests and capital requests. These requests will likely have emerged through the program’s/department’s local planning processes, and can be submitted via the planning system (SPOL).
The basic sequence of the annual budget process is outlined below:
- Programs and departments develop budget and capital requests independently, based on their local planning; these requests are submitted via the SPOL planning module
- Core baseline budget parameters established
- Budget worksheets distributed to VPs and deans
- Capital budget meetings with primary stakeholders
- Budgets returned to the VP for Finance
- Presentation of preliminary budget outlook to the board of trustees
- Revisions to preliminary budget; enrollment adjustments
- Final budget worksheets distributed to VPs and deans
- Review by the board of trustees Institutional Resources & Sustainability Committee
- Final operating and capital budget based on actual enrollment
- Preliminary “final” budget loaded into the ERP for reporting and tracking
- Final approval by the board of trustees Institutional Resources & Sustainability Committee
PACE - Assessment
Outcomes Assessment is the means by which the university’s programs and departments hold themselves accountable to their own aspirations. The SACSCOC principles of accreditation offer helpful guidance with regard to the scope and nature of the university’s assessment efforts.
Academic programs maintain student learning objectives that describe the knowledge, skills, and attitudes/responsibilities/dispositions their graduates will demonstrate upon program completion. Academic and student services identify outcomes of their work that support student success. Section 8 of the 2018 SACSCOC principles articulates the following requirements:
8.2 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of seeking improvement based on analysis of the results in the areas below:
- Student learning outcomes for each of its educational programs
- Student learning outcomes for collegiate-level general education competencies of its undergraduate degree programs
- Academic and student services that support student success
Particularly important in the SACSCOC language above is the requirement that programs “[provide] evidence of seeking improvement based on analysis of … results.” The PACE process and the tools of SPOL are designed to scaffold this work and support departments in maintaining relevant data and documenting their efforts.
Administrative departments must likewise articulate the expected outcomes of administrative support services. Section 7 of the 2018 SACSCOC principles articulates the following requirements:
7.3 The institution identifies expected outcomes of its administrative support services and demonstrates the extent to which the outcomes are achieved.
It is worth noting that the SACSCOC principles do not require that administrative departments provide evidence of seeking improvement. Administrative support services may find success in articulating standards of performance/service (outcomes) and consistently achieving them. Likewise, administrative departments may demonstrate compliance with SACSCOC expectations for outcomes assessment via well-designed planning objectives with clear outcomes (via PACE planning) and/or performance-focused outcomes (via PACE assessment).
See Appendix 1 for lists of EMU’s academic programs and services and its administrative departments as identified for PACE assessment purposes.
Assessment Timeline and Data Collection Cycles
The outcomes assessment work of academic programs and administrative departments will generally take place late in the academic/fiscal year. By design, this work should be completed prior to taking up planning work for the upcoming and future years and can inform budget requests and planning objectives for the program or department. A basic timeline:
- Fall Semester - Collection of data from fall courses/activities as per program/department assessment cycle requirements; survey administration per university survey calendar
- Spring Semester - Collection of data from spring courses/activities as per program/department assessment cycle requirements; survey administration per university survey calendar
- April - September - Window for data analysis and outcomes assessment reporting in SPOL
- Summer - Collection of data from summer courses/activities as needed
- August - September - Develop assessment follow-up planning objectives for upcoming year to seek improvement in student learning on the basis of assessment findings
It is important to note that academic programs and services and administrative departments need not collect and analyze data for each of their outcomes every year. A program/department may develop a multi-year (ideally 2-4 year) cycle over which it assesses each of its outcomes. Such a cycle allows for data collection over multiple years and allows programs/departments to give more focused attention to a subset of their outcomes in any given year. Beginning with the rollout of the updated academic program review cycle in 2020-21, academic programs are expected to have collected and analyzed data for each student learning outcome at least twice between each program review. See below for further details on program review.
Further, in order to ensure the sustainability and validity of assessment, small academic programs may opt to extend assessment data collection over several years in order to achieve adequate sample sizes to support analysis. Programs that opt for this approach should collect data for all outcomes each year, but may still focus on analysis of results for a subset of student learning outcomes each year.
PACE - Executive Reporting
At the conclusion of each academic year (usually over the summer and the early fall), deans and vice presidents prepare a PACE Executive Summary for the school or division. The executive summary is based on the dean’s or vice president’s review of their program/departmental activities (i.e., planning and assessment results). The 2-5 page summary addresses the following points:
- Strengths identified through review of the assessment results/data
- Challenges identified through review of the assessment results/data
- Report on changes made based on analysis of assessment results
- Key accomplishments in the school or division identified through review of planning results
In addition to the PACE Executive Summaries prepared by VPs and deans, the VP for Institutional Effectiveness annually prepares a University-Wide PACE Report. This report highlights:
- Program-level and departmental participation in PACE activities
- Assessment methodology utilization
- Program and departmental demonstration of “seeking improvement based on analysis of data”
Academic Program Review
In addition to the PACE process, all academic programs that do not undergo an external accreditation review are reviewed on a six-year cycle (adapted as needed to specific program considerations). See Appendix 2 for more details on the review process. See the Program Review Cycle for details on when each program is reviewed. This comprehensive review is conducted by a faculty task force, is overseen by the provost council, and includes consideration of:
- Academic program design and success in achieving student learning outcomes
- Program efficiency, as measured by performance indicators such as student-faculty ratio and program cost per student credit hour
- Future potential, as determined by external forces such as market demand
- Other key factors identified by the provost council or the review task force
The review process consists of the following steps:
- Self-study prepared by the program cluster
- Peer review by a team of internal and external reviewers
- Response (including action plans, budget implications and timelines for new initiatives, and follow-up via annual PACE reports)
Possible outcomes of the review are:
- Program continues as-is
- Establish goals for improving quality or efficiency with no new resources
- Invest to improve the program’s capacity, quality, or efficiencies
- Restructure the program, merge its activities with another program, or discontinue
Appendix 1 - EMU Planning Units and Programs for PACE
For the purposes of institutional effectiveness work, EMU organizes its academic programs, academic and student services, and administrative support services into planning units and academic programs. The configuration of these programs and units largely aligns with the university's Organizational Chart. However, in some cases (due to uniqueness in focus or scope of impact) organizational units are excluded from PACE processes. For example, PACE processes exclude academic and public service centers that (a) do not have associated academic programs, and (b) have a small scope of influence within the university as evaluated by considerations such as direct impact on student-facing educational activities, external focus, fiscal footprint, and/or governance.
The tables below outline the organizational units (i.e., academic programs, academic and student services, and administrative services) that are incorporated into the PACE processes.
Undergraduate Planning Units and Academic Programs
- Biology & Chemistry
  - AS Health Science
  - BS Clinical Lab Science
  - BS Environmental Science
  - BS RN to BS Nursing
  - BS Computer Science
- Visual and Communication Arts
  - BA Digital Media and Communication
- Bible, Religion & Theology
- Applied Social Sciences
Graduate Planning Units and Academic Programs
- Doctor of Nursing Practice
- Eastern Mennonite Seminary
  - MA Church Leadership
  - Master of Divinity
- Center for Justice & Peacebuilding
  - MA Conflict Transformation
  - MA Restorative Justice
  - MA Transformational Leadership
- Organizational & Leadership Studies
  - MA Organizational Leadership
  - Master of Business Administration
Academic and Student Support Units and Administrative Support Units
- Academic Success Center
- Admissions - Undergraduate
- EMU at Lancaster
- Graduate & Professional Recruitment
- Marketing and Communications
- Safety and Security
- Washington Community Scholars' Center
Appendix 2 - Academic Program Review Process Details
- Review cycle: review each program approximately every six years (program review cycle). The cycle is adapted to address specific program considerations as needed. This review is intended only for programs that don’t go through review by an external accrediting body.
- Cost: $500 + reimbursement of expenses to two (or more, if a program cluster has several disparate programs that require experts from different discipline areas) external reviewers.
- Loading: The program director will receive 1+ load hour release to complete the self-study report, host the campus visit, and complete the response report. The internal peer reviewer will be offered some measurable reduction in workload during the academic year when the review is conducted (such as release from a committee or release time).
- Typical Review Schedule:
- March of year preceding review: Deans notify program directors of upcoming review. At this point the deans and program director identify a list of benchmark/aspirational programs for which data and curricula will be gathered.
- April-May: IR team and library prepare a packet of data and information based on the benchmark program list.
- May: Programs up for review meet with their respective dean to outline tasks and timeline, review available data, determine the key questions for reviewers to address, and submit names of potential external reviewers.
- July: Deans select and contact peer reviewers (typically two external and one internal per program cluster).
- August-December: Program cluster self-study completed and sent to peer reviewers. (Likely, much of the self-study could be compiled from PACE reports.)
- February or March: Peer reviewers visit campus and review programs. Review should include feedback solicited from current students and alumni; when appropriate, a focus group of practitioners of the field would be formed to provide additional feedback.
- April: Peer reviewer report due.
- May: Program cluster reads and responds to peer review report; the response will likely include an action plan (including budget implications and timeline for any proposed new initiatives) that can be incorporated into the PACE process.
- September of year following review: Program cluster meets with their dean to discuss the action plan and other follow-up items.
The evaluation team will typically consist of two external peer reviewers within the discipline(s) of the program cluster to be reviewed and one internal peer reviewer. Ideally, one external reviewer will be from an institution that is comparable to EMU and a second reviewer will be from an “aspirational” institution. The internal peer reviewer will typically be a tenured faculty member from a different program cluster and will be offered some measurable reduction in workload during the academic year when the review is conducted (such as release from a committee or a dean’s hour). The program cluster should contact nominees for the evaluation team prior to submitting names to determine interest and availability. The program cluster should submit the names of at least three potential external reviewers, along with vitae, to the dean, with additional comments or a ranking if desired.
The program should also identify an upper-level student currently enrolled in the program to support the work of the evaluation team. The student will assist in gathering input from current students in the program (likely by assisting with focus groups) and will meet with the evaluation team at least once to synthesize student input into the final review report.
Outline of Academic Program Self-study Report
A. Academic Program
- Describe the program.
- List the majors and minors (if undergraduate) or programs and certificates (if graduate).
- Provide the program cluster’s mission statement.
- Describe how the program supports the mission of the university.
- Describe how the program prepares students for career pathways and/or further study.
- For undergraduate programs, describe how the majors support the liberal arts within the university.
- Note how courses in program cluster majors are interwoven with general education requirements.
- Discuss courses offered as “service courses” for the liberal arts curriculum, and note the typical enrollments in these courses. [IR will provide summarized data for recent years regarding service course enrollments.]
- Describe how the program interacts with other majors.
- Describe any admission-to-program requirements for students.
- Provide a list of required courses for the majors/programs and note any recommended electives.
- List the student learning outcomes, along with the measures and criteria/benchmarks for each major or program and any minors or certificates that are not “miniature majors/programs.”
- Create or review and update the student learning outcomes, curriculum map, and assessment plan for the current curriculum. [IR can provide guidance on creating and revising SLOs, curriculum maps, and assessment plans.]
- Describe how the program advances the university DEI goals.
- List activities and ways the program contributes to making the university a desirable place to learn and work and to increasing a sense of belonging for all students, faculty, and staff.
- Describe how you systematically collect and analyze data about DEI and use it to assess the program’s ability to support the needs of the university’s increasingly diverse student population.
- Describe what efforts have been made to support faculty development to implement student-centered pedagogies that aid faculty in teaching to an increasingly diverse student body and developing courses that include the voices of BIPOC and LGBTQIA+ people.
- Assess the structure and content of the program in comparison to current practices in the discipline.
- Reflect on the future potential of the program, as determined by external forces such as market demand.
- Locate the program’s growth trajectory within the context of other similar/competitor programs using IPEDS completions data [IR will provide this information]
- Analyze the instructional and informal environment in the program cluster.
- Assess the amount and quality of contact between students and faculty. [IR will provide the most recent results of the Student Satisfaction Inventory (SSI) and/or Adult Student Priorities Survey (ASPS) for the program cluster’s majors/programs, including data on satisfaction with advising.]
- Describe how support, collaboration, and cooperation among students is encouraged.
- Assess how active learning experiences are encouraged.
- Present a table listing academic credentials, areas of expertise, courses taught, advisee load, years of service, and academic rank of each faculty member of the program cluster. [IR will provide an Excel file with five years of history of faculty teaching in the program cluster, including rank, degrees, advisee load and years of service.]
- List faculty research and performance activities in the last five years.
- Discuss any perceived gaps in faculty expertise or limitations on the number of faculty members necessary to fulfill the program cluster’s goals and courses within the majors/programs.
- Describe the use of adjunct faculty to teach courses in the major or program. [IR will provide a five year history of course offerings, including whether courses were taught by ongoing or part-time/adjunct faculty.]
- Describe the primary faculty service involvements within the university.
- Describe how faculty service meets program needs and contributes to the mission of the university.
- Present a table listing the current number of students within each major and minor in the program cluster. Comment on whether these numbers are increasing, decreasing, or holding steady over the past 10 years (or so). [IR will provide a 5-10 year history of majors and minors within the program cluster.]
- Provide the number of degrees awarded in the past five years. [IR will provide a 5+ year history of the number of degrees awarded.]
- Describe post-baccalaureate or post-program completion pursuits of graduates of the program cluster. [IR will provide overall survey results from the most recent alumni survey for the program cluster’s graduates.]
- Provide information on the cost of the program per student credit hour. [IR will provide data, including benchmarks when available.]
- Assess laboratory equipment and facilities, studio facilities, etc.
- Evaluate external resources utilized by the program cluster: grants, contracts, service fees, other.
- Evaluate technology available.
E. Summary and Questions
Provide any concluding statements that may be helpful for the review team, along with any specific questions that you would like the team to consider in their evaluation.
Academic Program Review Process revised and approved by Provost's Council June 29, 2021