8/15/15 Academic Program Review: Considerations & Proposed Modifications


RATIONALE

Periodic comprehensive academic program review has traditionally been embraced by universities to help ensure strong academic program quality, drive appropriate resource alignment, and encourage program improvement. Despite these important objectives, because most reviews occur on a five- to seven-year cycle, programs can wait a relatively long time to have important discussions about program outcomes.

Recent debate and research have called into question the impact of these traditional reviews and the expense involved. The national conversation has produced two very different approaches to academic program review. One approach recommends reviewing all academic programs at the same time, with the goal of program prioritization and resource reallocation (Robert Dickeson, 2010). An alternative approach holds that traditional periodic reviews are outmoded in today's disruptive higher education environment and calls for more nimble strategies and metrics to take the pulse of academic programs. Research supporting this option points out that traditional program review requires significant time, effort, and resources, yet yields results that are "predictable and unhelpful and typically have little impact on important resource allocation or performance improvement decisions" (EAB 2012, p. 12).

Given these alternative scenarios, RIT should consider re-imagining its program review process and projected calendar. Although RIT's Comprehensive Program Review Framework was approved by Academic Senate in 2010, it has not yet been formally implemented; in fact, it was placed on hold until 2015 so that the University could complete its work on calendar conversion. The review schedule is slated to be established during the 2014-2015 academic year. Such reimagining should include an annual program analysis process as the first step in the review continuum. While the overarching goal would remain the quest for program excellence and student success, other factors (e.g., program size, revenue/expense) should be weighed as appropriate and relevant to the type of program.

RECOMMENDATION: Based on the above considerations, the Provost's Council recommends that RIT develop a three-tiered program analysis and review process that is more agile, less resource intensive, and focused on an end goal of program improvement, reinvention, and revitalization.

This process would include three tiers of analysis.

TIER 1: (Applies to every program, every year)

Tier 1 would be fashioned as an Annual Program Analysis Process: a nimble annual screening through which the dean, department head, and Provost examine program performance on a set of key agreed-upon metrics that fall into one of four categories: Enrollment, Student Learning Outcomes, Student Success, and Revenue/Expense (see Appendix A1, A2, A3).

- Data will be generated centrally on an annual basis and distributed to each college.
- Data will be reported in absolute numbers or percentages with a comparison to institutional goals or college averages. Where possible, benchmarks will be established and adjusted as appropriate after each three-year trend (see Appendix A).
- While all metrics will be taken together and reviewed holistically, declining trends in enrollment and revenue/expense may trigger further analysis.
- Analysis would occur in the October-November time frame.
- Actions taken following this analysis would occur in December and would fall into three categories:
  - Category 1: No perceived red flags; no further actions or analysis needed.
  - Category 2: Specific metrics (particularly Enrollment and Revenue/Expense) are trending in the wrong direction and raise a "heads up" concern; a Tier 2 action plan, including a deeper dive into further data, may be required. Input from Enrollment Management on the program's market share and competition, as well as its future recruitment potential, will be sought in Tier 2 and Tier 3 analysis.
  - Category 3: Serious issues about program viability emerge from longitudinal analysis; a Tier 2 or Tier 3 review is recommended based on discussion between the program, dean, and Provost.

TIER 2: (Applies to programs flagged through Tier 1 analysis)

Tier 2 review typically would require a deeper-dive analysis, including a response from the program. The current RIT Framework for Comprehensive Program Review (2010), without the external review component, may also be recommended. Should the Comprehensive Program Review approach be warranted, the self-study would be written from January to May and the review would occur the following Fall.
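For illustration only, the Tier 1 screening logic described above can be sketched in a few lines of code. The 10% decline threshold, the data shapes, and the category rules below are assumptions made for this sketch; the actual process is a holistic discussion among the program, dean, and Provost, not a formula.

```python
# Illustrative sketch of the Tier 1 annual screening step.
# The threshold, inputs, and rules are assumptions, not the approved framework.

def three_year_avg_change(values):
    """Average year-over-year fractional change across three annual values."""
    changes = [(b - a) / a for a, b in zip(values, values[1:])]
    return sum(changes) / len(changes)

def tier1_category(enrollment, net_revenue, decline_threshold=-0.10):
    """Return 1 (no red flags), 2 ("heads up"), or 3 (viability concern)."""
    declining_enrollment = three_year_avg_change(enrollment) <= decline_threshold
    deficit = net_revenue[-1] < 0
    if declining_enrollment and deficit:
        return 3  # Tier 2 or Tier 3 review recommended
    if declining_enrollment or deficit:
        return 2  # Tier 2 action plan / deeper dive may be required
    return 1      # no further action or analysis needed

# Headcount falling roughly 15% per year with a growing deficit:
print(tier1_category([100, 85, 72], [5000, -2000, -8000]))  # prints 3
```

In practice the trigger metrics would come from the centrally generated annual data, and a flag would start a conversation rather than an automatic action.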

TIER 3 (Optional): (Applies to programs flagged through Tier 2 analysis and recommended for comprehensive review, and to programs voluntarily choosing the comprehensive approach)

The Comprehensive Program Review (2010) approach with an external team of reviewers would be used only if the College and/or the Provost recommended this option. In no case, however, would RIT be expected to have more than 10 programs undergoing Tier 3 Comprehensive Review in any academic year.

For further consideration as this model is rolled out:

- The design and cost model for Ph.D. programs are so different that a separate tiered approach will need to be developed for them moving forward.
- Graduate programs will need to review the Tier 1 metrics to determine whether they are the right ones, keeping in mind that the metrics should not be overly burdensome and should be able to be generated centrally.
- The process recommended here for Tier 1 does not directly address program/departmental operations and faculty quality. Is assessment of these areas satisfactorily addressed at the college level through the normal ongoing efficacy assessments that occur? Is any additional review needed?
- What role should or could the ICC and Graduate Council play in Tier 1 analysis? Graduate Council, in particular, is very interested in playing a role.
- This will be an iterative process, and we expect to fine-tune the metrics and benchmarks as we gain experience. For example, the deans have raised the following questions:
  - Where does interdisciplinarity fit into the model?
  - What benchmark should we use for institutional or graduate school placement rate goals and job placement goals?
  - Do we have the right metrics? How do we communicate a formative tone and ensure that the process gives faculty important information on their curriculum so that steps can be taken to mitigate any continuing red flags in the metric domains?

Following the first year, we will debrief as a leadership group to assess whether this Tier 1 process is achieving its intended outcomes and what changes need to be made.

8/15/2015 Academic Program Analysis: Undergraduate

Appendix A1: Tier 1: Undergraduate (Main Campus Only)
[Metrics reported in #'s and/or %'s over a 3-year period]

Trigger Metrics: Enrollment
- Student Headcount (FT & PT): First Time; Transfer; Overall Enrollment
- Internal Transfers
- Continuing Students
  Benchmark: Overall Headcount (3-yr average): BS/BA 30; AAS/AS/AOS 15; CT/DP/UND 7
- Overall Headcount Change (3-yr average)
  Benchmark: Stable enrollment trend: BS/BA 10% decline (3-yr average); AAS/AS/AOS 10% decline (3-yr average); CT/DP/UND 10% decline (3-yr average)

Supporting Metrics: Learning Outcomes
- Program Learning Outcomes Assessment Results
  Benchmark: Met or exceeded program's achievement benchmark
- Program Improvement Results
  Benchmark: Used assessment results for program improvement (Y or N)

Supporting Metrics: Student Success
- First Year Retention
  Benchmark: Met or exceeded RIT goal
- Graduation Rate (First Major): 150% of Program Time; 100% of Program Time (On-Time Rate)
  Benchmark: Met or exceeded RIT actual rate; met or exceeded RIT target goals
- Career Outcomes (6 months after graduation): Employment; Further Study; Alternative Plans
  Benchmark: Met or exceeded RIT goal

Trigger Metrics: Revenue/Expense
- Program Net Surplus/Deficit* (College overhead costs only)
  Benchmark: Positive net revenue position

* Surplus/(Deficit) at College Cost Responsibility: Net tuition revenue from students matriculated in each program less the cost of instruction for all credit hours consumed by these same students, including only expenses within the colleges' collective control.
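The starred surplus/(deficit) definition is simple arithmetic: net tuition revenue from a program's matriculated students minus the college-controlled cost of instruction for the credit hours those students consume. As a sketch, with all figures invented for illustration:

```python
# Illustrative arithmetic for the surplus/(deficit) footnote above.
# All figures are invented; real inputs would come from central data.

def program_net_position(net_tuition_revenue, cost_per_credit_hour,
                         credit_hours_consumed):
    """Net tuition revenue less college-controlled instruction cost for
    all credit hours consumed by the program's matriculated students."""
    instruction_cost = cost_per_credit_hour * credit_hours_consumed
    return net_tuition_revenue - instruction_cost

# $1.2M net tuition, $450 instruction cost per credit hour, 2,400 hours:
print(program_net_position(1_200_000, 450.0, 2_400))  # prints 120000.0
```

A positive result meets the "positive net revenue position" benchmark; a negative result would count as a trigger metric trending in the wrong direction.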

8/15/2015 Academic Program Analysis: Graduate Programs

Graduate programs at RIT are a complex portfolio, and thus a single set of metrics applicable to all programs is not possible. At minimum, three classes of graduate programs should have specific sets of metrics:
- Professional degrees (MS, MBA)
- MFA
- Ph.D.

WORKING DRAFT
Appendix A2: Tier 1: Graduate (Professional Degrees and MFA) (Under Development)

Trigger Metrics: Enrollment
- Student Headcount: Full-Time; Part-Time
- Overall Enrollment Trend
  Benchmark: Stable enrollment trend: 10% decline (3-yr average)
- Application Yield/Acceptance Rate: Applied; Admitted; Enrolled
  Benchmark: x%
- Total Enrollment
  Benchmark: ?

Supporting Metrics: Learning Outcomes
- Program Learning Outcomes Assessment
  Benchmark: Met or exceeded achievement benchmark level (Y or N)
- Program Improvement
  Benchmark: Used assessment results for program improvement (Y or N)

Supporting Metrics: Student Success
- Persistence (F/T students)
  Benchmark: x% (3-yr average)
- Graduation
  Benchmark: # of graduates x (3-yr average)
- Time to Degree (F/T students)
  Benchmark: 1 ½ x published program length
- Career Outcomes (6 months after graduation): Employment (Overall; Related to Field of Study)
  Benchmark: RIT benchmark (3-yr avg)

Trigger Metrics: Revenue/Expense
- Program Net Surplus/Deficit* (College overhead costs only)
  Benchmark: Positive net revenue

* Surplus/(Deficit) at College Cost Responsibility: Net tuition revenue from students matriculated in each program less the cost of instruction for all credit hours consumed by these same students, including only expenses within the colleges' collective control.

8/15/2015 Academic Program Analysis: Ph.D. Programs

WORKING DRAFT
Tier 1: Ph.D. (Under Development)

Trigger Metrics: Enrollment
- Met Projected Enrollment (Y/N)
- Application Yield: Applied; Admitted; Enrolled
  Benchmark: ?%
- Average GRE/GMAT of Entering Grad Students
  Benchmark: x score

Supporting Metrics: Learning Outcomes
- Program Learning Outcomes Assessment
  Benchmark: Met or exceeded achievement benchmark level (Y or N)
- Program Improvement
  Benchmark: Used results for program improvement (Y or N)

Supporting Metrics: Student Success
- Persistence (F/T students)
  Benchmark: x% (3-yr average)
- Graduation
  Benchmark: # of graduates x% (3-yr average)
- Time to Degree (F/T students)
  Benchmark: 1 ½ x published program length
- Career Outcomes (6 months after graduation): Employment (Overall; Related to Field of Study)
  Benchmark: x% (3-yr average)

Trigger Metrics: Revenue/Expense
- Program Net Surplus/Deficit* (College overhead costs only)
- % of External Support
  Benchmark: 25% of ?

8/15/2015 Tier 2

Appendix B: Academic Program Analysis
Potential Deeper Dive Metrics for Tier 2/3 Analysis (Undergraduate and Graduate)

Enrollment
- Application Yield: Applied; Admitted; Enrolled
- Met Enrollment Projections (Yes or No; were quality students turned away?)
- % Receiving Financial Aid (?)
- Average UG GPA/GRE/GMAT of Entering Students
- Overall GPA of Graduates (?)

Student Success
- 2nd to 3rd Year Retention
  Benchmark: Met or exceeded RIT goal; represented as quartile for RIT
- 3rd to 4th Year Retention
  Benchmark: Represented as quartile for RIT
- Licensure Pass Rate, if appropriate
- On-Time Graduation Rate (to be developed per degree program)
- Graduate Programs: Top 10 employers; Fortune 500 employers
- Professional Recognition of Program's Graduates (awards, global visibility indicators)
- Leadership Positions of Program Graduates and Other Indicators of Success

Instructional Activity
- # of Sections Taught per FTE Faculty in Program Home Dept.
- FTE Students Taught per FTE Faculty
- # of Student Credit Hours Generated

Faculty
- Average Salary by Rank
- % with Terminal Degree
- Research Awards (amount per T/TT faculty)
- % of T/TT Faculty Who Are PIs

Revenue/Expenses
- Net Tuition Revenue
- Amount of External Funding

Appendix B (continued): Tier 2 Deeper Dive Metrics for Graduate Programs

Enrollment
- Time to Degree for Each Graduate

Graduate Programs:
- Placement in academic teaching jobs; top ten employers
- Professional recognition of program's graduates (e.g., awards in the arts, Pulitzer Prizes, professional organization awards)
- 3-year average of creative and scholarly activities/presentations at national and international conferences, etc.
- 3-year average of graduate faculty creative and scholarly/research endeavors
- 3-year average of graduate faculty professional and artistic recognition

Appendix B (continued): Tier 2 Deeper Dive Metrics for Ph.D. Programs

Graduate Programs:
- Top 10 employers
- Employer type distribution (academic, government, industry, other)
- 3-year average number of peer-reviewed publications by Ph.D. graduates (as senior author, coauthor)
- 3-year average number of presentations at national and international conferences
- 3-year average of competitive funding secured by faculty
- 3-year average of graduate faculty scholarly productivity
- Citation impact of Ph.D. student and Ph.D. faculty scholarship
- Graduate student awards (fellowships, grants, best conference papers, etc.)
- Professional recognition of Ph.D. graduates (national and global visibility indicators)

Appendix B (continued): Tier 2 Academic Program Analysis

Enrollment
- Annual #