If you work in Institutional Analysis, Institutional Research, or similar, you know getting your five-year plan approved can often be more of a marathon than a sprint. There are consultations (and consultations, and consultations…), new directions, and almost always a surprise or two. For good measure, the world added a pandemic's worth of behavioural changes. Oh, and the lead analyst on this project also happened to leave, so now it's your turn to try your hand at predicting the future.
I’ve been both the joining and leaving analyst, so I can empathize!
If you’re not familiar, here’s an idea of how the process works at many academic institutions.
At Queen's University in Ontario, Canada, enrolment targets and projections are developed by the Strategic Enrolment Management Group (SEMG), which includes the "Provost, Vice Provosts, Deans, faculty members, recruitment, admissions, and budget office representatives." The SEMG reviews data on wider regional and sector trends, provincial policy, and faculty and school enrolment plans. They also meet with each Dean to review priorities, applicant demand, capacity, and related issues. Targets are set on a three-year rolling cycle, with the first two years submitted for approval and the third year submitted as information; SEMG can recommend changes to previously approved targets if required. SEMG submits its recommendations to SCAD (the Senate Committee on Academic Development and Procedures), which reviews and recommends targets to Senate for final approval.
We had a similar process for approval when I worked at the University of British Columbia and Simon Fraser University.
I've found each institution has its own preferences for how hands-on individual Deans and their teams can be in the process. At some institutions, it is highly collaborative: each Dean can use an enrolment forecasting tool to provide input into what may happen, with the tool producing projections. At other institutions, the process is centralized in Institutional Analysis; each Dean still gets input, but they don't need to run the models. There's almost always some horse-trading that takes place. One faculty may have additional members on sabbatical, another may be recovering from an enrolment bulge in previous years, and a third may be launching new programs.
My preference is the collaborative approach. I believe the enrolment plans we produced were better because we were able to collaborate, and as a bonus I got to learn a ton about the constraints faculties operate under, how they support the academic mission, and how changes one faculty makes can impact another. My thanks to all the Associate Deans and Finance Directors who collaborated to improve the process.
It's not always obvious that Institutional Research and Analysis professionals can spend the majority of their year working on these processes. In many cases, they serve a facilitation role: running models on behalf of their stakeholders, or helping showcase what changes when a faculty revises its plans or an admission average drops. They're also responsible for sourcing the data and for maintaining and improving the model.
They may also be corralling dozens of scenarios to eventually arrive at the final plan presented to Senate or its equivalent. While the buck ultimately stops with a Provost or similar, the analyst must make sure their model is sound, free of errors (human or otherwise), a reasonable predictor of the future, explainable to users who question it, justifiable, and robust enough to adapt to changing circumstances.
I’ll end this post with a few lessons learned from my own past that may help you:
Involve more than one staff member in your process. If your lead analyst gets a new job or wins the lottery, you don’t want to be left high and dry.
Build a solid quality assurance process so that mistakes don't propagate through to your final plan. Be on the lookout for odd results: zeros or massive growth are cause for suspicion and deeper investigation.
Spend time crafting your communication strategy. Few things will torpedo your efforts faster than people not understanding your modelling and therefore not believing it.
Be wary of relying on small samples. For example, if your program has 10 people and grows to 12, it's unlikely to repeat that 20% growth rate forever. You're far better served by rates based on more aggregated groups like program type, length, faculty, and so on.
When your work finally gets approved at Senate, be proud of your contribution to an important process, and take a moment to reflect on what went well and what you’d improve. The next cycle will be here before you know it.
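To make the small-sample point concrete, here's a minimal sketch in Python. The program names and headcounts are invented for illustration, but the arithmetic shows why a program-level growth rate can swing wildly at small sizes while the aggregated rate stays stable:

```python
# Hypothetical headcounts for three programs in the same faculty.
# All names and numbers are made up for illustration.
programs = {
    "Program A": {"2023": 10, "2024": 12},   # +20% on a base of only 10
    "Program B": {"2023": 45, "2024": 44},
    "Program C": {"2023": 120, "2024": 123},
}

def growth(prev, curr):
    """Year-over-year growth rate."""
    return (curr - prev) / prev

# Program-level rates are noisy: one or two students move the needle a lot.
for name, counts in programs.items():
    print(f"{name}: {growth(counts['2023'], counts['2024']):+.1%}")

# Aggregating first gives a far more defensible rate to project with.
total_prev = sum(c["2023"] for c in programs.values())
total_curr = sum(c["2024"] for c in programs.values())
print(f"Faculty overall: {growth(total_prev, total_curr):+.1%}")
```

Here Program A's +20% is driven by just two students, while the faculty-level rate works out to roughly +2%, a much sounder basis for a multi-year projection.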