The dual teaching/research nature of the university means that institutional focus is split between teaching students and facilitating research. Universities have to direct resources to both, and can have distinct goals for each. For example, many universities have chosen to improve their research quality by acknowledging the important role of applied research in economic development. As a consequence, the sector is entering a golden era of industry partnerships and a greater integration of universities into their communities.
But the importance of industry partnerships can't obscure the fact that universities are also places of education for the future workforce, and it is important that graduates are not only educated, but also healthy, and that they enjoy a rich student experience beyond the classroom. The things that make for vibrant student experiences drive what OUSA advocates for and why it develops policy that would benefit students across the province.
With any focus, whether it be industry partnerships for research or improving student mental health, it is critical to have specific goals, assessed using metrics (often called Key Performance Indicators, or KPIs). KPIs help measure whether goals are being achieved, and whether the resources spent to achieve them are being used efficiently and effectively. So, with a dual focus on students and research, where and how can institutions and government make sure that their KPIs effectively capture student learning and experience?
There are four areas where I see student-centric KPIs as important, and there are different practices that can help produce quality, student-centred metrics for each.
First, and most internal, are institutional metrics. The programs that would benefit from student-centred metrics are varied: housing, teaching evaluation, strategic planning, orientation, and more. With such a varied set of programs, how can universities develop good metrics? To start, universities should have a clear vision of the end goals of their programs and policies, whether that is to increase teaching quality or to improve student wellness. It is then important to talk to students and look at other institutions' policies to determine what success might look like.
A critical obstacle may be an inability to track data that supports those more specific goals, so it is important for any policy creator in a university to have a good understanding of who in the university collects what types of data. In some cases, it may be advisable to ask the university to put more resources behind data collection to allow for quality metrics. Finally, it is important to have students in the room throughout the process where possible, whether through advisory panels or committee memberships; students can point out flaws and offer good suggestions.
Secondly, universities and government have recently begun adopting Strategic Mandate Agreements, or SMAs, which give the government a concise indication of the direction universities wish to go and the KPIs they will use to measure success. These SMAs focus on teaching and research, but could do more to incorporate the broader student experience. Universities, for their part, should have little trouble pulling some of those metrics into an SMA if they are committing to student-centric KPIs at the institutional level. For governments, there are two main ways they are empowered to ensure a focus on student metrics: first, by controlling the SMA template and dictating what sections appear in a university's draft; second, by being able to request revisions to draft SMAs with guidance.
To make use of the power to change the template, government should pull student experience out of the SMA Teaching and Learning Assessment into its own category. SMAs currently track some general engagement data, but by changing the template, government could encourage universities to set specific and strategic goals wherever institutional improvement is needed in areas of student experience. Government can also require that student population data include not only the number of marginalized or vulnerable students, but also comparative tracking of their success against the general student body, to ensure institutions commit to providing adequate supports for those populations.
To make use of the power to revise SMA drafts, government should ensure that revisions continue until a final draft includes metrics on student wellness. Government should also commit to vetting data sources and processes, to ensure that each relevant category contains at least one goal whose metric incorporates some level of student feedback.
Taken together, these recommendations for the SMA process would go a long way to improving the quality of institutional tracking and strategic goal setting for student experience.
Metrics also matter at the level of other post-secondary policies and programs. Examples at the government level include the sexual violence action plan, the administration of OSAP, and new initiatives relating to experiential learning. These don't manifest in the same way an SMA does, and are often more unilateral than the SMA process, so a large onus is on the government to adequately consider student needs. Doing so requires the same clear vision of desired outcomes and the design of specific metrics to match. The nature of these policies can be diverse, so the advice here is simple: whether the policy is popular or not, it is important to drive towards meaningful metrics by working in partnership with post-secondary education stakeholders, such as students, student groups, and universities, to ensure that metrics will not be perversely interpreted.
Finally, there is a lot of policy that is not explicitly related to universities but intersects greatly with students in other parts of their lives: housing policy, the Landlord and Tenant Board, and the health and mental health systems. With about 450,000 undergraduate students in Ontario at any given time, government must work to ensure that these policies include KPIs that track success with the student population. We can take lessons from federal work on gender-based analysis, where all government decisions are held to closer scrutiny for gender equality. While something on this scale may not be necessary for students, the prospect of breaking silos and integrating consideration for these groups, and of measuring impacts on them, is informative for provincial policy. One step the government could take would be to place post-secondary liaisons within key government units, or simply within every ministry. Furthermore, policy consultations should make an effort to include dedicated discussions with current students on implementation.
Hopefully this admittedly long and geeky blog post has shed some light on why student-centric metrics are important, and on what students can push for from their universities and governments to make sure that policies and programs are effective for them.