Abstract:
The advancement of Open Science in Canada necessitates a fundamental shift in how research is assessed, rewarded, and incentivized. While institutions have made strides in open access publishing and FAIR data management, progress is constrained by an entrenched reliance on traditional metrics. This panel will explore the imperative for responsible research assessment—emphasizing qualitative evaluation, diverse research outputs, and societal impact—as a key element of systemic change. Drawing on previous initiatives, an initial Call to Action, international frameworks, and emerging practices, we will examine how Canadian stakeholders can collaborate to align assessment practices with global standards for open scholarship and foster a more equitable research ecosystem.
Summary of Conversations
The discussion centered on the urgent need to overhaul research evaluation systems to foster widespread adoption of open science principles. The consensus was that the current assessment model, which prioritizes quantitative metrics like funding and publication volume, does not acknowledge the time and effort required for practices such as FAIR data sharing, community engagement, and promoting equity. Responsible evaluation must be viewed as a necessary mechanism to unify mission-driven goals and align the research enterprise with complex societal challenges, rather than simply another policy layer.
Significant global momentum exists, with multiple declarations outlining principles for change. National and provincial funding agencies have begun this cultural shift, notably through the introduction of narrative-style curricula vitae, which allow a broader account of contributions. However, a critical disconnect persists: institutional promotion and review processes often lag behind funder requirements, continuing to rely on traditional, narrow measures of excellence. Realizing this reform requires a long-term, coordinated approach across the entire research ecosystem. The absence of a national strategy for science is a major gap that limits the ability to move this reform forward.
Take Away Messages/Current Status of Challenges
- Cultural Mismatch in Assessment: While funding agencies have started to adopt new responsible research assessment (RRA) tools like the narrative CV, internal institutional review committees still need support to change, and there is little coordination among universities to move away from traditional, quantitative metrics for performance evaluation.
- Growing Researcher Burden: Researchers are continuously asked to integrate increasing expectations—including EDI, knowledge mobilization, and open science practices—without these vital contributions being formally acknowledged, rewarded, or counted towards their scientific review and career advancement.
- Need for an Evidence-Based Transition: The reform process lacks a sufficient body of empirical evidence, requiring intentional investment in “Research on Research” to ensure the new assessment system is demonstrably effective and superior to the flawed traditional model.
- Lack of Conceptual Clarity: There is currently no consensus on definitions for key RRA criteria, particularly the multifaceted and often vaguely defined concepts of ‘impact’ and ‘value,’ which hinders consistent evaluation.
- Time-Intensive Quality Evaluation: Moving toward quality-based, holistic assessment requires significantly more time and resources for both researchers (to produce and curate rich outputs like data sets) and peer reviewers (to thoroughly and thoughtfully evaluate diverse contributions).
- Risk of Workforce Disengagement: A major risk of delaying reform is the alienation and potential loss of the next generation of researchers, who are increasingly disinclined to participate in a research enterprise that they feel does not value contributions beyond simple quantitative measures of productivity.
- Effort-Intensive Open Science Practices: While beneficial for transparency and public trust, the effective implementation of open science (e.g., public engagement, curating meaningful data sets) is not a trivial exercise; it requires significant dedicated effort and infrastructure that must be valued.
Recommendations/Next Steps
- Harmonize Ecosystem-Wide Assessment: Immediately address the misalignment by coordinating and aligning assessment policies between funders and institutions to ensure that the broader contributions valued by funding agencies are equally recognized in university-level hiring and promotion decisions.
- Develop and Utilize Progressive Tools: Fully implement concrete assessment tools, such as the narrative CV, and actively enforce policies that prohibit the use of reductive quantitative metrics like the journal impact factor and H-index in the peer review process.
- Invest in Comprehensive Reviewer Training: Provide training for all peer review committees to ensure they possess the necessary skills to effectively and consistently evaluate the broader range of outputs described in new tools like the narrative CV.
- Incentivize Diverse Research Contributions: Systematically recognize and value non-traditional outputs, including Open Access articles, curated research data sets, and meaningful community engagement, as critical elements within the application and review processes.
- Adopt a Phased Implementation Strategy: Apply changes incrementally, focusing first on altering the rules for powerful actors (funders and universities) before extending new requirements to individuals, thereby allowing the system to align before demanding significant new effort from researchers.
- Right-Size Expectations and Productivity: Reset the professional expectations for research productivity and adjust the requirements for data management to be proportional to the size, duration, and objective of the specific grant or project.
- Foster Coordinated National Strategy: Canada must establish a national scientific strategy, led by a dedicated body, to ensure cohesive and accountable progress on RRA implementation across all major research stakeholders, among other aspects of open science.
* This summary was generated with the assistance of AI tools


