Panel: 758

Why no one believes the science: Public trust, misinformation and science literacy in a polarized world

Organized by: Innovation, Science, and Economic Development Canada
Panel Date: November 20, 2025
Speakers:
Anthony Morgan (moderator)
Nicholas Diamond
Barbara Olfe-Kraeutlein
Mario Scharfbillig
Anton Holland
Marianne Mader

Abstract:
Over the past decade, the world has witnessed major changes in the information landscape. Rapid technological advancements, driven in large part by the growing prevalence of AI, and the proliferation of accessible online information offer enormous potential opportunities for scientific literacy. At the same time, mobilizing science- and innovation-based policy agendas amid increasing global polarization, misinformation, and disinformation has become a significant challenge for science communication. This panel will present best practices essential for effective science communication and engage the audience on them, with an emphasis on trust-building between the science ecosystem and the public.

Panel 758 – Why no one believes the science: Public trust, misinformation and science literacy in a polarized world

Summary of Conversations

This highly interactive panel featured leading science communication experts from Germany, the EU, and Canada. The session was moderated by a CBC television science personality and employed a “catchbox cube” to engage the audience. The themes of polarization, trust, mis-/disinformation, and science literacy were explored against the backdrop of a significantly changing information landscape – one marked by rapid technological advancements driven in large part by the growing prevalence of AI, the proliferation of accessible online information, and the perceived (and actual) rise of information-based “echo chambers”. Central to the conversation was the notion that these contextual factors, or stressors, are affecting individuals and communities in a number of areas, including their ability to place trust in our institutions, the science ecosystem, and the scientific process itself. With the help of the audience – composed of academics and practitioners, as well as those in government and other areas of the science (and social science) ecosystem – this illustrious panel discussed best practices for science communication, with an emphasis on trust-building between the science ecosystem and the public.

Take Away Messages/Current Status of Challenges

  • The Business Model of Information: Algorithms are fundamentally designed to maximize user engagement by prioritizing content that may confirm existing beliefs, encourage the expression of identities, or induce outrage – a design that directly fuels the echo chamber phenomenon.
  • The Amplification Effect of Social Media: Social media acts as a powerful catalyst that amplifies existing societal polarization, conflict, and negative emotions, often making the goal of online discussion merely to “win” rather than to learn or engage productively.
  • False Consensus Effect: A potential outcome of information bubbles is that people who believe scientific misinformation dramatically overestimate the number of others who share those false beliefs, confirming that belief is strongly social and community-driven.
  • The ‘Trigger Bubble’ Paradox: Exposure to opposing views online often results not in depolarization, but in a counter-intuitive increase in polarization, as individuals primarily encounter and react to the most extreme or negative elements of the other side.
  • Outdated Deficit Communication Paradigm: The continued reliance on the ‘deficit model’—the assumption that simply providing more facts will solve public disbelief—is ineffective, outdated, and contributes to public feelings of powerlessness.
  • Communicating Uncertainty: Communicating uncertainty does not reduce trust when done with clarity and actionable advice. Transparency increases perceived honesty and credibility. Unacknowledged uncertainty is penalized more than acknowledged uncertainty.
  • Systemic Misinformation Exposure: The pervasiveness of misinformation in the current environment is so severe that individuals cannot combat it effectively on a daily basis, highlighting the urgent need for system-level policy interventions.
  • Inaccessibility and Elitism Perception: A major barrier to public engagement is the entrenched public perception that science is exclusive, ‘too hard,’ or only for the ‘elite’, which causes the public to disengage and reject some scientific information.
  • Institutional Trust Deficit: Scientific institutions and governments face a challenge in that the lack of public trust stems, in part, from their own failure to be consistently transparent and accessible, and to communicate in an optimal, trustworthy way.

Recommendations/Next Steps

  • Implement Operational Transparency: Actively invite the public into the scientific process by fully laying out the challenges and opportunities, explaining the methodology, and demonstrating how and why conclusions are reached.
  • Prioritize Process Communication: Shift the focus from merely conveying facts to explaining the process of science, including how consensus is built and the role of uncertainty, which is essential for empowering people to handle scientific information.
  • Integrate Science Communication Training: Systematically incorporate effective public communication skills into science education, ensuring that students can translate their specialized knowledge for a general audience and not just for colleagues in their field.
  • Engage Consistently on Public Platforms: Research institutions and scientists should maintain a trusted presence on social media and other media or engage with the public directly. This aims to normalize science for the public, especially youth, and to ensure that scientific voices are available where people already congregate. Recognizing that not every scientist has the skills to do this, such communication can also take place as a shared task within research groups or in collaboration with communication professionals at the respective institutions.
  • Foster Curiosity and Compassion: Avoid aggressive engagement and instead adopt a strategy of leaning into curiosity, which involves asking people to explain their beliefs to enable a mutual deconstruction of non-scientific claims.
  • Design for Trustworthiness Internally: Institutions must perform a rigorous self-assessment and commit to designing their structures and communication strategies to be authentically trustworthy and transparent before seeking public compliance.
  • Advocate for Systemic Regulatory Change: Support and work toward policies, such as regulation of platform business models, that fundamentally alter the information environment so that incentives and amplification are no longer disproportionately optimized for rage, polarization, and controversy; such system-level change is necessary to effectively combat the daily onslaught of misinformation.
  • Build Non-Crisis Coalitions: Proactively establish connections with non-authority organizations that are already trusted by the public to create an “ecosystem response ability” that can be leveraged for effective information dissemination during future crises.

* This summary is generated with the assistance of AI tools

Disclaimer: The French version of this text has been auto-translated and has not been approved by the author.