
Panel 212 - The social implications of emerging technologies: Are the most important questions the least studied?

Conference Day: 
Day 1 - November 7th 2018

Organized by: Federation for the Humanities and Social Sciences, Peter Severinson

Speakers: Eric M. Meslin, Ph.D., FCAHS, President and CEO, Council of Canadian Academies; Jaigris Hodson, Assistant Professor of Interdisciplinary Studies, Royal Roads University; Dominic Martin, Professor, École des sciences de la gestion, Université du Québec à Montréal

Moderator: Peter Severinson, Policy Analyst, Federation for the Humanities and Social Sciences

Takeaways and recommendations

  • Too often the social and ethical implications (both intended and unintended) of transformative new technologies are overlooked. Addressing them requires thinking through potential immediate, mid-term and long-term consequences.

  • Words matter: the word “implication” is often seen as inherently bad, though it can be either positive or negative.

  • Decisions about rapidly developing technologies need to be made quickly.

  • Understanding how society is affected by emerging technologies helps to address risks, but it can also unlock potential benefits; e.g., training workers to effectively use new capabilities, to make smart decisions on how technology can address societal problems, and to ensure the benefits of new technologies are shared throughout our diverse society.

  • Technological revolutions have driven big societal changes in the past, but new technologies like artificial intelligence (AI) are viewed as having an even greater impact on people’s lives.

Starting the conversation

  • Don’t wait for a crisis to start conversations about, and potential regulation of, a new technology. That dialogue can begin even at the early research phase.

  • Dedicated funding is needed to explore the ethical, legal and social issues around a project or initiative.

  • These conversations and decisions can happen on many levels: the individual level (e.g., someone deciding not to use Facebook anymore); the organizational level (e.g., entrepreneurs considering the implications of their technologies); and the policy level (e.g., the European Union’s data regulations).

  • Many in the private sector, especially in AI, are aware of the need to have these conversations, and are being open and transparent about the potential effects of their technologies.

Technology education needs to evolve

  • Digital literacy doesn’t just mean knowing how to code or use a smartphone.

  • When teaching about technology, include the social sciences and humanities so people understand both how to use technology and how to think critically about the consequences of that use for different segments of society (e.g., many people view algorithms as neutral when they are not).

  • Teach ethical, legal and social issues as part of the core science curriculum.

  • Increasingly, the way we educate people divides them into two camps:

  1. People who know how technology works but are ill-equipped to use it in a way that avoids harm (whether deliberate or inadvertent).

  2. People whose training helps them understand the consequences of technology use, and who are therefore more likely to use it in a positive way, but who don’t understand how the technology itself works.

  • Include examples of technologies that have had positive ethical and social impacts.

  • At every level of education (starting at primary school) teach that you cannot do good science without good ethics.

  • Curriculum at all levels should teach people to recognize their own biases and how these biases can influence technology development.

Breaking down silos

  • Understanding the social implications of emerging technologies requires an interdisciplinary approach.

  • Integrate other ways of knowing, including traditional and community-based knowledge (e.g., Indigenous).

  • Use project-based team challenges, as opposed to traditional lecture learning, to foster interdisciplinary skills.

  • Scientists are underutilizing opportunities to integrate technology into their research, including platform technologies like AI. Organizations and academic institutions should consider ways to make technology more accessible to scientists.

Ensuring a diversity of perspectives

  • Ethics are not monolithic. They vary across cultures and within cultures.

  • At the same time, a common morality is accepted when we decide to live in a specific community, region or country.

  • Ethics are complicated. The West does not hold a monopoly on enlightenment.

  • The most vulnerable will be disproportionately affected in a negative way by new technologies (e.g., companies may rely more on algorithms than human oversight when hiring, especially for lower-level jobs).

  • Diverse voices from different socioeconomic and cultural backgrounds, disciplines and sectors will lead to more meaningful conversations about social implications.
