CFP: Generative AI Governance – Special Issue in Information, Communication & Society
Call for Papers for a Special Issue in Information, Communication & Society
Generative AI Governance: Innovations, Institutions, Imaginaries
Special Issue Editors:
Fabian Ferrari, Utrecht University, [email protected]
Joanne Kuai, Karlstad University, [email protected]
From democratic to authoritarian contexts, governments worldwide face
the challenge of setting up oversight mechanisms for generative AI
systems. The European Union recently reached an agreement on the
proposed EU AI Act. A few months earlier, China had enacted one of the
first laws in the world to regulate generative AI systems. Navigating
this global policy landscape is challenging, not only due to differences
between regulatory regimes but also because of the fast pace at which
new variations of generative AI systems are developed.
However, the governance of generative AI systems is not only a task for
governments and regulators – it also takes place in the workplace and at the
institutional level. For example, in the 2023 Hollywood labor disputes,
the use of generative AI systems featured prominently. Such developments
signify a need for a new understanding of generative AI governance,
expanding its scope beyond a narrow focus on regulatory frameworks. As
such, this special issue asks: How does the landscape of generative AI
governance shape the discursive and material dimensions of generative AI
systems, and what are the underlying factors influencing this development?
To address this question, the special issue invites contributions that
cover a wide cultural and geographical range of case studies or
comparative studies. Given that the governance of generative AI systems
is a globally interconnected phenomenon, engagements with regional,
national, and supranational forms of generative AI governance are
encouraged. Potential empirical entry points include but are not limited to:
– Assessing the efficacy of designated AI oversight measures (e.g., risk
assessments, audit procedures, red-teaming) in light of swiftly evolving
material properties of generative AI systems.
– Examining strategies of generative AI providers to shape and influence
governance regimes – for example, through research, lobbying, tool
development, and terms of use/usage policies.
– Understanding how institutional dynamics in particular sectors and
cultural industries (e.g., journalism, education, entertainment) shape
the design, use and effects of generative AI systems.
– Comparing how governance regimes observe, inspect and modify
generative AI systems (e.g., China’s central algorithm registry vs. the
EU’s proposed database for high-risk AI systems).
– Investigating how sociotechnical imaginaries about generative AI
(e.g., perceived “existential risks”) inform the design, substance and
enforcement of particular regulatory frameworks.
– Connecting considerations about the global political economy of AI to
geopolitical issues about digital sovereignty, as illustrated by US
export controls for specific AI chips.
Submission Instructions and Timeline:
Please submit an abstract of 500-800 words (including references) to
[email protected] and [email protected] no later than 15 February 2024.
The abstract should specify: 1) the problem or question being addressed,
2) the paper’s methodological or analytical approach, and 3) the
anticipated results or conclusions of the research. Decisions about the
selection of abstracts will be communicated to authors by 15 March 2024.
The deadline for submitting invited papers is 31 August 2024. The
special issue will follow the submission and review guidelines of
Information, Communication & Society (See Instructions for Authors).
Each invited paper will be peer-reviewed. An invitation to submit a full
paper does not automatically ensure its acceptance in the special issue
or in the journal. No payment from the authors will be required. For
questions, please reach out via email to the guest editors.
Check out the full call for papers here:
http://bit.ly/generative-ai-governance