Narrative-based simulation game and two interactive explainers
General Overview & Rationale
KER 4 is a narrative-based simulation game accompanied by two interactive explainers. In contrast to a traditional linear plot, the game's narrative design provides alternative paths and multiple endings shaped by player choices and behaviours. Users are thus in control of the narrative, and their experience is structured around individual decision-making.
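A branching narrative of this kind can be modelled as a simple decision graph, where each node carries scene text and the choices leading to other nodes, and leaves are endings. The following is a minimal sketch only; all node names, scene text, and choice labels are illustrative placeholders, not content from the actual game:

```python
# Branching narrative as a decision graph: nodes hold scene text and
# choices mapping to other nodes; nodes with no choices are endings.
# All names and texts below are invented for illustration.
STORY = {
    "start": {
        "text": "A suspicious video about a rival candidate goes viral.",
        "choices": {"investigate": "verify", "share": "ending_misinfo"},
    },
    "verify": {
        "text": "Fact-checkers confirm the video is fabricated.",
        "choices": {"publish_correction": "ending_informed"},
    },
    "ending_misinfo": {"text": "Misinformation spreads.", "choices": {}},
    "ending_informed": {"text": "Citizens are better informed.", "choices": {}},
}

def play(picks, node="start"):
    """Follow a sequence of choice labels; return the visited path."""
    path = [node]
    for pick in picks:
        node = STORY[node]["choices"][pick]
        path.append(node)
    return path

# Different choice sequences reach different endings.
print(play(("share",)))                              # ends in "ending_misinfo"
print(play(("investigate", "publish_correction")))   # ends in "ending_informed"
```

The same structure extends naturally to tracked state (e.g. accumulated trust or literacy scores) that gates which choices are visible at each node.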
The aims of KER 4 are twofold:
- Protect citizens from the possible negative impacts of advanced knowledge technologies on fundamental rights and democracy
- Empower users and thereby enhance their civic and democratic participation
The purpose and motivation behind the KT4D narrative-based simulation game is to translate the needs of citizens and citizen-facing organisations, with regard to enhancing participation and democracy, into actionable specifications for novel educational approaches and materials.
The KT4D gamified experience and the two interactive explainers are intended to build skills and raise awareness about how AI and Big Data can enhance democratic processes, and about the role citizens play within them.
Input from Use Cases
The range of topics and format structure implemented in Use Cases 2 & 3 were designed to advance the development of KER 4 using a co-creation approach. Specifically, topics covering the different possible uses of AI, confronting technological dilemmas, learning how to set limits on AI, and gathering concrete examples (personal accounts/stories) are essential for understanding user comprehension and the core elements needed to develop appropriate educational materials.
The interactive and engaging formats designed for these Use Cases provide an ideal platform for inquiry and knowledge acquisition, especially the stepped approach, in which users define AI, explain its risks, and discuss its development and regulation. Additionally, discussions around the accessibility of resources (language and complexity), and around support from an assistant or tutorial available to users at any point in their journey, provide insight into how users behave in specific situations.
Findings & Next Steps
The continued development of Modules F and G of the Social Risk Toolkit will provide KER 4 with a conceptual framework and pedagogical design, while KER 4's development in turn informs the focus of Modules F and G.
Stakeholders & Testing
Gamified Experience Overview
To help translate user needs into functional and non-functional requirements, scenarios describing the user's interaction with the Gamified Experience were defined, with corresponding user stories linked to concrete steps of that interaction.
Several dilemmas representing different themes, such as disinformation, microtargeting, profiling, and automated decision-making, have been chosen to frame the player's experience and test their critical digital literacy. Users take the role of a head of state, making informed decisions about the use and impact of AI.
Interactive Explainer Overview
The purpose of the interactive explainers is to inform citizens about the concrete effects of AI and big data, and the risks they pose to democratic deliberation and participation.
The first explainer focuses on big data and individual autonomy in personal choices, with priority given to the effects of recommendation algorithms on content curation and cultural participation, while the second explores the effects of AI on information quality, with a focus on deep fakes.
Interactive Explainer 1 (Big Data - Exploring the effects of recommendation algorithms)
The goal of the first interactive explainer is to show users how recommendation algorithms work and how they influence the information they encounter and consume online, having potential social and political implications, and detrimental effects on democratic and civic participation.
Big Data and machine learning can support users with personalised recommendations based on previous behaviours, creating seamless navigation journeys for citizens. However, these same algorithms can also be manipulated to suggest predictive information and tailored materials that may be biased, narrow, or corrupted. It is important for citizens to develop a critical eye for the content placed in front of them and to recognise that not all recommendation systems have the user's best interests in mind.
Interactive Explainer 2 (Artificial Intelligence & Deep Fakes)
The second interactive explainer aims to provide users with critical digital information related to deep fakes, specifically to:
- Address the threats these digital artefacts pose to democratic deliberation and participation.
- Foster critical digital literacy skills that give citizens a sense of agency over the information they consume online.
Useful Resources
Why is it important?
LOREM IPSUM