Panels (days 1 & 2)

As well as individual papers, days 1 & 2 will also feature a selection of thematic panels from a variety of researchers and stakeholder organisations. These include:

NVE, Mixed, Salad Bar? Towards a Definitional Framework for Modern Attackers

Dr Nagham El Karhili (GIFCT): LinkedIn

Jessa Mellea (GIFCT): LinkedIn

Recent attacks, the rising prominence of networks like Com, and irony-poisoned, incoherent motives have challenged the ideological structures used to classify and analyze terrorism. In this roundtable session, participants will work through semi-structured discussions to collaboratively craft working definitions of the motivations and characteristics of these emerging networks, movements, and attackers.

Design, Detect, Decide: A Hands-On Lab for Terrorism Safety in AI

Representatives from OpenAI

As generative AI becomes embedded in social and information ecosystems, platforms must rapidly adapt their safety processes for violent extremist content and related harms. This workshop, led by OpenAI’s Protection team, is a structured, hands-on exercise in which participants collaboratively design and test a simplified end-to-end Trust & Safety system addressing how high-risk content may surface across AI platforms and services. Focused on usage policies and user behaviour, the session blends policy thinking, detection strategy, AI-assisted analysis, and case evaluation, reflecting the operational approaches used by real-world platforms.

Researching on TikTok

Representatives from TikTok

Members of the TikTok team will provide an overview of the platform's Research Tools for accessing data on content, accounts, shops, and more. Building on the 2024 TikTok Research API demo at TASM, the panel will present case studies showcasing relevant methods such as natural language processing, network analysis, and coding, and will share best practices for working with TikTok data while researching extremism and related topics. This session is primarily designed for academic researchers and will allow time for Q&A and discussion.

Exit Strategies for Violence Fascination

Christopher Stewart (Human Digital): LinkedIn

Gareth Harris (Human Digital)

Jake Dixon (Human Digital): LinkedIn

Immersion and participation in digital ecosystems, often from an early age, and the convergence of overlapping harm sets such as pornography, violent misogyny, gore, and school massacre ideation have contributed to the emergence of individuals with a non-ideologically motivated fascination with violence. Globally, there are efforts from governments and law enforcement to collect evidence on the function violent content plays within pathways to violence, and to use that evidence to design proportionate intervention and exit strategies. This session is designed to get attendees thinking practically about how existing strategies could be applied to this growing harm set.

The Current Violent Extremist Threat Landscape: Networks, platforms, and technologies

The VOX-Pol Institute

This session will provide participants with an overview of recent developments in the use of digital technologies by violent extremist actors. First, it will examine the current tactics and capabilities of jihadist networks online, including how they operate across a range of online platforms and services, their strategies for evading detection (and the extent to which such strategies are in fact necessary), and their uses of AI. Second, it will explore the latest trends in how right-wing violent extremist groups and networks are using digital platforms and services, as well as their connections to violent communities with less clear ideological motivations.

Online Safety Regulation in Action: Balancing Rights and Risk

Alistair Pullar (Ofcom): LinkedIn

Tom Caleb (Ofcom)

Laura Waters (Ofcom)

Charlotte Hall (Ofcom)

In this session, after each convenor/panellist provides a brief presentation on their team and its functions, we will give workshop participants a fictional service, along with information about this service and where risk arises on it – whether from a risky functionality, a particular user demographic, a cultural proclivity, or similar. We will then split the workshop participants into groups acting as the constituent internal Ofcom teams represented by the speakers above, each of whom will take a smaller group through the process of Ofcom engagement by their constituent team. This will include identifying data needs, forming the stakeholder partnerships necessary to assess compliance, engaging with services to identify what changes can be made through informal engagement, and establishing appropriate and proportionate policy responses. Ofcom colleagues will move between the smaller teams, providing insight and support as each identifies the activities necessary to the wider objective of ensuring compliance and delivering user – especially child user – safety. The room will then reconvene to identify where the process is effective, and where it could and/or should be improved to better ensure user safety and/or affirm user rights, whilst fulfilling Ofcom’s regulatory duties under the law. We hope this breakout session will improve both our internal processes and stakeholders’ understanding of them, and help stakeholders see where engagement with Ofcom could be valuable and appropriately targeted in the future.

Project Catalyst: Comparative interventions tackling violent extremism & misogyny in gaming communities and platforms

Galen Lamphere-Englund (Christchurch Call Foundation): LinkedIn

Anne Craanen (Institute for Strategic Dialogue): LinkedIn

Amee Wurzburg (American University): LinkedIn

Hesbone Ndungú (Search for Common Ground): LinkedIn

In 2025, the Christchurch Call Foundation (CCF) started work on Catalyst, an 18-month initiative designed to address online misogyny and gender-based violence linked to violent extremism, focused largely on gaming and gaming adjacent platforms and communities. Implemented through a Consortium led by CCF, Catalyst brings together expert partners including the Institute for Strategic Dialogue (ISD), Meedan, Search for Common Ground, the Polarization and Extremism Research & Innovation Lab (PERIL) at American University, and the Blavatnik School of Government at the University of Oxford. 

The project encompasses 12+ programmatic, research, and policy tracks in Canada, Jordan, and Kenya, as well as across transnational gaming communities. During this panel, we will share preliminary insights from research in Arabic, Swahili, and English, along with comparative offline and online interventions addressing online misogyny and its connections with violent extremism.

At TASM, the Catalyst Consortium will present on key pillars of work exploring Catalyst research and interventions designed to support online communities in better tackling misogyny and extremism. Members of the Consortium will present findings from their respective areas of intervention: livestreamer-led campaigns; bystander training for community moderators on gaming platforms; offline peer mentor programs; building LLM classifiers to detect tech-facilitated gender-based violence content in low-resource languages; and the mapping of, and responses to, the ‘manosphere’ across the three contexts. Other efforts on civil servant policy training and global policy dialogues are also underway. The session is intended to share research findings and lessons learned from direct interventions designed to build resilience amongst gaming communities, along with support mechanisms for platforms working to address harmful content.

Discord and Deception: Radicalisation Through the Lens of Danny’s Story

Samuel Kashti (Shout Out UK): LinkedIn

How can we make a counter-terrorism case study feel relevant to young people in Swansea? That was the central issue we grappled with when designing this project. This panel pulls back the curtain on a unique collaboration in which we took a raw Cardiff CT case study involving the Extreme Right Wing (‘Danny’s Story’) and "Swansified" it through the lens of local intelligence, with a target audience in mind. The goal was to devise a resource that is engaging while also prebunking a section of society against harmful narratives.

We didn't just guess what radicalisation looked like; we tracked it. By combining SOCMINT (scraping and analyzing local Facebook groups for particular hot topics the Extreme Right Wing were engaging with) with boots-on-the-ground OSINT (mapping extremist graffiti and physical identifiers across the city), we built a hyper-localised narrative. This involved searching the streets of Swansea – the city where this very conference is taking place – for symbols of extremism. This data didn't just sit in a spreadsheet or inform an academic journal; it helped shape a skeleton script for a fictitious story inspired by true events, designed for schools and universities.

We’ll be discussing ‘Danny’s Story’ not as a theoretical exercise, but as a practical demonstration of how multimedia resources and storytelling can bridge the gap between police/OSINT data and classroom/public engagement, to bolster people’s resilience to harmful narratives and disinformation online.

Anonymity, Security and Resilience: The creative tension between regulatory policy and terrorist exploitation of the internet

Chair: Prof Maura Conway (Dublin City University; Swansea University): LinkedIn; BlueSky

Panellists:

Prof Miron Lakomy (University of Silesia): LinkedIn

Dr Ali Fisher (Università Cattolica del Sacro Cuore): LinkedIn

Arthur Bradley (VOX-Pol Institute): LinkedIn

Alessandro Bolpagni (Università Cattolica del Sacro Cuore): LinkedIn

Eleonora Ristuccia (Università Cattolica del Sacro Cuore): LinkedIn

Grazia Ludovica Giardini (Università Cattolica del Sacro Cuore): LinkedIn

This session examines original research situated at the nexus of Internet policy and the adaptive communication strategies of terrorist organizations, with particular emphasis on the operational security (OpSec) tradecraft these groups have developed in response to regulatory and technological pressures. 

Despite regular announcements of success against terrorist actors ranging from Salafi-Jihadi to Far-Right and Far-Left, these actors frequently remain online and, in many cases, have expanded their activity across major platforms.

In pursuit of their objectives of galvanizing core supporters and mobilising a mass movement, the Media Mujahidin have refined the operational security needed to preserve anonymity and secure communications with core supporters, while also surviving the disruption now common across mainstream media platforms.

Far-Left and Far-Right networks have also engaged in a complex interplay between terrorism, violent extremism, and cybercriminality. They have exploited alternative or fringe social media platforms, the decentralised web, and gaming and adjacent platforms.

Combining the learning from the continued presence of these diverse groups, this session examines three interrelated areas that shape how terrorist groups continue to exploit social media and the internet: (1) the policy and regulatory environment in which these groups operate; (2) the OpSec tradecraft they adopt, which makes individuals difficult to identify and disrupt; and (3) the multiplatform networks that provide resilience and a persistent online presence.

Youth Radicalisation in Gaming Spheres: Evolving Threats, P/CVE Engagement and Policy Responses

Dr Erin Saltman (GIFCT): LinkedIn

Julien Bellaiche (GNET): LinkedIn

Dr Jessica White (RUSI)

Galen Lamphere-Englund (Extremism & Gaming Research Network): LinkedIn

The gaming sector is the world’s largest entertainment sector, with around three billion people playing video games. In recent years, violent extremists of various ideological leanings – including right-wing extremists, jihadists, and violent nihilists – have been seeking to exploit the popularity and social connectivity of gaming spaces to spread their message, mobilise, and recruit. These efforts include the production and dissemination of propaganda games, the instrumentalisation of existing games, the exploitation of in-game communication features and gaming-adjacent platforms, the use of gaming aesthetics and cultural elements for propaganda, and limited use of fundraising via gaming surfaces. This activity raises concerns over threats of radicalisation of users – particularly among young audiences – as evidenced in recent national and regional counter-terrorism strategy reviews (e.g., in the UK and EU), with an increasing number of minors being charged with terrorism offences.

Since 2024, GIFCT has invited researchers, policymakers and subject matter experts to investigate these dynamics and offer actionable insights for prevention and countering extremism in gaming communities. The Extremism and Gaming Research Network has also convened and commissioned regular research on the subject since 2020 in partnership with RUSI. Building on this work, and over 133 published articles in the last five years, this panel will bring together GIFCT, GNET, EGRN, and RUSI experts to examine the state of violent extremist exploitation of gaming and gaming-adjacent platforms and the threat this poses for youth radicalisation in these spheres. It will discuss the latest threat trends and forthcoming changes in the field, including rapid digitization across new contexts (West Africa, for example), and the increasingly hybridized ecosystems of violence underpinning youth radicalization.

The discussion will also highlight promising prevention and positive intervention strategies, as well as policy responses and best practices to prevent and disrupt violent extremist exploitation of gaming spaces.

Distinct remits, but shared mission: How regulation can protect users from terrorism content online

Dr Murtaza Shaikh (Ofcom): LinkedIn

Jonathon Deedman (Ofcom): LinkedIn

ATKM participants

Colleagues from the UK’s Ofcom and the Netherlands’ Autoriteit online Terroristisch en Kinderpornografisch Materiaal (ATKM) will deliver a joint session outlining the divergent approaches the two regulatory bodies take toward a shared objective: preventing users’ exposure to terrorism content online. Working from different national legislative and regulatory frameworks, both Ofcom and ATKM seek to protect users in our constituent locales from terrorism content in distinct ways. We hope this session can both inform TASM attendees about how these different systems operate and improve our own understanding of the harms we seek to regulate, through solicitation of prevalence data, definitional frameworks and other theoretical work to address borderline content, and insight into users’ experiences and perspectives. Colleagues from Ofcom and from ATKM will identify our remits, our duties and responsibilities under the different legal frameworks, the challenges we have experienced in the last year, and how we intend to move forward and improve, including with input from breakout session participants.

Anti-gender ideology and the social media landscape in and beyond Canada

Prof Amy Mack (University of Lethbridge): LinkedIn; BlueSky

Dr Audrey Gagnon (University of Ottawa): LinkedIn; BlueSky

Dr Kathy Kondor (Norwegian Center for Holocaust and Minority Studies): LinkedIn; BlueSky

Kayla Preston (University of Toronto)

Luc Cousineau (Dalhousie University): BlueSky

Over the last decade, with the rise of incels, MRAs, and male supremacist influencers, scholars have increasingly turned their attention to digital or cyber misogyny to understand the relationship between violence, technology, and gender. In this panel, we expand this work to explore how “anti-gender ideology” groups and movements, which span the political spectrum and include populist parties, use social media to target and harass trans and gender-diverse people in Canada. Our first three papers use empirical case studies in Canada to demonstrate the complexity of these movements in terms of membership and tactics. This includes discourse analysis of the social media and web presences of populist parties, content analysis of social media-based far-right and far-right adjacent groups, and qualitative interviews with teachers and activists grappling with social media-fuelled gender-based violence in their classrooms. The fourth paper makes connections between this Canadian context and the international flows of anti-gender ideology discourse. The final paper presents research on how activists and educators are pushing back against hate-based and “anti-gender ideology” forces in their communities, to provide an empirical and data-driven path forward to countering hate.

“The Digital Bazaar”: Understanding How Extremist Organisations are Exploiting TikTok to Share Propaganda and Radicalise Young Individuals

Chair: Prof Miron Lakomy (University of Silesia): LinkedIn

Panellists:

Alessandro Bolpagni (Università Cattolica del Sacro Cuore): LinkedIn

Silvano Rizieri Lucini (Università Cattolica del Sacro Cuore): LinkedIn

Alessandra Pugnana (Università Cattolica del Sacro Cuore): LinkedIn

Eleonora Ristuccia (Università Cattolica del Sacro Cuore): LinkedIn

Grazia Ludovica Giardini (Università Cattolica del Sacro Cuore): LinkedIn

Dr Kristin Weber (Centre for Criminological Research Saxony): LinkedIn; BlueSky

Erik Hacker (SCENOR): LinkedIn

This breakout session explores how extremist groups exploit TikTok to share terrorist propaganda and violent narratives. The panellists will shed light on the large number of users actively sharing extremist content or propaganda material to spread extremist ideologies, incite violence, radicalise young individuals, and recruit new members. In particular, the panel will show content related to extreme right-wing accelerationism, Salafi-jihadism, the True Crime Community (TCC), and Christian extremism. The panellists will examine the typologies of actors and the tactics and techniques employed by users to share propaganda and extremist content, before presenting possible techniques and approaches to efficiently counter the spread of extremist propaganda and content.

Harnessing Multistakeholder Power: Driving Collective Action for Global Challenges

Chair: Andy George (Christchurch Call Foundation)

Panellists:

Government representative

Tech industry representative

Civil society advocate

Academic researcher

Global challenges, such as eliminating terrorist and violent extremist content online, cannot be solved by governments, tech companies, or civil society alone. They demand shared responsibility and coordinated action. This workshop will:

- Discuss how multistakeholder frameworks operate to create impactful, scalable solutions.

- Explore, drawing on the Christchurch Call as a leading example, how multistakeholder approaches are evolving in increasingly complex geopolitical and policy environments.

- Look at the future of global multistakeholderism, generating ideas for future frameworks and collaboration.

The Only Way Is Ethics: An Open Conversation on Safety, Standards, and Solutions in Extremism Research

Dr Ashton Kingdon (University of Southampton)

Dr Ashley Mattheis (University of Manchester)

Dr Alyssa Czerwinsky (University of Manchester)

Bradley Galloway (Ontario Tech University)

Dr Antonia Vaughan (Moonshot)

Dr Elizabeth Pearson (Royal Holloway University)

Dr Nicola Mattheison (University of Liverpool)

Dr Audrey Gagnon (University of Ottawa)

Dr Meili Criezis (American University)

Extremism research has expanded rapidly, but ethical frameworks and institutional support lag behind. Researchers face challenges beyond standard protocols: accessing sensitive data, exposure to traumatic content, physical and digital safety risks, and the danger of amplifying extremist narratives. Institutional responses remain inadequate and inconsistent, with protection gaps varying by resources, location, and discipline.

This interactive roundtable critically assesses the current state of research ethics in extremism studies and charts a path forward. Using a ‘fishbowl’ format, audience members are actively encouraged to share experiences and ideas alongside panellists, generating honest reflections and healthy debate.

Key questions to be explored include:

- How do we balance understanding extremism with risks of amplifying harmful content?

- What protections are needed for researchers’ mental health when engaging with traumatic material?

- As platforms restrict data access, how can we maintain research transparency while respecting ethical boundaries?

- How do we better prepare and protect the next generation of extremism researchers?

- What does robust ethical infrastructure look like?

Participants will examine the adequacy of current ethical guidelines, researcher wellbeing and mental health support, cybersecurity challenges, the ethics of working with closed-source data, and institutional responsibilities in safeguarding researchers.

This session moves beyond problem identification toward actionable solutions, including specialised training, dedicated safety resources, and recognising digital security as integral to research ethics. Through cross-disciplinary knowledge exchange, we will establish a shared vision for ethical practice that prioritises both research integrity and researcher wellbeing while advancing understanding of critical societal challenges.

Researcher Safety by Design: Modelling practical OPSEC for studying digital extremism

Dr Michael Loadenthal (University of Cincinnati): LinkedIn; BlueSky

This interactive, hands-on workshop invites researchers, practitioners, and policymakers into a practical exploration of how to work safely, sustainably, and confidently in the study of violent extremism and contested digital environments. Rather than treating risk as an abstract concern, the session equips participants with concrete tools to identify, contextualize, prioritize, and mitigate threats that emerge across physical, digital, and psychosocial dimensions of research. Drawing on threat modeling and public health–inspired harm reduction frameworks, the workshop emphasizes resilience, adaptability, and long-term well-being.

The session unfolds in three dynamic stages. First, participants build individualized threat models using accessible, creative techniques such as asset inventorying, mind mapping, and structured analytic tools. This process helps attendees clarify how risks vary by research topic, method, and positionality. Second, these models are transformed into actionable mitigation strategies using a harm reduction lens that prioritizes realistic, proportional interventions over one-size-fits-all solutions. Participants learn how to decide what matters most, what can wait, and where small changes can meaningfully reduce harm.

The final stage delivers an engaging “101 crash course” in operational and digital security, tailored specifically to research workflows. Through live demonstrations, participants explore tools such as VPNs, secure cloud storage, password managers, virtual machines, browser extensions, identity separation techniques, and secure messaging platforms. These tools are applied directly to everyday research practices, with a focus on protecting identity, safeguarding sensitive data, and using technology to maintain healthy boundaries when engaging with violent or disturbing material.

Blending case studies, scenario-based exercises, and collaborative discussion, the workshop is designed to be energetic, inclusive, and immediately useful. Participants leave with both a clear risk-management framework and practical skills they can deploy right away—making this session especially valuable for early-career scholars and anyone seeking to build safer, more sustainable research practices.

Interacting networks – Collaborative clusters to counter online terror threats

Representatives from the NOTIONES and TATE projects, and SAHER (Europe)

This session will begin by introducing the NOTIONES project, the aim of which is to establish a network to connect researchers and industries with the intelligence community, facilitate exchange on new and emerging technologies, and equip solution providers with insights on the corresponding needs and requirements of practitioners. Members of the Tech Against Terrorism Europe (TATE) project will then present some of its key findings, including an analysis of platforms being exploited to host terrorist content and challenges faced by those implementing the EU’s Terrorist Content Online (TCO) Regulation. Finally, SAHER (Europe) will provide an introduction to the TCO Cluster.